
izz.ONE

Fetch data from izz.ONE

Overview

This node sends a message to the izz.ONE chat service, allowing users to initiate or continue conversations with customizable parameters. It supports selecting datasources, agents, and LLM models, as well as setting system messages, behavior styles, tags, and streaming options. This is useful for integrating AI-driven chat interactions into workflows, such as customer support, automated assistants, or conversational AI applications.

Use Case Examples

  1. Sending a message to start a new chat with a specific datasource and agent.
  2. Continuing an existing chat by providing the chat ID and a new message.
  3. Streaming the chat response in real time for interactive applications.
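The three use cases above differ mainly in which parameters are set. A minimal sketch of assembling the request payload, assuming hypothetical field names ("message", "chatId", "datasources", and so on) rather than the documented izz.ONE API schema:

```python
def build_chat_payload(message, chat_id=None, datasources=None,
                       agent=None, llm_model=None, system_message=None,
                       behaviors=None, tags="", streaming=False):
    """Assemble a chat request; omitting chat_id starts a new chat."""
    payload = {"message": message, "streaming": streaming}
    if chat_id:                      # continue an existing chat
        payload["chatId"] = chat_id
    if datasources:                  # empty -> automatic datasource selection
        payload["datasources"] = datasources
    if agent:                        # empty -> no agent is used
        payload["agent"] = agent
    if llm_model:                    # empty -> automatic model selection
        payload["llmModel"] = llm_model
    if system_message:
        payload["systemMessage"] = system_message
    if behaviors:
        payload["behavior"] = behaviors
    if tags:                         # comma-separated string -> list
        payload["tags"] = [t.strip() for t in tags.split(",") if t.strip()]
    return payload
```

For example, `build_chat_payload("Hello", datasources=["kb"])` starts a new chat against one datasource, while passing `chat_id` continues an existing one.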

Properties

Message: The message text to send in the chat.
Datasources: One or more datasources to use for the chat. If none are selected, a datasource is chosen automatically.
Agent: The agent to use for the chat. If none is selected, no agent is used.
LLM Model: A specific large language model (LLM) to use for the chat. If none is selected, a model is chosen automatically.
System Message: An optional custom system message to influence the chat context.
LLM Behavior: One or more behavior styles for the LLM, such as conversational, friendly, informal, or creative.
Tags: Optional comma-separated tags to categorize the message.
Streaming: If enabled, the response is streamed back in real time; otherwise, the full response is returned at once.

Output

JSON

  • msg - The response message from the izz.ONE chat service, including chat data and metadata.

Dependencies

  • Requires an API key credential for izz.ONE API authentication.

Troubleshooting

  • Ensure the API key credential is correctly configured and has necessary permissions.
  • Verify that the chat ID is provided when continuing a chat; otherwise, a new chat is created.
  • Check that selected datasources, agents, and LLM models are valid and available.
  • If streaming is enabled, ensure the client supports handling streamed responses.
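When streaming is enabled, the client must assemble the reply from incremental chunks instead of reading one complete response. A minimal sketch, assuming the service emits the reply as plain text chunks (the actual izz.ONE wire format is not specified here):

```python
def collect_stream(chunks):
    """Accumulate streamed text chunks into the full response."""
    parts = []
    for chunk in chunks:
        parts.append(chunk)
        # In an interactive app, the partial text ("".join(parts))
        # could be rendered to the user here as it arrives.
    return "".join(parts)
```

A client that cannot process chunks incrementally should disable the Streaming option and receive the full response at once instead.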
