Overview
This node, named "Flo Agentic Workflows," allows users to interact with predefined workflows hosted on a Llama Deploy server. It sends a user-defined message to a selected workflow and optionally evaluates the result returned by that workflow. This is useful for automating interactions with AI-driven or agentic workflows such as compliance assistants or incident management systems.
Common scenarios include:
- Sending queries or commands to an AI-powered policy and compliance assistant to get guidance or check compliance.
- Reporting incidents or requesting actions from an incident management workflow.
- Automating conversational workflows where the input message drives the workflow logic and the output is processed further in n8n.
Practical example:
A user inputs a message describing a compliance question, selects the "Policy and Compliance Assistant" workflow, and receives a structured response that can be used downstream in the automation.
Properties
| Name | Meaning |
|---|---|
| Deployment Name | The name of the deployment on the Llama Deploy server. Defaults to "StateMachines". |
| Workflow Name | The workflow to send the message to. Options: "Policy and Compliance Assistant" or "Incident Management". |
| Evaluate Result | Boolean flag indicating whether the node evaluates the result returned by the workflow. |
| User Message | The message text sent to the Llama Deploy server. Can use expressions referencing input data (e.g., {{ $json.chatInput }}). Required field. |
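As a sketch, a configured node's parameters could look like the following in an exported workflow. The field keys here are illustrative assumptions, not the node's actual internal parameter names; the leading `=` on the User Message value follows n8n's convention for marking a field as an expression:

```json
{
  "deploymentName": "StateMachines",
  "workflowName": "Policy and Compliance Assistant",
  "evaluateResult": false,
  "userMessage": "={{ $json.chatInput }}"
}
```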
Output
The node outputs an array of JSON objects, each corresponding to one input item processed. Each output JSON contains:
- `response`: The main content returned by the workflow after processing the input message. This is typically a string or structured data parsed from the workflow's result.

In case of failure, the output JSON instead includes:

- `success`: `false`
- `error`: A brief error message ("Failed to send message")
- `details`: Detailed error information, including the original workflow name and input parameters.
The node does not output binary data.
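Based on the fields described above, a successful item and a failed item might look like this (all values, and the exact key names inside `details`, are illustrative assumptions):

```json
[
  {
    "response": "Per policy SEC-12, customer data must be encrypted at rest."
  },
  {
    "success": false,
    "error": "Failed to send message",
    "details": {
      "workflowName": "Incident Management",
      "input": "Report a P1 outage in the EU region"
    }
  }
]
```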
Dependencies
- Requires connection to a Llama Deploy server via an API key credential.
- Needs configuration of API server URL and control plane URL through credentials.
- The node uses an internal client (`LlamaDeployClient`) to manage sessions and run workflows on the server.
- The workflow execution timeout can be configured via credentials (default: 120 seconds).
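Putting the pieces together, a credential configuration for this node might resemble the following sketch. The field names and URLs are assumptions based on the description above, not the credential type's actual schema:

```json
{
  "apiKey": "<your-llama-deploy-api-key>",
  "apiServerUrl": "http://localhost:4501",
  "controlPlaneUrl": "http://localhost:8000",
  "timeout": 120
}
```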
Troubleshooting
Common issues:
- Failure to connect to the Llama Deploy server due to incorrect API URLs or missing/invalid API key credential.
- Workflow execution timeout if the server takes too long to respond.
- Malformed or unexpected responses from the workflow causing JSON parsing errors.
Error messages:
- "Failed to send message": Indicates the node could not successfully invoke the workflow. Check network connectivity, API credentials, and workflow availability.
- "Failed to clean up session" (logged to console): Indicates an issue deleting the session after execution; usually non-critical, but may indicate resource leaks on the server side.
Resolutions:
- Verify API credentials and URLs are correctly set.
- Ensure the selected workflow exists and is active on the Llama Deploy server.
- Increase timeout if workflows take longer to process.
- Inspect detailed error messages in the output for clues.
Links and References
- Llama Deploy Documentation (hypothetical link)
- n8n documentation on Creating Custom Nodes
- General info on agentic workflows and AI orchestration platforms
