Overview
This node enables interaction with Databricks AI models for text generation and chat-based applications. It sends conversational messages to a specified Databricks model and returns the generated response, making it useful for building AI assistants, chatbots, or any workflow that requires natural language understanding and generation.
Common scenarios include:
- Creating customer support chatbots that respond intelligently to user queries.
- Generating content or suggestions based on user input.
- Automating conversational workflows where AI-generated replies are needed.
For example, you can configure the node to use a specific Databricks model, provide a system message defining the AI's behavior (e.g., "You are an AI assistant"), and send user messages to get relevant AI-generated replies.
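As a concrete illustration, the node parameters for such a setup might look like the sketch below; the model name and messages are placeholder values, not taken from a real deployment.

```typescript
// Illustrative node parameters only; the model name is a placeholder and
// must match a model actually served from your Databricks workspace.
const exampleParameters = {
  modelName: 'databricks-meta-llama-3-1-70b-instruct', // placeholder served-model name
  systemMessage: 'You are an AI assistant',
  userMessage: 'Draft a short reply to a customer asking about delivery times.',
  options: {
    maximumLength: 256, // cap on generated tokens
    temperature: 0.7,   // moderate randomness
  },
};
```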
Properties
| Name | Meaning |
|---|---|
| Model Name | The name of the Databricks AI model to use for generating responses. |
| System Message | Defines the AI's behavior or persona; sets the context for the conversation. |
| User Message | The message or query sent by the user to the AI model. |
| Options | Collection of optional parameters: |
| - Maximum Length | Maximum number of tokens the AI should generate in its response (default 256). |
| - Temperature | Controls randomness in output; 0 = deterministic, 2 = very random (default 0.7). |
| - Top P | Nucleus sampling: restricts generation to the most probable tokens whose cumulative probability stays within this value; ranges from 0 to 1 (default 1). |
| - Return Full Response | If true, returns the entire API response instead of just the generated text (default false). |
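As a rough sketch, these options plausibly map onto the chat request body sent to the serving endpoint. The parameter names below (`max_tokens`, `temperature`, `top_p`) assume an OpenAI-compatible chat schema and are not confirmed by this node's documentation.

```typescript
// Hypothetical mapping of the node's options onto a chat request body,
// assuming an OpenAI-compatible schema on the Databricks serving endpoint.
interface NodeOptions {
  maximumLength?: number; // "Maximum Length"
  temperature?: number;   // "Temperature"
  topP?: number;          // "Top P"
}

function buildRequestBody(systemMessage: string, userMessage: string, opts: NodeOptions = {}) {
  return {
    messages: [
      { role: 'system', content: systemMessage },
      { role: 'user', content: userMessage },
    ],
    max_tokens: opts.maximumLength ?? 256, // default 256
    temperature: opts.temperature ?? 0.7,  // default 0.7
    top_p: opts.topP ?? 1,                 // default 1
  };
}
```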
Output
The node outputs JSON data structured as follows:
If Return Full Response is false (default):

```json
{
  "content": "AI-generated reply text",
  "role": "assistant",
  "input": {
    "systemMessage": "...",
    "userMessage": "..."
  }
}
```

The AI's generated message is returned under `content`, the role is `"assistant"`, and the input messages are echoed back under `input`.

If Return Full Response is true:

The full raw API response from Databricks is returned under `json`, along with the original input messages, allowing access to any additional metadata or details provided by the API.
The node does not output binary data.
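In a downstream node, the generated text can be referenced with an ordinary n8n expression; for the default output shape shown above, `{{ $json.content }}` resolves to the reply text. When Return Full Response is enabled, the exact path into the raw response depends on the Databricks API schema and is not documented here.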
Dependencies
- Requires an active Databricks AI API endpoint with appropriate credentials.
- Needs configuration of an API authentication token and host URL for the Databricks service.
- The node uses HTTP POST requests to the Databricks serving endpoint corresponding to the selected model.
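For orientation, the underlying call resembles the sketch below. The `/serving-endpoints/<model>/invocations` path and Bearer-token authentication are assumptions based on Databricks' standard model-serving API and may differ from what the node actually sends.

```typescript
// Minimal sketch of the node's HTTP call, assuming the standard Databricks
// model-serving invocation URL and a personal access token for auth.
async function callDatabricksModel(host: string, token: string, modelName: string, body: unknown) {
  const response = await fetch(`${host}/serving-endpoints/${modelName}/invocations`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`, // Databricks API token
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body), // e.g. the chat body from the earlier sketch
  });
  if (!response.ok) {
    throw new Error(`Databricks request failed: ${response.status} ${response.statusText}`);
  }
  return response.json();
}
```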
Troubleshooting
- "No credentials found" error: Ensure that the required API key credential for Databricks is configured correctly in n8n.
- API request failures: Check network connectivity, validity of the API token, and that the model name matches an existing deployed model on Databricks.
- Empty or missing response content: Verify that the system and user messages are properly set and that the model supports chat completions.
- Unexpected errors: The node logs request and response data to the console for debugging; review these logs to identify issues.
Links and References
- Databricks documentation (for model deployment and API usage): https://docs.databricks.com/
- n8n documentation (for general node usage and credential setup): https://docs.n8n.io/