Overview
This node sends messages to a Kafka topic using SSL encryption. It is designed for securely publishing data streams or events to Kafka topics, which are widely used in real-time data pipelines, event-driven architectures, and distributed systems.
Common scenarios include:
- Streaming JSON data from other n8n nodes into Kafka topics.
- Sending custom string messages to Kafka for downstream processing.
- Using Confluent Schema Registry to enforce message schemas.
- Adding optional message keys and headers for Kafka message routing and metadata.
Practical example:
- An IoT sensor data pipeline where sensor readings are collected by n8n and published as JSON messages to a Kafka topic for real-time analytics.
- Publishing user activity logs with specific keys and headers to Kafka for targeted consumption by microservices.
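Conceptually, the node assembles a kafkajs-style message from its settings. The sketch below is illustrative only (`buildMessage` is a hypothetical helper, not the node's actual internal code):

```javascript
// Illustrative sketch: building a kafkajs-style message from node settings.
// buildMessage is a hypothetical helper, not the node's real internals.
function buildMessage(item, { sendInputData = true, message = '', useKey = false, key = '', headers = {} } = {}) {
  // "Send Input Data" serializes the incoming item as JSON; otherwise the
  // literal "Message" string is used as the payload.
  const value = sendInputData ? JSON.stringify(item) : message;
  const msg = { value };
  if (useKey) msg.key = key;                         // optional Kafka message key
  if (Object.keys(headers).length) msg.headers = headers; // optional headers
  return msg;
}

const msg = buildMessage(
  { sensor: 'temp-1', reading: 21.4 },
  { useKey: true, key: 'temp-1', headers: { source: 'n8n' } }
);
console.log(msg);
// e.g. { value: '{"sensor":"temp-1","reading":21.4}', key: 'temp-1', headers: { source: 'n8n' } }
```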
Properties
| Name | Meaning |
|---|---|
| Topic | Name of the Kafka topic (queue) to publish messages to. |
| Send Input Data | Whether to send the incoming node input data as JSON to Kafka. |
| Message | The message string to send if not sending input data. |
| JSON Parameters | Whether header parameters are provided as JSON instead of UI fields. |
| Use Schema Registry | Whether to use Confluent Schema Registry to encode messages according to a schema. |
| Schema Registry URL | URL of the Confluent Schema Registry service (required if using schema registry). |
| Use Key | Whether to use a message key for Kafka messages. |
| Key | The message key string (required if "Use Key" is true). |
| Event Name | Namespace and name of the schema in the schema registry, e.g., "namespace.name" (required if using schema registry). |
| Headers | Key-value pairs of headers to add to the Kafka message (UI input, shown if JSON Parameters is false). |
| Headers (JSON) | Header parameters as a flat JSON object (shown if JSON Parameters is true). |
| Options - Acks | Whether the producer waits for acknowledgement from all replicas before considering the message sent. |
| Options - Compression | Whether to compress the message payload using GZIP codec. |
| Options - Timeout | Time in milliseconds to wait for a response from Kafka when sending messages. |
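The three options map naturally onto a kafkajs `producer.send()` call. The sketch below shows one plausible wiring (an assumption, not the node's verified internals); in kafkajs, `acks: -1` waits for all in-sync replicas and `CompressionTypes.GZIP` has the numeric value `1`. The actual send is left commented so the sketch stays self-contained:

```javascript
// Sketch: mapping the node's option toggles onto kafkajs send() arguments.
// The exact wiring inside the node is an assumption.
const options = { acks: true, compression: true, timeout: 10000 };

const sendArgs = {
  topic: 'iot-readings',
  acks: options.acks ? -1 : 1,              // -1 = wait for all in-sync replicas
  compression: options.compression ? 1 : 0, // 1 === CompressionTypes.GZIP in kafkajs
  timeout: options.timeout,                 // ms to wait for a broker response
  messages: [{ value: JSON.stringify({ hello: 'kafka' }) }],
};

// With a connected kafkajs producer this would be:
// await producer.send(sendArgs);
console.log(sendArgs.acks, sendArgs.compression); // -1 1
```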
Output
The node outputs an array of JSON objects representing the result of the Kafka send operation. Each object typically contains metadata about the message delivery status, such as partition and offset information.
If no messages are sent, it returns an array with a single object `{ success: true }` indicating successful execution without errors.
Binary data output is not supported by this node.
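For orientation, kafkajs's `producer.send()` resolves to an array of record-metadata objects; the shape below is illustrative and the values are made up, so actual field values will differ per cluster:

```javascript
// Illustrative shape of a successful send result (kafkajs RecordMetadata).
// All values here are invented for the example.
const result = [
  { topicName: 'iot-readings', partition: 0, errorCode: 0, baseOffset: '42' },
];
console.log(result[0].topicName, result[0].partition); // iot-readings 0
```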
Dependencies
- Requires a Kafka cluster accessible via SSL.
- Requires an API key credential configured with connection details including brokers, client ID, SSL certificates/keys, and optionally username/password for SASL authentication.
- Optionally requires access to a Confluent Schema Registry service if schema validation and encoding are enabled.
- Uses the `kafkajs` library for Kafka communication.
- Uses the `@kafkajs/confluent-schema-registry` library for schema registry integration.
Troubleshooting
- Authentication errors: Ensure that the API key credential has correct broker addresses, SSL certificates, and authentication credentials if SASL is enabled.
- Schema Registry errors: Verify the schema registry URL and event name are correct. The node throws an error if it cannot fetch or encode the schema.
- Invalid headers JSON: If using JSON parameters for headers, ensure the JSON is valid and represents a flat object.
- Timeouts: Adjust the timeout option if Kafka responses take longer than expected.
- Message key required but missing: If "Use Key" is enabled, provide a non-empty key value.
- Empty topic: The topic name must be specified; otherwise, Kafka will reject the message.
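To pre-check a headers string before pasting it into the JSON parameters field, a quick validation like the following can help (the helper name is illustrative; this sketch assumes header values must be plain strings):

```javascript
// Illustrative check: the headers JSON must parse to a flat object
// whose values are plain strings (no nested objects or arrays).
function isFlatHeadersJson(text) {
  let obj;
  try { obj = JSON.parse(text); } catch { return false; }
  if (obj === null || typeof obj !== 'object' || Array.isArray(obj)) return false;
  return Object.values(obj).every(v => typeof v === 'string');
}

console.log(isFlatHeadersJson('{"source":"n8n","trace":"abc"}')); // true
console.log(isFlatHeadersJson('{"meta":{"nested":true}}'));       // false
```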