
Kafka SSL Trigger

Consume messages from a Kafka topic

Overview

This node is a Kafka SSL Trigger that consumes messages from a specified Kafka topic securely over SSL. It connects to a Kafka cluster, subscribes to the topic, and listens for incoming messages in real time. The node supports advanced options such as decoding messages via the Confluent Schema Registry, JSON parsing of message values, and controlling consumer group behavior.
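Triggers of this kind are typically built on KafkaJS. As a rough, hedged sketch (not this node's actual source; broker addresses, file paths, and identifiers below are placeholders), an SSL consumer subscription looks like:

```typescript
// Illustrative sketch of an SSL-secured KafkaJS consumer, roughly what this
// trigger does internally. All values below are placeholders.
import { Kafka } from "kafkajs";
import { readFileSync } from "fs";

const kafka = new Kafka({
  clientId: "n8n-kafka-ssl-trigger",          // hypothetical client id
  brokers: ["broker1.example.com:9093"],      // placeholder broker list
  ssl: {
    rejectUnauthorized: true,
    ca: [readFileSync("/path/to/ca.pem", "utf-8")],
    cert: readFileSync("/path/to/client-cert.pem", "utf-8"),
    key: readFileSync("/path/to/client-key.pem", "utf-8"),
  },
});

async function run(): Promise<void> {
  const consumer = kafka.consumer({ groupId: "my-consumer-group" });
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ topic, message }) => {
      // message.value is a Buffer (or null); the node turns this into a workflow item
      console.log(topic, message.value?.toString());
    },
  });
}

run().catch(console.error);
```

This requires a reachable broker, so it is a configuration sketch rather than something to run as-is.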

Common scenarios where this node is beneficial:

  • Integrating Kafka event streams into n8n workflows for real-time data processing.
  • Consuming messages from Kafka topics that require SSL encryption and authentication.
  • Processing Avro or other schema-based messages via Confluent Schema Registry.
  • Handling ordered or parallel message processing depending on workflow needs.

Practical examples:

  • Triggering a workflow whenever a new order event is published to a Kafka topic.
  • Consuming sensor data streams securely from an IoT Kafka cluster.
  • Decoding Avro-encoded messages using a schema registry before further processing.
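When "Use Schema Registry" is enabled, decoding is typically handled by a Confluent Schema Registry client. A minimal sketch, assuming the @kafkajs/confluent-schema-registry package and a placeholder registry URL:

```typescript
// Hedged sketch: decoding an Avro-encoded message value against Confluent
// Schema Registry, similar to what this node does when "Use Schema Registry"
// is enabled. The registry host is a placeholder.
import { SchemaRegistry } from "@kafkajs/confluent-schema-registry";

const registry = new SchemaRegistry({
  host: "http://schema-registry.example.com:8081", // placeholder URL
});

// Inside a consumer's eachMessage handler: the schema id is embedded in the
// message's wire format, so decode() resolves the schema and deserializes.
async function decodeValue(raw: Buffer): Promise<unknown> {
  return registry.decode(raw);
}
```

This depends on a running registry service, so it is shown as a configuration sketch only.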

Properties

Name and meaning of each property:

  • Topic: Name of the Kafka topic (queue) to consume messages from.
  • Group ID: Identifier for the Kafka consumer group. Consumers with the same group ID share the load of reading from the topic's partitions.
  • Use Schema Registry: Whether to use Confluent Schema Registry to decode messages.
  • Schema Registry URL: URL of the Confluent Schema Registry service (required when "Use Schema Registry" is enabled).
  • Options: Collection of additional configuration options:
      - Allow Topic Creation: Whether to allow the topic to be created automatically if it does not yet exist.
      - Auto Commit Threshold: Number of messages after which the consumer automatically commits offsets.
      - Auto Commit Interval: Time interval (in milliseconds) after which the consumer automatically commits offsets.
      - Heartbeat Interval: Interval (in milliseconds) between heartbeats that keep the consumer session alive; must be less than the Session Timeout.
      - Max Number of Requests: Maximum number of unacknowledged requests the client will send on a single connection.
      - Read Messages From Beginning: Whether to read messages from the beginning of the topic or only new messages.
      - JSON Parse Message: Whether to attempt to parse the message value as JSON.
      - Parallel Processing: Whether to process messages in parallel or sequentially to preserve order.
      - Only Message: When JSON parsing is enabled, whether to return only the parsed message content instead of the full object with metadata.
      - Return Headers: Whether to include Kafka message headers in the output.
      - Session Timeout: Time (in milliseconds) to wait for a response before the session is considered timed out.
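To illustrate how these option names line up with client-level consumer settings, here is a hypothetical helper (an assumption for illustration, not the node's actual source) that maps them onto a KafkaJS-style consumer configuration and enforces the heartbeat/session-timeout constraint:

```typescript
// Hypothetical mapping from the options above to KafkaJS-style consumer
// settings. Note that "Read Messages From Beginning" maps to subscribe()
// (fromBeginning), and the auto-commit options map to run(), not to the
// consumer() configuration built here.
interface TriggerOptions {
  sessionTimeout?: number;      // ms; "Session Timeout"
  heartbeatInterval?: number;   // ms; "Heartbeat Interval"; must be < sessionTimeout
  maxInFlightRequests?: number; // "Max Number of Requests"
  fromBeginning?: boolean;      // "Read Messages From Beginning" (used by subscribe())
  autoCommitThreshold?: number; // "Auto Commit Threshold" (used by run())
  autoCommitInterval?: number;  // "Auto Commit Interval" (used by run())
}

function toConsumerConfig(groupId: string, o: TriggerOptions) {
  if (
    o.heartbeatInterval !== undefined &&
    o.sessionTimeout !== undefined &&
    o.heartbeatInterval >= o.sessionTimeout
  ) {
    throw new Error("Heartbeat Interval must be less than Session Timeout");
  }
  return {
    groupId,
    sessionTimeout: o.sessionTimeout ?? 30000,      // KafkaJS default
    heartbeatInterval: o.heartbeatInterval ?? 3000, // KafkaJS default
    maxInFlightRequests: o.maxInFlightRequests ?? null,
  };
}
```

For example, `toConsumerConfig("my-group", { sessionTimeout: 10000, heartbeatInterval: 2000 })` yields a valid configuration, while swapping those two values throws.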

Output

The node outputs an array of JSON objects representing consumed Kafka messages. Each output item contains:

  • message: The message payload, either as a string or parsed JSON object depending on settings.
  • topic: The Kafka topic name from which the message was received.
  • headers (optional): An object containing message headers if the "Return Headers" option is enabled.

If "Only Message" is enabled along with JSON parsing, the output will contain just the parsed message content without additional metadata.

The node does not output binary data.
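The output shaping described above can be sketched as a small helper (names are illustrative, not the node's actual source):

```typescript
// Illustrative sketch of how each output item is shaped. With "JSON Parse
// Message" enabled, invalid JSON falls back to the raw string rather than
// raising an error, matching the behavior described in this document.
interface ShapeOptions {
  jsonParseMessage?: boolean;
  onlyMessage?: boolean;
  returnHeaders?: boolean;
}

function shapeOutput(
  raw: string,
  topic: string,
  headers: Record<string, string>,
  opts: ShapeOptions,
): unknown {
  let message: unknown = raw;
  if (opts.jsonParseMessage) {
    try {
      message = JSON.parse(raw);
    } catch {
      message = raw; // silent fallback to the raw string
    }
  }
  if (opts.jsonParseMessage && opts.onlyMessage) {
    return message; // just the parsed content, no metadata
  }
  const item: Record<string, unknown> = { message, topic };
  if (opts.returnHeaders) {
    item.headers = headers;
  }
  return item;
}
```

So `shapeOutput('{"a":1}', "orders", {}, { jsonParseMessage: true, onlyMessage: true })` returns just `{ a: 1 }`, while disabling "Only Message" wraps it with the topic (and headers, if enabled).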

Dependencies

  • Requires a Kafka cluster accessible over SSL.
  • Requires valid SSL credentials configured in n8n for secure connection.
  • Optionally requires access to a Confluent Schema Registry service if schema decoding is enabled.
  • Needs appropriate Kafka consumer group configuration.
  • Requires an API key credential or username/password for Kafka SASL authentication if enabled.

Troubleshooting

  • Authentication errors: Ensure SSL certificates and/or SASL credentials are correctly configured in n8n credentials.
  • Connection failures: Verify broker URLs, network connectivity, and SSL settings.
  • Schema decoding errors: Confirm the Schema Registry URL is correct and accessible; ensure schemas exist for the topic's messages.
  • JSON parse errors: If "JSON Parse Message" is enabled but a message is not valid JSON, parsing fails silently and the raw string is returned instead.
  • Message ordering issues: If strict message order is required, disable "Parallel Processing" to process messages sequentially.
  • Offset commit problems: Adjust "Auto Commit Threshold" and "Auto Commit Interval" to control offset commits and avoid duplicate processing.
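The ordering trade-off behind "Parallel Processing" can be sketched as follows: awaiting each handler in turn preserves order, while firing handlers concurrently does not (function names are illustrative):

```typescript
// Hedged sketch of the ordering trade-off behind "Parallel Processing".
async function processSequentially<T>(
  msgs: T[],
  handle: (m: T) => Promise<void>,
): Promise<void> {
  // One message at a time: arrival order is preserved.
  for (const m of msgs) await handle(m);
}

async function processInParallel<T>(
  msgs: T[],
  handle: (m: T) => Promise<void>,
): Promise<void> {
  // All handlers run concurrently: completion order is not guaranteed.
  await Promise.all(msgs.map(handle));
}
```

Disable "Parallel Processing" when downstream steps depend on messages arriving in the order they were produced within a partition.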
