Gravity Claude

Streams responses from Claude via AWS Bedrock

Overview

This node integrates with Claude models available through AWS Bedrock, enabling users to send conversational messages and receive streamed or completed AI-generated responses. It is designed for scenarios where interactive AI assistance or content generation is needed, such as chatbots, virtual assistants, or automated content creation workflows.

The node supports streaming partial responses as they are generated, which is useful for real-time applications, and also provides a final complete response. Additionally, it can integrate with a tool provider system (MCP) to enhance AI capabilities by allowing Claude to use external tools during the conversation.
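The streaming behavior described above can be sketched as follows. This is a minimal illustration, not the node's actual internals: the event shape mirrors Bedrock's response-stream format, and the helper name is an assumption for illustration. The example uses a fake stream so no AWS call is made.

```python
import json

def collect_stream(events):
    """Consume Bedrock-style stream events, collecting partial text chunks.

    `events` is an iterable of dicts shaped like Bedrock's response stream,
    e.g. {"chunk": {"bytes": b'{"type": "content_block_delta", ...}'}}.
    Returns the list of partial chunks and the assembled final text.
    """
    chunks = []
    for event in events:
        payload = json.loads(event["chunk"]["bytes"])
        if payload.get("type") == "content_block_delta":
            chunks.append(payload["delta"].get("text", ""))
    return chunks, "".join(chunks)

# Example with a fabricated stream (no network access required):
fake_events = [
    {"chunk": {"bytes": json.dumps(
        {"type": "content_block_delta", "delta": {"text": t}}).encode()}}
    for t in ["Hel", "lo", "!"]
]
chunks, full = collect_stream(fake_events)
# chunks -> ["Hel", "lo", "!"]; full -> "Hello!"
```

The same loop shape applies whether the events come from a real Bedrock response stream or a replayed recording, which makes the chunk-handling logic easy to test in isolation.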

Practical examples:

  • Building a customer support chatbot that streams answers as Claude generates them.
  • Generating creative writing or brainstorming ideas with adjustable randomness and token limits.
  • Augmenting responses with AI-powered tools that are selected dynamically or specified by the user.

Properties

  • Model: The Claude model to use. Options: "Claude 3.5 Sonnet 2", "Claude 3.5 Haiku".
  • System Prompt: A prompt string guiding Claude's behavior throughout the conversation.
  • Message: The message to send to Claude, in JSON format specifying role and content.
  • Temperature: Controls randomness in the response, from 0 (deterministic) to 1 (very random).
  • Max Tokens: Maximum number of tokens Claude will generate in the response.
  • Use Gravity MCP provider: Whether to send output to the MCP provider; required when integrating with the MCP Client.
  • Enable Any Tool: When set to true, allows Claude to use any tool without restrictions.
  • Tool Choice: Specify a particular tool for Claude to use; leave empty for automatic selection.
  • Advanced Options: A collection of advanced settings:
    • Message State: Overrides the message state for all streamed chunks. Options: Default (auto), Thinking, Responding, Active, Waiting, Complete.
    • Progress Message: Optional progress message to accompany the message state during streaming.
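A plausible value for the Message property, following the role/content shape described above (the exact schema is Anthropic's messages format, which Bedrock uses; this example is illustrative):

```json
{
  "role": "user",
  "content": "Summarize the last three support tickets."
}
```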

Output

The node produces three outputs:

  1. Stream: An array of streamed chunks representing partial responses from Claude as they are generated. Each chunk includes formatted data about the current state of the response, optionally including progress messages and message states.

  2. Result: A single-item array containing the final complete response object. This includes the full text generated by Claude, metadata such as the number of chunks received, the original messages sent, temperature, and max tokens used.

  3. MCP: Formatted output intended for the MCP provider integration, encapsulating the full response and relevant parameters.

If an error occurs during execution, all outputs return an error object describing the failure.
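As a rough sketch of how the three outputs and the error fallback relate, the assembly step might look like the following. The field names here are assumptions based on the descriptions above, not the node's exact schema.

```python
def build_outputs(chunks, full_text, messages, temperature, max_tokens, error=None):
    """Assemble the node's three outputs; on failure, every output
    carries the same error object (field names are illustrative)."""
    if error is not None:
        err = {"error": str(error)}
        return {"stream": [err], "result": [err], "mcp": [err]}
    result = {
        "text": full_text,
        "chunkCount": len(chunks),
        "messages": messages,
        "temperature": temperature,
        "maxTokens": max_tokens,
    }
    return {
        "stream": chunks,    # partial responses in arrival order
        "result": [result],  # single-item array with the final response
        "mcp": [{"response": result, "params": {"temperature": temperature}}],
    }

outputs = build_outputs(["Hi", "!"], "Hi!",
                        [{"role": "user", "content": "Hi"}], 0.7, 512)
# outputs["result"][0]["chunkCount"] -> 2
```

The key point is that the error path replaces all three outputs uniformly, so downstream nodes can check any one of them for the error object.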

Dependencies

  • Requires valid AWS credentials with access to AWS Bedrock services.
  • Environment variable AWS_REGION should be set (defaults to us-east-1 if not).
  • Integration with the MCP provider requires enabling the corresponding option and proper MCP client setup.
  • Uses the @gravityai-dev/gravity-server package and local BedrockService module for request handling and formatting.

Troubleshooting

  • Common issues:

    • Missing or invalid AWS credentials will cause authentication failures.
    • Incorrectly formatted JSON in the "Message" property may lead to parsing errors.
    • Setting an unsupported model name will result in request errors.
    • If streaming does not work, verify network connectivity and AWS service availability.
  • Error messages:

    • Authentication errors typically indicate missing or incorrect AWS credentials.
    • JSON parse errors suggest malformed input in the "Message" field.
    • Timeout or network errors may require retrying or checking AWS region configuration.

To resolve these, ensure credentials are correctly configured in n8n, validate JSON inputs, and confirm environment variables like AWS_REGION are properly set.
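To catch the malformed-JSON case before the node runs, the Message value can be validated up front. A small helper along these lines (not part of the node; the function name and checks are illustrative) surfaces the common failure modes listed above with readable messages:

```python
import json

def validate_message(raw):
    """Parse the Message property and verify it has the expected
    role/content shape; raise ValueError with a readable reason."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError as e:
        raise ValueError(f"Message is not valid JSON: {e}") from e
    if not isinstance(msg, dict) or "role" not in msg or "content" not in msg:
        raise ValueError("Message must be a JSON object with 'role' and 'content'")
    return msg

validate_message('{"role": "user", "content": "Hello"}')  # parses cleanly
```

Running a check like this in a preceding Code node makes JSON errors show up where they were introduced, rather than as a parse failure inside the Claude node.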

Links and References

Discussion