n8n-nodes-streaming-http-request
A custom n8n community node that enables HTTP requests with real-time streaming support. Perfect for proxying streaming APIs, handling Server-Sent Events (SSE), or building real-time data pipelines.
🎯 Purpose
This node allows you to make HTTP requests and stream responses in real-time to webhook consumers. It's designed for scenarios where you need to:
- Proxy streaming APIs (OpenAI, Anthropic, etc.) through n8n workflows
- Forward AI assistant responses in real-time to end users
- Chain multiple streaming endpoints while preserving original metadata
- Handle Server-Sent Events (SSE) and newline-delimited JSON (JSONL) streams
- Build real-time data pipelines with incremental processing
✨ Features
Streaming Modes
Proxy Mode (`Proxy Streaming Chunks` enabled)
- Transparently forwards JSONL streaming envelopes from remote endpoints
- Preserves original node metadata (nodeId, nodeName, timestamps)
- Perfect for chaining n8n workflows or proxying streaming APIs
- Parses newline-delimited JSON and validates each chunk
Wrap Mode (`Proxy Streaming Chunks` disabled, default)
- Wraps raw response chunks in new streaming envelopes
- Adds current node's metadata to each chunk
- Suitable for consuming non-streaming APIs in streaming workflows
Request Configuration
- HTTP Methods: GET, POST, PUT, PATCH, DELETE, HEAD
- Query Parameters: Key-value pairs for URL query strings
- Headers: Custom HTTP headers
- Body: JSON or raw text body for POST/PUT/PATCH requests
- Authentication: Supports all standard n8n authentication methods
Response Handling
- Autodetect: Automatically determines response format (JSON, text, or binary)
- JSON: Parses and returns JSON responses
- Text: Returns plain text responses
- File (Binary): Handles binary file downloads
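The autodetect behavior can be pictured as a simple check on the response's `Content-Type` header. The sketch below is illustrative only (the function name `detectFormat` is an assumption, not the node's actual implementation):

```typescript
// Illustrative sketch: pick a response format from the Content-Type header.
// The node's real detection logic may differ.
type ResponseFormat = "json" | "text" | "file";

function detectFormat(contentType: string | undefined): ResponseFormat {
  const ct = (contentType ?? "").toLowerCase();
  if (ct.includes("application/json") || ct.includes("+json")) return "json";
  if (ct.startsWith("text/") || ct.includes("xml")) return "text";
  return "file"; // anything else is treated as binary
}
```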
📦 Installation
Via n8n Community Nodes
- Go to Settings > Community Nodes in your n8n instance
- Click Install and enter: `n8n-nodes-streaming-http-request`
- Click Install and restart n8n
Manual Installation
```shell
cd ~/.n8n/nodes
npm install n8n-nodes-streaming-http-request
```
🚀 Usage
Basic Streaming Setup
1. Add a Webhook node to your workflow
   - Set Response Mode to `Using Respond to Webhook Node`
2. Add an HTTP Request (Streaming) node
   - Configure URL and method
   - Enable the Enable Streaming option in Options
3. Add a Respond to Webhook node
   - Enable the Enable Streaming option in Options
   - Connect it after your HTTP Request (Streaming) node
Example 1: Proxy OpenAI Streaming
Webhook (streaming)
→ HTTP Request (Streaming) [POST to OpenAI API, streaming enabled]
→ Respond to Webhook (streaming)
Configuration:
- URL: `https://api.openai.com/v1/chat/completions`
- Method: `POST`
- Body: JSON with `stream: true`
- Options: `Enable Streaming = true`, `Proxy Streaming Chunks = false`
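A request body for this setup might look like the following (the model name and messages are placeholders; only `stream: true` is required for streaming):

```json
{
  "model": "gpt-4o-mini",
  "stream": true,
  "messages": [
    { "role": "user", "content": "Hello!" }
  ]
}
```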
Example 2: Chain Multiple n8n Workflows
Webhook A (streaming)
→ HTTP Request (Streaming) [POST to Webhook B, proxy mode]
→ Respond to Webhook (streaming)
Webhook B (on another workflow)
→ AI Agent/Processing
→ Respond to Webhook (streaming)
Configuration:
- URL: `https://your-n8n.com/webhook/workflow-b`
- Method: `POST`
- Options: `Enable Streaming = true`, `Proxy Streaming Chunks = true` ← Key setting!
This preserves the original node names and IDs from Workflow B's response.
Example 3: Non-Streaming Mode
Use it as a regular HTTP Request node with enhanced response handling:
Manual Trigger
→ HTTP Request (Streaming) [GET request, streaming disabled]
→ Process Response
Configuration:
- URL: `https://api.example.com/data`
- Method: `GET`
- Options: `Enable Streaming = false`, `Response Format = json`
⚙️ Options
| Option | Type | Default | Description |
|---|---|---|---|
| Enable Streaming | Boolean | `true` | Enable real-time streaming to the webhook response |
| Proxy Streaming Chunks | Boolean | `false` | Forward JSONL envelopes with original metadata (for proxying n8n workflows) |
| Response Format | Select | `autodetect` | How to parse the response: autodetect, json, text, or file |
| Full Response | Boolean | `false` | Return the complete response, including headers and status code |
| Output Property Name | String | `data` | Property name for response data in the output |
| Timeout | Number | `300000` | Request timeout in milliseconds |
🔧 Technical Details
Streaming Envelope Format
When streaming is enabled, data is sent as newline-delimited JSON (JSONL):
```jsonl
{"type":"begin","metadata":{"nodeId":"abc123","nodeName":"My Node","itemIndex":0,"runIndex":0,"timestamp":1234567890}}
{"type":"item","content":"Hello","metadata":{"nodeId":"abc123","nodeName":"My Node","itemIndex":0,"runIndex":0,"timestamp":1234567891}}
{"type":"item","content":" World","metadata":{"nodeId":"abc123","nodeName":"My Node","itemIndex":0,"runIndex":0,"timestamp":1234567892}}
{"type":"end","metadata":{"nodeId":"abc123","nodeName":"My Node","itemIndex":0,"runIndex":0,"timestamp":1234567893}}
```
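A consumer can reassemble the streamed content by parsing these envelopes line by line. A minimal sketch (the `Envelope` type and helper names below are illustrative, mirroring the fields shown above):

```typescript
// Illustrative types mirroring the envelope fields shown above.
interface EnvelopeMetadata {
  nodeId: string;
  nodeName: string;
  itemIndex: number;
  runIndex: number;
  timestamp: number;
}

interface Envelope {
  type: "begin" | "item" | "end";
  content?: string;
  metadata: EnvelopeMetadata;
}

// Parse a JSONL payload into envelopes, skipping blank lines.
function parseEnvelopes(payload: string): Envelope[] {
  return payload
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as Envelope);
}

// Concatenate the content of all "item" envelopes.
function joinContent(envelopes: Envelope[]): string {
  return envelopes
    .filter((e) => e.type === "item")
    .map((e) => e.content ?? "")
    .join("");
}
```

Applied to the example stream above, `joinContent` would yield `"Hello World"`.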
Proxy Mode (Metadata Preservation)
When Proxy Streaming Chunks is enabled:
- Buffers incoming data and splits by newlines
- Parses each complete line as JSON
- Validates that each is a proper streaming envelope (`type` + `metadata`)
- Forwards via hooks with the original metadata intact
- Falls back to the standard `sendChunk` if hooks are unavailable
This allows transparent proxying of streaming responses from other n8n workflows without modifying node metadata.
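The buffering step matters because a network chunk can end mid-line. A minimal sketch of the split-and-validate logic (illustrative only; `EnvelopeBuffer` is an assumed name, not the node's source):

```typescript
// Accumulates raw chunks and yields only complete, valid JSONL envelope lines.
// Illustrative sketch; the node's actual implementation may differ.
class EnvelopeBuffer {
  private buffer = "";

  // Feed a raw chunk; returns the envelopes completed by this chunk.
  push(chunk: string): object[] {
    this.buffer += chunk;
    const lines = this.buffer.split("\n");
    this.buffer = lines.pop() ?? ""; // keep the trailing partial line buffered
    const envelopes: object[] = [];
    for (const line of lines) {
      if (!line.trim()) continue;
      try {
        const parsed = JSON.parse(line);
        // A valid envelope carries both `type` and `metadata`.
        if (parsed && typeof parsed.type === "string" && parsed.metadata) {
          envelopes.push(parsed);
        }
      } catch {
        // Not valid JSON: skip (the real node falls back to wrap mode).
      }
    }
    return envelopes;
  }
}
```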
🛠️ Development
Build from Source
```shell
git clone <repository-url>
cd streaming-http-request-node
npm install
npm run build
```
Link for Local Testing
```shell
npm link
cd <your-n8n-directory>
npm link n8n-nodes-streaming-http-request
```
Restart n8n after linking.
📝 License
MIT
🤝 Contributing
Contributions are welcome! Please open an issue or submit a pull request.
💡 Tips
- Use Proxy Mode when forwarding responses from another n8n streaming workflow
- Use Wrap Mode when consuming external streaming APIs (OpenAI, Anthropic, etc.)
- For large responses, consider increasing the timeout value
- Enable Full Response if you need access to headers or status codes
⚠️ Troubleshooting
Problem: Seeing "undefined" in output
Solution: Ensure the remote endpoint is returning JSONL format when using Proxy Mode
Problem: Metadata is replaced with HTTP Request node's metadata
Solution: Enable Proxy Streaming Chunks option and ensure hooks are available (should work by default in n8n)
Problem: Streaming not working
Solution: Verify that:
- Webhook is set to "Using Respond to Webhook Node"
- Respond to Webhook has streaming enabled
- HTTP Request (Streaming) has `Enable Streaming = true`
Version: 0.2.4
Author: Leandro Menezes