n8n-nodes-langfuse
This is an n8n community node. It lets you use Langfuse in your n8n workflows.
Langfuse is an open-source LLM engineering platform that provides observability, metrics, evaluations, prompt management and a playground.
n8n is a fair-code licensed workflow automation platform.
Installation
Operations
Credentials
Development
Resources
Installation
Self-hosted n8n
Follow the installation guide in the n8n community nodes documentation.
npm install @langfuse/n8n-nodes-langfuse
n8n Cloud
This is a verified community node. Search for Langfuse to use this node in n8n Cloud.
Operations
This node supports the following resources and operations:
Prompt
Get prompts from Langfuse Prompt Management.
Get Prompt
- Enter the name of the prompt
- Enter the label that identifies the prompt version you want to fetch. Defaults to "production". Learn more about prompt labels in Langfuse here.
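For orientation, here is a rough sketch of the same lookup using the Langfuse TypeScript SDK (the langfuse npm package). The prompt name "agent-system-prompt" and the credential placeholders are illustrative; the node exposes the same name and label fields in its UI:

import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  baseUrl: "https://cloud.langfuse.com", // or your self-hosted Langfuse URL
  publicKey: "pk-lf-...",
  secretKey: "sk-lf-...",
});

// Fetch the prompt version currently tagged with the "production" label
const prompt = await langfuse.getPrompt("agent-system-prompt", undefined, { label: "production" });
const systemPrompt = prompt.compile(); // resolves any {{variables}} in the template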
Example workflow that retrieves the system prompt for the agent from Langfuse:
Trace
Create and update traces for LLM observability and monitoring.
Create Trace
- Create a new trace to track an AI workflow
- Supports: trace name, custom ID, input/output data, metadata, user ID, session ID, tags, and public visibility
Update Trace
- Update an existing trace with new data
- Useful for adding final outputs or updating metadata
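As a hedged sketch of what Create Trace and Update Trace correspond to at the SDK level, reusing the langfuse client from the sketch above (field values are made up; the node collects the same fields through its UI):

// Create a trace at the start of a workflow run
const trace = langfuse.trace({
  name: "support-bot-run",
  userId: "user-123",
  sessionId: "session-456",
  input: { question: "How do I reset my password?" },
  tags: ["n8n", "support"],
});

// Later, update the same trace with the final output
trace.update({ output: { answer: "..." }, metadata: { resolved: true } });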
Observation
Create observations (spans, generations, and events) to track operations within a trace.
Create Span
- Track general operations like data processing or API calls
- Supports: input/output, metadata, parent observation ID, timestamps, log levels
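A sketch of a span around a non-LLM step, reusing the trace client from the sketch above (names and values are illustrative):

// Track a data-processing step inside the trace
const span = trace.span({ name: "fetch-customer-record", input: { customerId: 42 } });
// ... do the work ...
span.end({ output: { found: true } });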
Create Generation
- Track LLM calls with detailed model information
- Supports: model name, model parameters, token usage (prompt, completion, total tokens)
- Ideal for monitoring OpenAI, Anthropic, or other LLM provider calls
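The equivalent generation call might look like the following sketch, continuing the trace and systemPrompt variables from the examples above (model name, parameters, and token counts are illustrative):

const generation = trace.generation({
  name: "answer-generation",
  model: "gpt-4o-mini",
  modelParameters: { temperature: 0.2, max_tokens: 512 },
  input: [{ role: "system", content: systemPrompt }, { role: "user", content: "..." }],
});

generation.end({
  output: { role: "assistant", content: "..." },
  usage: { promptTokens: 310, completionTokens: 58, totalTokens: 368 },
});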
Create Event
- Record discrete events like errors, warnings, or milestones
- Supports: input/output, log levels, status messages
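And a sketch of a discrete event recorded on the same trace (name and message are illustrative):

trace.event({
  name: "retrieval-empty",
  level: "WARNING",
  statusMessage: "Vector search returned no documents",
  input: { query: "password reset" },
});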
Score
Create evaluation scores for traces or observations.
Create Score
- Evaluate LLM outputs with numeric, categorical, or boolean scores
- Supports: score name, value, comments, data types, config IDs
- Useful for quality assessment, automated evaluation, or user feedback collection
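A minimal scoring sketch against the trace from the examples above (score name and value are illustrative):

langfuse.score({
  traceId: trace.id,
  name: "helpfulness",
  value: 0.9,
  comment: "Answer matched the documented reset procedure",
  // dataType and configId can also be set for categorical or boolean scores
});

// In short-lived scripts, flush queued events before exiting
await langfuse.flushAsync();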
Use Cases
- LLM Observability: Track all LLM calls in your n8n workflows with detailed metrics
- Performance Monitoring: Monitor token usage, latency, and costs across your AI operations
- Quality Evaluation: Automatically score LLM outputs based on custom criteria
- Debugging: Trace complete execution flows to identify issues in AI pipelines
- Prompt Management: Centralize and version control your prompts in Langfuse
Credentials
To use this node, you need to authenticate with Langfuse. You'll need:
- A Langfuse account, either Langfuse Cloud or self-hosted.
- API credentials from your Langfuse project settings: hostname, public key, and secret key
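If you want to sanity-check these credentials outside n8n, the same three values initialize the Langfuse TypeScript SDK (a sketch; the environment variable names below are illustrative, not required by the node):

import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  baseUrl: process.env.LANGFUSE_HOST,        // hostname, e.g. https://cloud.langfuse.com or your self-hosted URL
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
});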
Development
Prerequisites
You need the following installed on your development machine:
- git
- Node.js and npm. The minimum supported version is Node.js 20. You can find instructions on how to install both using nvm (Node Version Manager) for Linux, Mac, and WSL here. For Windows users, refer to Microsoft's guide to Install NodeJS on Windows.
- Install n8n with: npm install n8n -g
- Recommended: follow n8n's guide to set up your development environment.
Build a new version
npm run build
npm link
Test in local n8n
cd ~/.n8n/custom
npm link @langfuse/n8n-nodes-langfuse
Verification
Run the verification script to ensure everything is set up correctly:
./verify.sh
This will:
- ✅ Check project structure
- ✅ Build the project
- ✅ Verify compiled output
- ✅ Run ESLint checks
- ✅ Validate all features are implemented
Import the test workflow (test-workflow.json) into n8n to try out all the features. See TESTING.md for detailed testing instructions.
Resources
- n8n community nodes documentation
- Langfuse documentation
- Langfuse Prompt Management
- Langfuse Tracing