communiti-message-aggregator

Intelligently batch and combine messages within a specified time window for n8n

Package Information

Downloads: 1 weekly / 16 monthly
Latest Version: 1.0.6
Author: Communiti Node Team

Documentation

n8n-nodes-communiti-message-aggregator

Intelligently aggregate messages with smart waiting strategies for n8n workflows.

Installation

npm install n8n-nodes-communiti-message-aggregator

Features

Processing Strategies

  • Smart Wait: Wait for a period with no new messages before processing
  • Immediate on Complete: Process immediately when detecting completion words
  • Max Count: Process when reaching maximum message count
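The three strategies can be sketched as a single decision function. This is a minimal illustration, not the node's actual internals; the function name `shouldProcess`, the strategy identifiers, and the buffer shape are all assumptions for the sake of the example:

```javascript
// Hypothetical sketch of the three processing strategies.
// buffer: [{ message, receivedAt }], opts: configuration, now: current time (ms).
// Returns the reason to flush the buffer, or null to keep waiting.
function shouldProcess(buffer, opts, now) {
  const { strategy, waitMs, maxCount, completionWords = [] } = opts;
  if (buffer.length === 0) return null;
  const last = buffer[buffer.length - 1];

  // Max Count: flush as soon as the buffer reaches the limit.
  if (buffer.length >= maxCount) return 'max_count';

  // Immediate on Complete: flush when the latest message contains a completion word.
  if (strategy === 'immediateOnComplete' &&
      completionWords.some((w) => last.message.toLowerCase().includes(w.toLowerCase()))) {
    return 'completion_word';
  }

  // Smart Wait: flush after a quiet period with no new messages.
  if (strategy === 'smartWait' && now - last.receivedAt >= waitMs) {
    return 'quiet_period';
  }
  return null;
}
```

In practice the node re-evaluates this condition as messages arrive and as the wait timer elapses; only the triggering reason differs between strategies.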

Key Capabilities

  • Group messages by any field (user_id, thread_id, session_id, etc.)
  • Multiple aggregation methods (newline, space, comma, custom separator)
  • Persistent storage with Supabase database
  • Automatic table creation and management
  • Comprehensive output with metadata and processing details
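Grouping by an arbitrary field amounts to bucketing items by the value of that field. A minimal sketch (the function name `groupByField` and the `_default` fallback key are illustrative assumptions, not part of the node's API):

```javascript
// Hypothetical sketch: bucket incoming items by a configurable field
// (user_id, thread_id, session_id, ...).
function groupByField(items, field) {
  const groups = new Map();
  for (const item of items) {
    // Items missing the field fall into a shared fallback bucket.
    const key = String(item[field] ?? '_default');
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(item);
  }
  return groups;
}
```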

šŸ”§ Setup

1. Create Supabase Credentials

  1. Go to your Supabase project dashboard
  2. Navigate to Settings > API
  3. Copy your:
    • Project URL: https://your-project.supabase.co
    • Service Role Key: (for table creation)
    • Anon Key: (for regular operations)

2. Add Credentials in n8n

  1. Go to Credentials in n8n
  2. Click Add Credential
  3. Search for "Supabase Message Aggregator API"
  4. Fill in your Supabase details

3. Setup Database Tables

Option 1: Automatic Setup (Recommended)

  1. Add Message Aggregator node to your workflow
  2. Enable Auto Create Tables option
  3. Set your desired Table Prefix (default: "message_aggregator")
  4. Execute the node - it will automatically create tables if they don't exist

Option 2: Manual Setup

  1. Open your Supabase SQL Editor
  2. Copy and run the SQL from supabase-setup.sql file (included in the package)
  3. Or run this SQL manually:
-- Create the stored procedure
CREATE OR REPLACE FUNCTION create_message_aggregator_tables(table_prefix TEXT)
RETURNS TEXT
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
    -- Create buffer table
    EXECUTE format('
        CREATE TABLE IF NOT EXISTS %I_buffer (
            id BIGSERIAL PRIMARY KEY,
            group_key TEXT NOT NULL,
            message TEXT NOT NULL,
            metadata JSONB,
            workflow_id TEXT,
            execution_id TEXT,
            created_at TIMESTAMPTZ DEFAULT NOW()
        )', table_prefix);
    
    -- Create indexes
    EXECUTE format('CREATE INDEX IF NOT EXISTS idx_%I_buffer_group_key ON %I_buffer(group_key)', table_prefix, table_prefix);
    EXECUTE format('CREATE INDEX IF NOT EXISTS idx_%I_buffer_created_at ON %I_buffer(created_at)', table_prefix, table_prefix);
    
    -- Create stats table
    EXECUTE format('
        CREATE TABLE IF NOT EXISTS %I_stats (
            id BIGSERIAL PRIMARY KEY,
            group_key TEXT NOT NULL,
            message_count INTEGER NOT NULL,
            strategy TEXT,
            trigger TEXT,
            workflow_id TEXT,
            processed_at TIMESTAMPTZ DEFAULT NOW()
        )', table_prefix);
    
    -- Create indexes
    EXECUTE format('CREATE INDEX IF NOT EXISTS idx_%I_stats_workflow_id ON %I_stats(workflow_id)', table_prefix, table_prefix);
    EXECUTE format('CREATE INDEX IF NOT EXISTS idx_%I_stats_processed_at ON %I_stats(processed_at)', table_prefix, table_prefix);
    
    RETURN 'Tables created successfully for prefix: ' || table_prefix;
END;
$$;

-- Create default tables
SELECT create_message_aggregator_tables('message_aggregator');

Note: If automatic table creation fails, the node will provide the exact SQL commands in the console log for manual execution.

šŸš€ Usage

Basic Message Aggregation

  1. Add Message Aggregator node to your workflow
  2. Set Operation to "Aggregate Messages"
  3. Configure:
    • Wait Time: How long to wait for additional messages (default: 15 seconds)
    • Group By Field: Field to group messages by (e.g., "threadId")
    • Message Field: Field containing message content (e.g., "message")
    • Combine Method: How to join messages (newline, space, custom)

Example Workflow

Webhook → Message Aggregator → Send to AI/API

Input Messages:

[
  {"threadId": "123", "message": "Hello"},
  {"threadId": "123", "message": "How are you?"},
  {"threadId": "123", "message": "I need help"}
]

Output (after 15 seconds):

{
  "threadId": "123",
  "message": "Hello\nHow are you?\nI need help",
  "messageCount": 3,
  "aggregatedAt": "2024-01-15T10:30:00Z"
}
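The step from input to output above is a join over the configured message field. A minimal sketch, assuming parameter names that mirror the configuration options (`messageField`, `combineMethod`, `customSeparator`); the actual node handles buffering and persistence around this:

```javascript
// Hypothetical sketch of the combine step.
function combineMessages(items, { messageField = 'message', combineMethod = 'newline', customSeparator = ' | ' } = {}) {
  const separators = { newline: '\n', space: ' ', comma: ', ', custom: customSeparator };
  return items.map((item) => item[messageField]).join(separators[combineMethod]);
}

const input = [
  { threadId: '123', message: 'Hello' },
  { threadId: '123', message: 'How are you?' },
  { threadId: '123', message: 'I need help' },
];
combineMessages(input); // → 'Hello\nHow are you?\nI need help'
```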

āš™ļø Configuration Options

Aggregate Messages

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| Wait Time | Number | 15 | Seconds to wait for additional messages |
| Group By Field | String | "threadId" | Field to group messages by |
| Message Field | String | "message" | Field containing message content |
| Combine Method | Options | "newline" | How to join messages |
| Custom Separator | String | " \| " | Custom separator (if selected) |
| Max Messages | Number | 50 | Maximum messages per batch |

Setup Database

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| Table Prefix | String | "msg_agg_" | Prefix for created tables |
| Database Schema | String | "public" | Database schema to use |
| Enable RLS | Boolean | true | Enable Row Level Security |

šŸ“Š Database Tables

The node automatically creates two tables:

msg_agg_buffer

Temporary storage for incoming messages

  • id: Primary key
  • group_key: Grouping field value
  • message_content: Message content
  • original_data: Full original message data
  • created_at: Message timestamp
  • expires_at: When message expires
  • workflow_id: n8n workflow ID
  • execution_id: n8n execution ID

msg_agg_stats

Statistics and monitoring

  • id: Primary key
  • workflow_id: n8n workflow ID
  • group_key: Grouping field value
  • message_count: Number of messages in batch
  • wait_time_seconds: Wait time used
  • combined_length: Length of combined message
  • processed_at: Processing timestamp

šŸ”’ Security

  • Row Level Security: Automatically enabled on created tables
  • Credential Management: Secure storage of Supabase credentials
  • Data Isolation: Each workflow execution is isolated

šŸ› ļø Development

Building

npm run build

Linting

npm run lint
npm run lintfix

Testing

npm test

šŸ“ Examples

Chat Message Batching

Perfect for chatbots that need to process multiple rapid messages:

{
  "waitTime": 10,
  "groupByField": "userId",
  "messageField": "text",
  "combineMethod": "newline"
}

Notification Aggregation

Batch notifications before sending emails:

{
  "waitTime": 60,
  "groupByField": "recipientEmail",
  "messageField": "notificationText",
  "combineMethod": "custom",
  "customSeparator": "\n• "
}
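Note that a custom separator is inserted *between* items, so with `"\n• "` the first notification gets no bullet. A small sketch of the resulting string (sample notification texts are made up for illustration):

```javascript
// Illustrative data; field names mirror the config above.
const notifications = [
  { notificationText: 'Order shipped' },
  { notificationText: 'Payment received' },
  { notificationText: 'Invoice ready' },
];

const texts = notifications.map((n) => n.notificationText);
const combined = texts.join('\n• ');
// 'Order shipped\n• Payment received\n• Invoice ready'

// Prepend the bullet yourself if every line should be bulleted:
const bulleted = '• ' + combined;
```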

Event Processing

Group related events for batch processing:

{
  "waitTime": 30,
  "groupByField": "eventSource",
  "messageField": "eventData",
  "combineMethod": "space"
}

šŸ¤ Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

šŸ“„ License

MIT License - see LICENSE file for details.

šŸ†˜ Support


Made with ā¤ļø for the n8n community
