csv-batch-streamer

Process CSV files in batches of rows

Package Information

Downloads: 167 weekly / 167 monthly
Latest Version: 0.1.1
Author: Max Rubis

Documentation

n8n-nodes-csv-batch-streamer

This is an n8n community node that provides a CSV Batch Streaming node for processing large CSV files row by row. It reads CSV files from the local filesystem without loading the entire file into memory, making it well suited to processing large datasets efficiently.

n8n is a fair-code licensed workflow automation platform.

Installation

Follow the installation guide in the n8n community nodes documentation.

Features

  • Memory Efficient: Processes CSV files in batches of rows without loading the entire file into memory
  • Configurable CSV Options:
    • Custom delimiter support
    • Header row handling
    • Empty line skipping
  • Processing Control: Configurable delay between processing rows to manage resource usage
  • Dual Outputs:
    • 'loop' output for each processed row
    • 'done' output when processing is complete

Configuration

Source Options

  1. Local File
    • Required: File path to the CSV file

CSV Options

  • Delimiter: Character used to separate fields (default: ',')
  • Has Header: Whether the CSV file contains a header row (default: true)
  • Skip Empty Lines: Option to skip empty lines in the CSV (default: true)
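As a rough sketch, the three CSV options above could be applied to a raw line like this. The names (`CsvOptions`, `parseLine`, `toObject`) are illustrative only and are not the node's actual API; a real parser would also handle quoted fields, which this sketch omits.

```typescript
// Illustrative configuration shape mirroring the documented options.
interface CsvOptions {
  delimiter: string;       // character separating fields (default ',')
  hasHeader: boolean;      // whether the first row holds column names (default true)
  skipEmptyLines: boolean; // whether blank lines are dropped (default true)
}

// Split one raw line into fields; returns null for a skipped empty line.
function parseLine(line: string, opts: CsvOptions): string[] | null {
  if (opts.skipEmptyLines && line.trim() === "") return null;
  return line.split(opts.delimiter);
}

// Map a data row onto the header row to produce a JSON object.
function toObject(header: string[], row: string[]): Record<string, string> {
  return Object.fromEntries(
    header.map((key, i): [string, string] => [key, row[i] ?? ""])
  );
}

const opts: CsvOptions = { delimiter: ";", hasHeader: true, skipEmptyLines: true };
const header = parseLine("id;name", opts)!;
const row = parseLine("1;Alice", opts)!;
console.log(toObject(header, row)); // { id: '1', name: 'Alice' }
```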

Processing Options

  • Processing Delay: Optional delay (in milliseconds) between processing rows
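The processing delay can be pictured as an awaited timer between units of work. This is a minimal sketch under the assumption that the node pauses between batches; the helper names (`sleep`, `processBatches`) are hypothetical.

```typescript
// Resolve after the given number of milliseconds.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Process batches sequentially, throttling with an optional delay
// to manage resource usage; returns the total row count.
async function processBatches(batches: string[][], delayMs: number): Promise<number> {
  let processed = 0;
  for (const batch of batches) {
    processed += batch.length;             // stand-in for real per-batch work
    if (delayMs > 0) await sleep(delayMs); // pause before the next batch
  }
  return processed;
}

processBatches([["a", "b"], ["c"]], 10).then((n) => console.log(n)); // 3
```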

Usage

  1. Add the CSV Batch Streamer node to your workflow
  2. Configure the file path
  3. Optional: Adjust CSV parsing options and processing delay
  4. Connect the 'loop' output to nodes that should process each row
  5. Connect the 'done' output to nodes that should run after all rows are processed

The node reads the CSV file in batches of rows, emitting each batch as a JSON array of objects through the 'loop' output. Once all rows have been processed, it triggers the 'done' output.
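The batch-streaming behavior described above can be sketched with Node's readline module, which reads the file line by line rather than all at once. This is an assumption-laden illustration, not the node's implementation; `streamCsvBatches` and `batchSize` are invented names, and quoting/escaping is omitted.

```typescript
import { createReadStream, writeFileSync } from "node:fs";
import { createInterface } from "node:readline";

// Yield batches of row objects without holding the whole file in memory.
// Each yielded batch corresponds to one emission on the 'loop' output;
// the generator finishing corresponds to the 'done' output firing.
async function* streamCsvBatches(
  path: string,
  batchSize: number,
  delimiter = ","
): AsyncGenerator<Record<string, string>[]> {
  const rl = createInterface({ input: createReadStream(path), crlfDelay: Infinity });
  let header: string[] | null = null;
  let batch: Record<string, string>[] = [];
  for await (const line of rl) {
    if (line.trim() === "") continue;           // skip empty lines
    const fields = line.split(delimiter);
    if (!header) { header = fields; continue; } // first row is the header
    const rowObj: Record<string, string> = {};
    header.forEach((key, i) => (rowObj[key] = fields[i] ?? ""));
    batch.push(rowObj);
    if (batch.length === batchSize) { yield batch; batch = []; }
  }
  if (batch.length) yield batch;                // flush the final partial batch
}

// Demo: write a tiny CSV and stream it in batches of 2.
writeFileSync("/tmp/demo.csv", "id,name\n1,Alice\n2,Bob\n3,Cara\n");
(async () => {
  for await (const b of streamCsvBatches("/tmp/demo.csv", 2)) {
    console.log(b.length); // 2, then 1
  }
})();
```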


License

MIT
