Overview
This node interacts with a data platform called Lake to perform operations on rows within tables. Specifically, the "Row" resource's "Update" operation allows users to update existing rows in a specified table. This is useful for scenarios where you need to modify records programmatically based on incoming data, such as updating customer information, inventory counts, or any tabular data managed in Lake.
For example, you might use this node to:
- Update user profile details when new information arrives from another system.
- Modify order statuses in an e-commerce database.
- Change values in a project management table based on external triggers.
The node supports multiple API versions and authentication methods, adapting its requests accordingly.
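As a rough illustration, an incoming item driving an update in "Auto-Map Input Data to Columns" mode might look like the sketch below. All field and column names here are invented for the example and are not part of the node itself.

```ts
// Illustrative incoming item for an order-status update. The field names are
// assumptions made for this example, not properties defined by the Lake node.
const incomingItem = {
  json: {
    orderId: 42,                        // used for the "Row ID Value" parameter,
                                        //   e.g. via the expression {{ $json.orderId }}
    status: "shipped",                  // auto-mapped to a "status" column
    shipped_at: "2024-05-01T10:15:00Z", // auto-mapped to a "shipped_at" column
  },
};
// If "orderId" is not itself a column, it can be listed under "Inputs to Ignore"
// so it is excluded from the update payload.
```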
Properties
| Name | Meaning |
|---|---|
| Authentication | Choose between two authentication methods: "API Token" or "User Token". |
| API Version | Select the API version to use: "Before v0.90.0", "v0.90.0 Onwards", or "v0.200.0 Onwards" (referred to below as API versions 1, 2, and 3 respectively). This affects endpoint URLs and request formats. |
| Workspace Name or ID | (Only for API version 3) Select or specify the workspace by name or ID where the base resides. |
| Base Name or ID | (API version 2 and 3) Select or specify the base/project by name or ID that contains the target table. |
| Project ID | (API version 1) The ID of the project containing the table. |
| Table Name or ID | (API version 2 and 3) Select or specify the table by name or ID where the row exists. |
| Table | (API version 1) The name of the table where the row exists. |
| Primary Key Type | (API versions 1 and 2, for delete and update operations) Defines which primary key field to use: "Default" (id), "Imported From Airtable" (ncRecordId), or "Custom" (specify custom primary key field name). |
| Field Name (customPrimaryKey) | (API versions 1 and 2, when Primary Key Type is "Custom") Specify the name of the custom primary key field. |
| Row ID Value (id) | The value of the primary key identifying the row to update. Required for update operations in API versions 1 and 2. |
| Data to Send | Choose how to provide data for the update: "Auto-Map Input Data to Columns" (input JSON fields must match column names) or "Define Below for Each Column" (manually specify fields and values). See the example after this table. |
| Inputs to Ignore | (When using auto-map mode) Comma-separated list of input properties to exclude from being sent to the API. |
| Fields to Send | (When defining fields manually) A collection of field entries; each specifies the field name, whether it holds binary data, and either the value to set or the name of the binary property to upload. |
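The sketch below contrasts the two "Data to Send" modes. The property names used for the manual field collection (fieldName, fieldValue, binaryData, binaryProperty) are assumptions for illustration and may not match the node's actual parameter labels.

```ts
// 1) "Auto-Map Input Data to Columns": the item's JSON is sent as-is, minus any
//    properties listed under "Inputs to Ignore" (here "_traceId" is assumed to be
//    pipeline metadata rather than a table column).
const autoMappedItem = {
  json: { name: "Ada Lovelace", email: "ada@example.com", _traceId: "abc-123" },
};
const inputsToIgnore = ["_traceId"];

// 2) "Define Below for Each Column": each field is declared explicitly. Binary
//    fields reference a binary property on the item instead of a literal value.
const fieldsToSend = [
  { fieldName: "name", binaryData: false, fieldValue: "Ada Lovelace" },
  { fieldName: "email", binaryData: false, fieldValue: "ada@example.com" },
  { fieldName: "avatar", binaryData: true, binaryProperty: "data" },
];
```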
Output
The output is an array of JSON objects representing the updated rows. For API version 3, the response includes the updated record data merged with the input data. For earlier versions, the output reflects the updated fields or success status per row.
If binary data fields are included, the node handles uploading files and stores references to these uploads in the respective fields.
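As a hedged sketch, the returned items might look roughly like the following; the column names are invented, and the exact shape depends on your table and the selected API version.

```ts
// Possible shape for API version 3: the updated record merged with the input data.
const outputV3 = [
  { json: { Id: 42, status: "shipped", shipped_at: "2024-05-01T10:15:00Z" } },
];

// Possible shape for earlier versions: only the updated fields, or a success flag.
const outputV1 = [
  { json: { status: "shipped" } },
];
```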
Dependencies
- Requires an API authentication token or user token credential configured in n8n.
- Depends on the Lake API endpoints, which vary by selected API version.
- Uses helper functions for making HTTP requests and handling binary data uploads (a hedged request sketch follows this list).
- For API version 3, workspace and base selection is required.
- The node dynamically loads options for workspaces, bases/projects, and tables via API calls.
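For orientation only, the request the node builds for an update resembles the sketch below. The base URL, path segments, and authentication header are assumptions; check your Lake instance's documentation for the real endpoint of your selected API version.

```ts
// Hypothetical request sketch, not the node's actual implementation.
async function updateRow(
  baseUrl: string,   // e.g. "https://lake.example.com" (assumption)
  projectId: string,
  tableId: string,
  rowId: string,
  fields: Record<string, unknown>,
  apiToken: string,
): Promise<unknown> {
  const res = await fetch(
    `${baseUrl}/api/v1/db/data/${projectId}/${tableId}/${rowId}`, // hypothetical path
    {
      method: "PATCH",
      headers: {
        "Content-Type": "application/json",
        "xc-token": apiToken, // header name is an assumption
      },
      body: JSON.stringify(fields),
    },
  );
  if (!res.ok) throw new Error(`Update failed: ${res.status}`);
  return res.json();
}
```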
Troubleshooting
Error: Row not found or could not be updated
This occurs if the specified row ID does not exist. Verify the primary key value and ensure the row exists in the target table.
Authentication errors
Ensure the correct authentication method is selected and valid credentials are provided.
Field mapping issues in auto-map mode
If input data fields do not exactly match the table columns, updates may fail or send incorrect data. Use the manual field definition mode to avoid this.
Binary file upload failures
When uploading binary data, ensure the binary property name is correct and the file data is accessible.
API version mismatch
Selecting the wrong API version can cause endpoint errors. Confirm the API version matches your Lake instance.
Links and References
- n8n Expressions Documentation
- Lake API documentation (refer to your Lake instance docs for specific API version details)
- n8n Binary Data Handling Guide (for managing file uploads)