## Overview
The Lake node lets you interact with data stored in a Lake database by performing operations on rows within tables. The Update Row operation updates an existing row in a specified table. This is useful when you need to modify records programmatically based on incoming data or workflow logic.
Common use cases include:
- Updating customer information in a CRM table when new details arrive.
- Modifying inventory quantities in a product table after sales.
- Changing status fields in task management tables as workflows progress.
For example, if you have an automation that processes form submissions, you can update corresponding rows in your Lake database to reflect the latest user inputs.
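The form-submission pattern can be pictured as follows. This is an illustrative sketch only (the field names and the split between primary key and data are hypothetical, not the node's actual code):

```python
# Hypothetical sketch: turn a form submission into an Update Row call.
form_submission = {"id": "123", "name": "Updated Name", "status": "active"}

# The row to update is identified by its primary key value; the
# remaining keys become the fields to send in the update.
row_id = form_submission["id"]
fields = {k: v for k, v in form_submission.items() if k != "id"}
# row_id -> "123"; fields -> {"name": "Updated Name", "status": "active"}
```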
## Properties
| Name | Meaning |
|---|---|
| Authentication | Choose the authentication method: - API Token - User Token |
| API Version | Select the API version to use: - Before v0.90.0 - v0.90.0 Onwards - v0.200.0 Onwards |
| Workspace Name or ID | (Only for API version v0.200.0 Onwards) Select or specify the workspace containing the base. |
| Base Name or ID | (API v0.200.0 Onwards) Select or specify the base (project) within the workspace. (API v0.90.0 Onwards) Select or specify the project. (Before v0.90.0) Enter the project ID as string. |
| Table Name or ID | Select or specify the table where the row exists. |
| Primary Key Type | (API versions before v0.200.0) Specify the primary key type used in the table: - Default (id) - Imported From Airtable (ncRecordId) - Custom (enter custom primary key field name) |
| Field Name (Custom PK) | (If Primary Key Type is Custom) Enter the name of the primary key field. |
| Row ID Value | The value of the primary key identifying the row to update. Required for API versions before v0.200.0. |
| Data to Send | How to provide the data for the update: - Auto-Map Input Data to Columns (input JSON keys must match column names) - Define Below for Each Column (manually specify fields and values) |
| Inputs to Ignore | (When using Auto-Map) Comma-separated list of input properties to exclude from sending. Leave empty to send all. |
| Fields to Send | (When using Define Below) Add one or more fields specifying: - Field Name - Whether the field is binary file data - Field Value or Binary Property name containing the file data to upload |
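The interaction between Auto-Map and Inputs to Ignore can be sketched like this (a minimal illustration, assuming a hypothetical input item; the node's own mapping logic is not shown in this document):

```python
# Hypothetical input item whose keys match the table's column names.
item = {"name": "Ada", "email": "ada@example.com", "internal_note": "skip me"}

# "Inputs to Ignore" is a comma-separated string from the node parameter.
inputs_to_ignore = "internal_note"

ignored = {s.strip() for s in inputs_to_ignore.split(",") if s.strip()}
payload = {k: v for k, v in item.items() if k not in ignored}
# payload now contains only "name" and "email".
```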
## Output
The output is an array of JSON objects representing the updated rows. The structure depends on the API version:
- For older API versions, the output includes the updated fields along with the primary key.
- For newer API versions (v0.200.0 Onwards), the output merges the sent data with the response from the API, which may include additional metadata about the updated record.
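The merge for newer API versions can be pictured as a simple dictionary merge, with the API response taking precedence (an illustrative sketch; the metadata field names here are hypothetical):

```python
# Data the node sent in the update request.
sent = {"name": "Updated Name", "status": "active"}

# Hypothetical API response with additional record metadata.
api_response = {"id": "123", "updatedAt": "2024-01-01T00:00:00Z"}

# v0.200.0 onwards: the output merges sent data with the response.
merged = {**sent, **api_response}
```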
If binary data fields are included, the node handles uploading files and stores references to them in the respective fields.
Example output JSON for a single updated row:

```json
{
  "id": "123",
  "name": "Updated Name",
  "status": "active"
}
```
## Dependencies
- Requires an API authentication token or user token credential configured in n8n.
- Depends on the Lake API endpoints, which vary by selected API version.
- For binary file uploads, the node uses multipart/form-data requests to upload files to the Lake storage service.
- The node dynamically loads options for workspaces, bases/projects, and tables via API calls.
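For reference, a multipart/form-data request body has the shape sketched below. This is a minimal stdlib illustration of the format, not the node's actual upload code, and the endpoint and field names are assumptions:

```python
import uuid

def build_multipart(field_name: str, filename: str, data: bytes,
                    content_type: str = "application/octet-stream"):
    """Build a minimal multipart/form-data body (illustrative only)."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field_name}"; filename="{filename}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode() + data + f"\r\n--{boundary}--\r\n".encode()
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return headers, body

# Example: package a small CSV file for upload.
headers, body = build_multipart("file", "report.csv", b"id,name\n1,Ada\n")
```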
## Troubleshooting
**Error: "The row with the ID ... could not be updated. It probably doesn't exist."**
This indicates the specified row ID was not found. Verify the primary key value and ensure the row exists in the target table.

**Empty or missing required parameters**
Ensure all required fields such as Project/Base ID, Table, and Row ID (for update) are correctly set.

**Binary file upload failures**
Confirm that the specified binary property contains valid binary data and that the API credentials have permission to upload files.

**API version mismatch issues**
Selecting the wrong API version may cause endpoint errors. Match the API version setting to your Lake instance's supported version.

**Network or authentication errors**
Check that the API token or user token is valid and has sufficient permissions.
Using the "Continue On Fail" option can help workflows proceed even if some rows fail to update.
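The effect of "Continue On Fail" can be illustrated with a per-row loop that records failures instead of aborting the run (a hypothetical sketch; `update_rows` and `fake_update` are made-up names, not part of the node):

```python
def update_rows(rows, update_fn):
    """Apply update_fn to each row; collect failures instead of raising."""
    results = []
    for row in rows:
        try:
            results.append({"ok": True, "data": update_fn(row)})
        except Exception as exc:
            results.append({"ok": False, "error": str(exc)})
    return results

# Example: the second row lacks a primary key and fails, but the
# first row's update still succeeds.
def fake_update(row):
    if "id" not in row:
        raise ValueError("missing row ID")
    return {**row, "status": "updated"}

results = update_rows([{"id": "1"}, {"name": "no id"}], fake_update)
```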
## Links and References
- Lake API Documentation (generic reference, replace with actual URL)
- n8n Expressions Documentation
- Handling Binary Data in n8n