Overview
This node creates new records (rows) in a specified Google BigQuery table. Use it to automate inserting data into BigQuery from n8n workflows, such as syncing data from other sources, logging events, or batch-uploading processed results.
Practical examples:
- Automatically add new customer signups to a BigQuery analytics table.
- Log error events from various systems into a centralized BigQuery dataset.
- Batch import CSV or API data into BigQuery for further analysis.
Properties
| Name | Type | Meaning |
|---|---|---|
| Authentication | options | Selects the authentication method: OAuth2 (Recommended) or Service Account. Determines which credentials are used to access Google BigQuery. |
| Project Name or ID | options | The ID of the Google Cloud project where the target BigQuery dataset resides. You can select from a list or specify an ID using an expression. |
| Dataset Name or ID | options | The ID of the BigQuery dataset where the record will be created. Depends on the selected project. |
| Table Name or ID | options | The ID of the BigQuery table where the record will be inserted. Depends on the selected project and dataset. |
| Columns | string | Comma-separated list of property names to use as columns for the new record(s). Only these fields from the input data will be included in the inserted row(s). |
| Options | collection | Additional options for record creation. Ignore Unknown Values (boolean): ignore values that do not match the table schema. Skip Invalid Rows (boolean): skip rows with invalid values and insert the remaining rows. Template Suffix (string): create a new table based on the destination table and insert the rows there. Trace ID (string): unique request ID for debugging (auto-generated if not provided). See the sketch below the table for how these options map to the underlying API call. |
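The sketch below is a rough, non-authoritative illustration of what the Create operation amounts to at the API level, assuming the node streams rows via BigQuery's tabledata.insertAll endpoint (which the tableDataInsertAllResponse output shown in the Output section suggests). It uses the google-cloud-bigquery Python client; the project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # Project Name or ID
table_id = "my-project.my_dataset.signups"       # <project>.<dataset>.<table> (placeholders)

# Only the properties listed under "Columns" are taken from each input item.
columns = ["email", "signup_date", "plan"]
input_items = [
    {"email": "a@example.com", "signup_date": "2024-01-01", "plan": "free", "ignored": 1},
]
rows = [{k: item[k] for k in columns if k in item} for item in input_items]

errors = client.insert_rows_json(
    table_id,
    rows,
    ignore_unknown_values=True,   # Options -> Ignore Unknown Values
    skip_invalid_rows=True,       # Options -> Skip Invalid Rows
    template_suffix=None,         # Options -> Template Suffix
)
if errors:
    print("insertErrors:", errors)
```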
Output
The output is an array containing the response from the Google BigQuery API after attempting to insert the records. Each item in the output corresponds to the result of the insert operation:
```json
[
  {
    "kind": "bigquery#tableDataInsertAllResponse",
    "insertErrors": [
      {
        "index": 0,
        "errors": [
          {
            "reason": "invalid",
            "message": "No such field: foo"
          }
        ]
      }
    ]
  }
]
```
- If the operation succeeds, the output contains the API's response, which may include information about any errors encountered during insertion.
- If an error occurs and "Continue On Fail" is enabled, the output will contain an object like { "error": "Error message" }.
Note: This node does not output binary data.
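If you inspect the node's output downstream (for example in a Code node or an external script), a check for insert errors might look like the following. This is a minimal sketch assuming the response shape shown above; the items variable stands in for the node's output.

```python
# Scan the node's output for insert errors or a "Continue On Fail" error object.
items = [
    {
        "kind": "bigquery#tableDataInsertAllResponse",
        "insertErrors": [
            {"index": 0, "errors": [{"reason": "invalid", "message": "No such field: foo"}]}
        ],
    }
]

for item in items:
    if "error" in item:  # present when "Continue On Fail" is enabled and the request failed
        print("node error:", item["error"])
        continue
    for insert_error in item.get("insertErrors", []):
        for err in insert_error["errors"]:
            print(f"row {insert_error['index']}: {err['reason']} - {err['message']}")
```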
Dependencies
- Google BigQuery API: Requires access to the Google BigQuery service.
- Authentication: Either OAuth2 or Service Account credentials must be configured in n8n.
- n8n Credentials:
  - googleApi (for Service Account)
  - googleBigQueryOAuth2Api (for OAuth2)
Troubleshooting
Common Issues:
- Invalid Credentials: Ensure that the correct authentication method and credentials are set up in n8n.
- Missing Required Fields: Make sure all required properties (Project, Dataset, Table, Columns) are provided.
- Schema Mismatch: If the input data contains fields not present in the table schema, and "Ignore Unknown Values" is not enabled, the API may return errors (see the sketch after this list for one way to check this up front).
- API Quotas/Permissions: The service account or OAuth2 user must have sufficient permissions to insert data into the specified table.
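A pre-flight check for schema mismatches could look like the following. This is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and field names are placeholders.

```python
from google.cloud import bigquery

# Compare an input row's keys against the table schema before inserting,
# to surface "No such field" problems early.
client = bigquery.Client(project="my-project")
table = client.get_table("my-project.my_dataset.signups")
schema_fields = {field.name for field in table.schema}

row = {"email": "a@example.com", "foo": "bar"}
unknown = set(row) - schema_fields
if unknown:
    print("Fields not in the table schema:", unknown)  # these would cause "No such field: ..."
```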
Common Error Messages:
"No such field: ...": A column specified in "Columns" does not exist in the table schema. Check your column names and table structure."Access Denied": The credentials do not have permission to access the specified resource. Verify IAM roles and permissions."Table not found": The specified table does not exist. Double-check the project, dataset, and table IDs.