Actions
- Genie Actions
- Databricks SQL Actions
- Unity Catalog Actions
- Model Serving Actions
- Files Actions
- Vector Search Actions
Overview
This node integrates with the Databricks REST API and provides access to the Unity Catalog resource. The "List Tables" operation retrieves the tables in a specified catalog and schema, which is useful for programmatically exploring or auditing data assets managed by Databricks Unity Catalog.
Practical examples include:
- Automatically fetching available tables before running queries or data processing workflows.
- Building dynamic interfaces or reports that depend on current table metadata.
- Auditing or monitoring data assets in a Databricks environment.
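For orientation, the operation corresponds to the Unity Catalog tables endpoint of the Databricks REST API. The sketch below shows an equivalent direct call; it is illustrative rather than the node's actual implementation, and the host and `DATABRICKS_TOKEN` values are placeholders.

```typescript
// Minimal sketch of the REST call behind "List Tables" (Node.js 18+,
// which ships a global fetch). Host and token values are placeholders.
const host = process.env.DATABRICKS_HOST ?? 'https://my-workspace.cloud.databricks.com';
const token = process.env.DATABRICKS_TOKEN ?? '';

async function listTables(catalog: string, schema: string): Promise<unknown[]> {
  // The node's Catalog and Schema properties become query parameters.
  const url = new URL('/api/2.1/unity-catalog/tables', host);
  url.searchParams.set('catalog_name', catalog);
  url.searchParams.set('schema_name', schema);

  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) {
    throw new Error(`List Tables failed: ${response.status} ${response.statusText}`);
  }
  const body = (await response.json()) as { tables?: unknown[] };
  return body.tables ?? [];
}

listTables('main', 'default').then((tables) => {
  console.log(`Found ${tables.length} tables`);
});
```

Note that the underlying API paginates results via a page token, so very large schemas may require multiple requests.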
Properties
| Name | Meaning |
|---|---|
| Catalog | The name of the Unity Catalog catalog to list tables from. |
| Schema | The name of the schema, inside the specified catalog, whose tables are listed. |
Output
The output contains a JSON field with the list of tables retrieved from the specified catalog and schema. Each item in the list represents a table with its associated metadata as returned by the Databricks Unity Catalog API.
This operation produces no binary output.
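As a rough guide to the shape of each list item, the interface below names a few commonly returned fields of the Unity Catalog table object; treat it as an illustrative subset and consult the Databricks API reference for the authoritative schema.

```typescript
// Illustrative subset of per-table metadata; field names follow the
// Unity Catalog TableInfo object, but the full response carries more.
interface TableInfo {
  name: string;           // table name
  catalog_name: string;   // parent catalog
  schema_name: string;    // parent schema
  full_name?: string;     // "<catalog>.<schema>.<table>"
  table_type?: string;    // e.g. "MANAGED" or "EXTERNAL"
  owner?: string;         // owning principal
  comment?: string;       // optional description
  created_at?: number;    // creation time, epoch milliseconds
}
```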
Dependencies
- Requires an active Databricks account with Unity Catalog enabled.
- Needs an API authentication token (API key or personal access token) configured in n8n credentials to authorize requests.
- The node uses the base URL and authorization header derived from these credentials to communicate with the Databricks REST API.
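To verify that the configured token works before wiring it into a workflow, one option is a quick call to an authenticated endpoint. The sketch below uses the SCIM current-user endpoint, a common choice for this, though any authenticated route would do; host and token are placeholders.

```typescript
// Quick token sanity check against an authenticated endpoint.
const host = process.env.DATABRICKS_HOST ?? 'https://my-workspace.cloud.databricks.com';
const token = process.env.DATABRICKS_TOKEN ?? '';

async function checkToken(): Promise<void> {
  const res = await fetch(`${host}/api/2.0/preview/scim/v2/Me`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  console.log(res.ok ? 'Token accepted' : `Auth check failed: HTTP ${res.status}`);
}

checkToken();
```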
Troubleshooting
Common issues:
- Invalid or missing API token can cause authentication failures.
- Incorrect catalog or schema names will result in empty results or errors.
- Network connectivity problems may prevent reaching the Databricks API endpoint.
Error messages:
- Authentication errors typically indicate invalid or expired tokens; renewing or reconfiguring credentials resolves this.
- "Resource not found" or similar errors suggest incorrect catalog/schema inputs; verify spelling and existence in Databricks.
- Timeout or connection errors require checking network settings and API endpoint accessibility.
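The sketch below maps common HTTP status codes to the issues listed above; it assumes direct access to the raw response, whereas the node surfaces these conditions through its own error messages.

```typescript
// Illustrative mapping of common failure modes to the issues listed above.
async function diagnoseListTables(url: URL, token: string): Promise<unknown> {
  let response: Response;
  try {
    response = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  } catch (err) {
    // fetch throws before any HTTP status exists: DNS, proxy, or firewall.
    throw new Error(`Cannot reach the Databricks endpoint: ${String(err)}`);
  }
  if (response.status === 401 || response.status === 403) {
    throw new Error('Authentication failed: renew or reconfigure the API token.');
  }
  if (response.status === 404) {
    throw new Error('Catalog or schema not found: verify the names in Databricks.');
  }
  if (!response.ok) {
    throw new Error(`Unexpected response: ${response.status} ${response.statusText}`);
  }
  return response.json();
}
```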