Actions
- Genie Actions
- Databricks SQL Actions
- Unity Catalog Actions
- Model Serving Actions
- Files Actions
- Vector Search Actions
Overview
This node integrates with the Databricks REST API and supports operations on the resources listed above, including Unity Catalog. The "Create Function" operation under the Unity Catalog resource creates a new function within a specified catalog and schema in your Databricks workspace.
Common scenarios for this node include automating the management of data governance and metadata by programmatically creating functions in Unity Catalog schemas. For example, a data engineer might use this node to deploy SQL or Python functions as part of a data pipeline setup or to maintain consistent function definitions across environments.
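To make the operation concrete, here is a hedged sketch of the REST call this operation most likely wraps: `POST /api/2.1/unity-catalog/functions` on the Databricks API. The workspace URL, function name, and payload fields are illustrative assumptions rather than values taken from the node's source, and the API may require additional fields beyond those shown.

```typescript
// Illustrative sketch of the Unity Catalog "Create Function" request this
// operation likely wraps. Host, function name, and payload fields are
// assumptions; the API's full set of required fields may be stricter.
async function createFunctionExample(): Promise<unknown> {
  const workspaceUrl = "https://my-workspace.cloud.databricks.com"; // assumed host
  const token = process.env.DATABRICKS_TOKEN ?? "";

  const res = await fetch(`${workspaceUrl}/api/2.1/unity-catalog/functions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      function_info: {
        catalog_name: "main",    // the node's Catalog property
        schema_name: "default",  // the node's Schema property
        name: "add_one",         // illustrative function name
        data_type: "INT",
        routine_body: "SQL",
        routine_definition: "x + 1",
        input_params: {
          parameters: [{ name: "x", type_name: "INT", type_text: "int", position: 0 }],
        },
      },
    }),
  });
  return res.json(); // JSON metadata for the newly created function
}
```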
Properties
| Name | Meaning |
|---|---|
| Catalog | The name of the catalog where the function will be created. |
| Schema | The schema within the catalog where the function will be created. |
These properties are required inputs that specify the target location in Unity Catalog for the new function.
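Inside an n8n node, these properties would typically be read with `getNodeParameter`. The parameter keys below (`catalog`, `schema`) are assumptions for illustration; the node's actual keys may differ.

```typescript
// Hypothetical sketch: reading the Catalog and Schema properties inside an
// n8n execute() context. Parameter keys are assumed, not confirmed from source.
import type { IExecuteFunctions } from 'n8n-workflow';

function readFunctionLocation(
  this: IExecuteFunctions,
  itemIndex: number,
): { catalogName: string; schemaName: string } {
  const catalogName = this.getNodeParameter('catalog', itemIndex) as string;
  const schemaName = this.getNodeParameter('schema', itemIndex) as string;
  return { catalogName, schemaName };
}
```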
Output
The node outputs JSON data representing the result of the function creation request. This typically includes details about the newly created function such as its identifier, name, and metadata returned from the Databricks API.
There is no indication in the node's code that this operation produces binary output; the response is JSON only.
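For orientation, the output may resemble the public `FunctionInfo` object that the Unity Catalog API returns. The fields below are illustrative and should be verified against the node's actual output.

```typescript
// Illustrative shape of the JSON a successful Create Function call may emit,
// based on the public FunctionInfo object. Field names are assumptions.
interface CreateFunctionOutput {
  function_id?: string;  // identifier assigned by Unity Catalog
  full_name?: string;    // e.g. "main.default.add_one"
  name?: string;
  catalog_name?: string;
  schema_name?: string;
  created_at?: number;   // creation timestamp
  created_by?: string;
}
```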
Dependencies
- Requires an active Databricks account with appropriate permissions to access Unity Catalog.
- Needs an API authentication token configured in n8n credentials to authorize requests to the Databricks API.
- The node uses the base URL and authorization token from these credentials to communicate with the Databricks service.
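A quick way to confirm that the configured base URL and token work is a cheap read-only call; listing Unity Catalog catalogs is one option. This is a minimal sketch, not part of the node itself.

```typescript
// Minimal sketch: verify that a Databricks base URL and token can authenticate
// by listing Unity Catalog catalogs (a read-only endpoint).
async function checkDatabricksCredentials(baseUrl: string, token: string): Promise<boolean> {
  const res = await fetch(`${baseUrl}/api/2.1/unity-catalog/catalogs`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  return res.ok; // 200 means the token authenticated and can read Unity Catalog
}
```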
Troubleshooting
- Authentication errors: If the API token is invalid or expired, the node will fail to authenticate. Ensure the token is current and has sufficient privileges.
- Permission issues: Lack of proper permissions on the specified catalog or schema can cause failures. Verify user roles and access rights in Databricks.
- Invalid catalog or schema names: Providing non-existent or misspelled catalog/schema names will result in errors. Double-check input values.
- API rate limits or network issues: Temporary connectivity problems or hitting API rate limits may cause request failures. Retry after a short delay or check network settings; a backoff sketch follows below.
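For transient failures such as HTTP 429 or brief network outages, a simple exponential backoff usually suffices. The sketch below is illustrative only and does not describe the node's internal behavior.

```typescript
// Retry a request with exponential backoff on HTTP 429, 5xx, or network errors.
async function fetchWithRetry(
  url: string,
  init: RequestInit,
  maxAttempts = 4,
): Promise<Response> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      const res = await fetch(url, init);
      if (res.status !== 429 && res.status < 500) return res; // success or non-retryable
      lastError = new Error(`HTTP ${res.status}`);
    } catch (err) {
      lastError = err; // network-level failure; retry
    }
    await new Promise((r) => setTimeout(r, 2 ** attempt * 500)); // 0.5s, 1s, 2s, ...
  }
  throw lastError;
}
```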