Actions
- Genie Actions
- Databricks SQL Actions
- Unity Catalog Actions
- Model Serving Actions
- Files Actions
- Vector Search Actions
Overview
The node provides integration with the Databricks REST API, supporting operations on several resources, including Unity Catalog. For the Unity Catalog resource, it manages catalog objects such as functions, schemas, and tables.
The "Delete Function" operation under the Unity Catalog resource deletes a function from a specified catalog and schema in Databricks Unity Catalog. This is useful for removing obsolete or unwanted user-defined functions from your data environment.
Practical examples include:
- Removing outdated or deprecated SQL functions from a specific schema.
- Automating cleanup of development or test functions after deployment cycles.
- Managing function lifecycle programmatically as part of CI/CD pipelines interacting with Databricks.
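For reference, this operation is expected to map onto the Unity Catalog REST endpoint `DELETE /api/2.1/unity-catalog/functions/{full_name}`. The sketch below shows an equivalent call using Python's `requests`; the host, token, and fully qualified function name are placeholders, and the exact path and API version should be confirmed against your workspace's Databricks REST API documentation.

```python
import os
import requests

# Placeholder values; replace with your workspace host, token, and function name.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. "https://<workspace>.cloud.databricks.com"
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token
FULL_NAME = "my_catalog.my_schema.my_function"  # three-level name: catalog.schema.function

resp = requests.delete(
    f"{HOST}/api/2.1/unity-catalog/functions/{FULL_NAME}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # a 2xx response indicates the function was deleted
print("Deleted", FULL_NAME)
```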
Properties
| Name | Meaning |
|---|---|
| Catalog | The name of the catalog where the target function resides. |
| Schema | The schema within the catalog that contains the function to be deleted. |
These properties are required inputs to specify the exact location of the function to delete.
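As a sketch of how these values identify the target, the three-level name sent to the API joins catalog, schema, and function name with dots. The function name used here is a hypothetical additional input and is not listed in the table above.

```python
from urllib.parse import quote

def build_function_path(catalog: str, schema: str, function_name: str) -> str:
    """Compose the Unity Catalog three-level name and the REST path for it.

    `function_name` is a hypothetical extra input; the table above only lists
    Catalog and Schema.
    """
    for part, label in ((catalog, "catalog"), (schema, "schema"), (function_name, "function")):
        if not part:
            raise ValueError(f"{label} must not be empty")
    full_name = f"{catalog}.{schema}.{function_name}"
    # URL-encode the name when embedding it in the request path, keeping the dots.
    return f"/api/2.1/unity-catalog/functions/{quote(full_name, safe='.')}"

# Example: build_function_path("main", "analytics", "clean_orders")
# -> "/api/2.1/unity-catalog/functions/main.analytics.clean_orders"
```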
Output
The node outputs JSON data representing the result of the delete operation. Typically, this would include confirmation of deletion or any error messages returned by the Databricks API.
No binary data output is expected for this operation.
Dependencies
- Requires an active Databricks API authentication token (provided via credentials).
- The node communicates with the Databricks REST API at the host URL configured in the credentials.
- Proper permissions on the Databricks workspace to delete functions in Unity Catalog are necessary.
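A quick way to confirm that the token, host URL, and workspace connectivity are in order before running the delete is to call a lightweight read endpoint such as listing catalogs. This is a hedged sketch; verify the endpoint path against your workspace's API documentation.

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

# List catalogs as a low-impact check of credentials and connectivity.
resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/catalogs",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
if resp.status_code == 200:
    names = [c["name"] for c in resp.json().get("catalogs", [])]
    print("Credentials and host look valid; visible catalogs:", names)
else:
    print("Check token, host, and permissions:", resp.status_code, resp.text)
```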
Troubleshooting
Common issues:
- Insufficient permissions to delete functions in the specified catalog/schema.
- Incorrect catalog or schema names leading to "function not found" errors.
- Network or authentication failures due to invalid or expired API tokens.
Error messages:
- Unauthorized or Forbidden: Check API token validity and user permissions.
- Function not found: Verify the catalog and schema names, and ensure the function exists.
- Network errors: Confirm connectivity to the Databricks API endpoint.
Resolving these typically involves verifying credentials, input parameters, and user access rights.
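As an illustration of how these cases surface in practice, the HTTP status code returned by the API usually distinguishes them. The mapping below is a sketch based on common REST conventions rather than an exhaustive list of Databricks error responses.

```python
import requests

def explain_delete_error(resp: requests.Response) -> str:
    """Map common HTTP status codes from the delete call to the issues listed above."""
    if resp.status_code in (401, 403):
        return "Unauthorized/Forbidden: check API token validity and user permissions."
    if resp.status_code == 404:
        return "Function not found: verify the catalog, schema, and function name."
    if resp.status_code >= 500:
        return "Server or network issue: confirm connectivity to the Databricks API endpoint."
    return f"Unexpected response {resp.status_code}: {resp.text}"
```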