Actions
- Genie Actions
- Databricks SQL Actions
- Unity Catalog Actions
- Model Serving Actions
- Files Actions
- Vector Search Actions
Overview
The node provides integration with Databricks, allowing users to interact with various Databricks services through a unified interface. For the Files resource with the Delete File operation, the node deletes a file in the Databricks workspace at the path you specify.
This operation is useful in scenarios where automated workflows need to manage files within Databricks workspaces, such as cleaning up temporary files after processing, removing outdated datasets, or managing storage space programmatically.
Example use cases:
- Automatically delete intermediate data files after a data pipeline completes.
- Remove obsolete configuration or log files from a Databricks workspace directory.
- Manage file lifecycle in Databricks as part of a larger automation workflow.
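Under the hood, this operation corresponds to an authenticated DELETE call against the Databricks REST API. The sketch below shows the general shape of such a request; the endpoint route (`/api/2.0/fs/files`), host, and token are illustrative assumptions, not values taken from the node itself:

```python
import urllib.request


def build_delete_request(host: str, token: str, file_path: str) -> urllib.request.Request:
    """Build an HTTP DELETE request for a file in a Databricks volume.

    Assumes the Files API route /api/2.0/fs/files{path}; check your
    workspace's API docs, as the exact route may differ.
    """
    url = f"{host.rstrip('/')}/api/2.0/fs/files{file_path}"
    return urllib.request.Request(
        url,
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )


# Hypothetical usage (host and token are placeholders):
req = build_delete_request(
    "https://example.cloud.databricks.com",
    "dapiXXXX",  # personal access token (placeholder)
    "/Volumes/my-catalog/my-schema/my-volume/directory/file.txt",
)
# urllib.request.urlopen(req) would send the request; a 2xx status
# would indicate the file was deleted.
```

The node handles this request construction and authentication for you; the sketch only illustrates what a single delete boils down to.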
Properties
| Name | Meaning |
|---|---|
| Path | The full path to the file in the Databricks workspace that you want to delete. Example: /Volumes/my-catalog/my-schema/my-volume/directory/file.txt. This property is required. |
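The path format matters: the example above follows the volume layout `/Volumes/<catalog>/<schema>/<volume>/...`. A small sanity check along those lines might look like this (the rule here is an assumption drawn from the example path, not an exhaustive validation of every path Databricks accepts):

```python
def looks_like_volume_path(path: str) -> bool:
    """Rough sanity check for a Databricks volume file path.

    Expects /Volumes/<catalog>/<schema>/<volume>/<file...> — an
    assumption based on the documented example, not a complete spec.
    """
    parts = path.split("/")
    # A leading slash makes parts[0] == ""; we need at least
    # ["", "Volumes", catalog, schema, volume, file-or-dir...].
    return len(parts) >= 6 and parts[1] == "Volumes" and all(parts[2:5])
```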
Output
The node outputs JSON data representing the result of the delete operation, typically a confirmation of deletion or any error message returned by the Databricks API. There is no binary output for this operation.
Dependencies
- Requires an active connection to a Databricks workspace.
- Needs an API authentication token (API key or bearer token) configured in the node credentials to authorize requests.
- The base URL for the Databricks API must be set correctly in the credentials.
Troubleshooting
- File Not Found Error: If the specified path does not exist, the node will likely return an error indicating the file could not be found. Verify the exact path and ensure the file exists.
- Permission Denied: Insufficient permissions to delete the file may cause authorization errors. Ensure the API token has appropriate rights.
- Invalid Path Format: The path must follow the Databricks workspace format. Incorrect formatting can lead to errors.
- Authentication Failures: Check that the API token and host URL are correctly configured in the node credentials.
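The issues above typically surface as HTTP status codes from the Databricks API. A hedged mapping of conventional HTTP semantics to the troubleshooting cases listed (these codes are standard HTTP meanings, not guaranteed node output):

```python
# Conventional HTTP status codes mapped to the troubleshooting cases above.
# These reflect standard HTTP semantics, not confirmed node behavior.
TROUBLESHOOTING_HINTS = {
    400: "Invalid request: check that the path follows the expected format.",
    401: "Authentication failure: verify the API token and host URL in the credentials.",
    403: "Permission denied: the token lacks rights to delete this file.",
    404: "File not found: verify the exact path and that the file exists.",
}


def hint_for_status(status: int) -> str:
    """Return a troubleshooting hint for a given HTTP status code."""
    return TROUBLESHOOTING_HINTS.get(
        status, f"Unexpected status {status}: inspect the API error response body."
    )
```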