Databricks Writer Parameters

Version 2.1.0 of this package added Database Connections support for Databricks Credentials and Cloud Upload Credentials.

To define a new connection from the Connection parameter in a Databricks format, select Add Database Connection and scroll to Databricks.

See database-specific parameters below, as well as the section Adding a Database Connection in a workspace in Using Database Connections.

The new connection can be made visible only to the current user, or can be shared among multiple users. To select an existing, previously defined connection, see the section Reusing a Database Connection in Using Database Connections.

Databricks Credentials

Server Hostname: The URL of the Databricks workspace. This will take the format https://<workspace_id>.cloud.databricks.com/ or https://adb-<id>.azuredatabricks.net/.

Cluster ID: The ID of the cluster to run Databricks commands with. This cluster should be configured with access to the cloud storage location specified in the writer parameters.
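As a quick sanity check on the Server Hostname value, the two URL formats described above can be matched with a small helper. This is an illustrative sketch, not part of the writer; the function name and the exact regular expression are assumptions based only on the formats listed here.

```python
import re

# Hypothetical helper: accepts the two hostname formats mentioned above,
# i.e. https://<workspace_id>.cloud.databricks.com/ (AWS) and
# https://adb-<id>.azuredatabricks.net/ (Azure), with an optional trailing slash.
WORKSPACE_URL = re.compile(
    r"^https://("
    r"[\w.-]+\.cloud\.databricks\.com"   # AWS-hosted workspace
    r"|adb-[\d.]+\.azuredatabricks\.net"  # Azure-hosted workspace
    r")/?$"
)

def is_valid_workspace_url(url: str) -> bool:
    """Return True if the URL matches a known Databricks workspace format."""
    return WORKSPACE_URL.match(url) is not None
```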

Authentication Method: The method used for authentication.

Personal Access Token (default): Allows you to connect using a personal access token from Databricks.

Personal Access Token: Used only when the Authentication Method parameter is set to Personal Access Token. Specifies the personal access token used to connect to the specified cluster.
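Outside of FME, a personal access token is sent to the Databricks REST API as a standard Bearer credential. The sketch below builds (but does not send) a request to the Clusters API `GET /api/2.0/clusters/get` endpoint; the helper name is an assumption made for illustration.

```python
from urllib.parse import urlencode
from urllib.request import Request

def cluster_get_request(server_hostname: str, cluster_id: str, token: str) -> Request:
    """Build (but do not send) a Clusters API request authenticated with a PAT."""
    url = f"{server_hostname.rstrip('/')}/api/2.0/clusters/get?" + urlencode(
        {"cluster_id": cluster_id}
    )
    # Personal access tokens are passed in a standard Bearer authorization header.
    return Request(url, headers={"Authorization": f"Bearer {token}"})
```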

OAuth: Uses an OAuth web connection to a service principal that accesses the Databricks service.

Adding a Databricks OAuth Connection

Used only when the OAuth authentication method is selected and you click Add Web Connection. Specifies the OAuth web connection used to connect to the specified cluster:

  • Token Endpoint URL – The URL used to request an OAuth connection token.
  • Client ID – The client ID for the OAuth secret for the service principal.
  • Client Secret – The value of the OAuth secret for the service principal.
  • Scope – The scope of the requested OAuth access token. For Microsoft Azure-hosted Databricks clusters, the scope value must be set to 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default. See Get a Microsoft Entra ID access token with the Microsoft identity platform REST API from the Azure Databricks documentation website for more information on the required scope value.
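The four fields above map onto a standard OAuth 2.0 client-credentials grant: the Client ID, Client Secret, and Scope are form-encoded and POSTed to the Token Endpoint URL. A minimal sketch of the request body, assuming nothing beyond the standard grant (the function name is hypothetical):

```python
from urllib.parse import urlencode

# Required Scope value for Azure-hosted Databricks, as noted above.
AZURE_DATABRICKS_SCOPE = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"

def token_request_body(client_id: str, client_secret: str, scope: str) -> str:
    """Form-encoded body for a standard OAuth 2.0 client-credentials grant."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })
```

This body would be POSTed to the Token Endpoint URL with a `Content-Type: application/x-www-form-urlencoded` header; the access token in the response is then used to authenticate against the cluster.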

Catalog: The catalog to write to. Each writer can only write to a single catalog. Click the [...] button to see a list of accessible catalogs in the workspace.
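Because the catalog is fixed per writer, every destination table resolves to a three-level Unity Catalog name of the form catalog.schema.table. A sketch of that resolution, with backtick quoting as used for Databricks SQL identifiers (the helper itself is hypothetical):

```python
def qualified_table(catalog: str, schema: str, table: str) -> str:
    """Three-level Unity Catalog table name, backtick-quoted for Databricks SQL."""
    def q(ident: str) -> str:
        # Backticks inside an identifier are escaped by doubling them.
        return "`" + ident.replace("`", "``") + "`"
    return f"{q(catalog)}.{q(schema)}.{q(table)}"
```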

Cloud Upload Credentials

Storage Type: Amazon S3 | Microsoft Azure Data Lake Gen 2 | Databricks Unity Catalog Volume

Select the form of cloud storage to use as a staging area. The selected Databricks cluster should be configured with its own access to this location. The cloud storage credentials provided here are used strictly by FME to upload data to the cloud storage location; they are not passed through in any Databricks commands.
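When Databricks Unity Catalog Volume is selected, staged files live under the standard /Volumes/<catalog>/<schema>/<volume>/ path convention. A small sketch of how such a staging path is assembled (the function name and filename are assumptions for illustration):

```python
from posixpath import join

def volume_staging_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Path of a staged file inside a Unity Catalog Volume (/Volumes/... convention)."""
    return join("/Volumes", catalog, schema, volume, filename)
```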

Advanced