Connect to Databricks Data as an External Source in Dremio
Use the CData JDBC Driver to connect to Databricks as an External Source in Dremio.
The CData JDBC Driver for Databricks implements JDBC Standards and allows various applications, including Dremio, to work with live Databricks data. Dremio is a data lakehouse platform designed to empower self-service, interactive analytics on the data lake. With the CData JDBC driver, you can include live Databricks data as a part of your enterprise data lake. This article describes how to connect to Databricks data from Dremio as an External Source.
The CData JDBC Driver enables high-speed access to live Databricks data in Dremio. Once you install the driver and authenticate with Databricks, you gain immediate access to Databricks data within your data lake. By surfacing Databricks data using native data types and handling complex filters, aggregations, and other operations automatically, the CData JDBC Driver grants seamless access to Databricks data.
About Databricks Data Integration
Accessing and integrating live data from Databricks has never been easier with CData. Customers rely on CData connectivity to:
- Access all versions of Databricks, from Runtime Versions 9.1 - 13.X to both the Pro and Classic Databricks SQL versions.
- Leave Databricks in their preferred environment thanks to compatibility with any hosting solution.
- Securely authenticate in a variety of ways, including personal access token, Azure Service Principal, and Azure AD.
- Upload data to Databricks using Databricks File System, Azure Blob Storage, and AWS S3 Storage.
While many customers use CData's solutions to migrate data from different systems into their Databricks data lakehouse, others use our live connectivity solutions to federate connectivity between their databases and Databricks. These customers use SQL Server Linked Servers or PolyBase to get live access to Databricks from within their existing RDBMSs.
Read more about common Databricks use-cases and how CData's solutions help solve data problems in our blog: What is Databricks Used For? 6 Use Cases.
Getting Started
Prerequisites
This article assumes you are using Docker to run Dremio. You can create a Docker container with the Dremio service using a command similar to the following:
docker run -d --name dremio -p 9047:9047 -p 31010:31010 dremio/dremio-oss
Here, dremio is the name of the container, 9047 is the container's port for the Dremio web interface, and 31010 is the port that maps to the Dremio query service. dremio/dremio-oss specifies the image to use.
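Before moving on, it can help to confirm that the container started and that the web interface is reachable. A quick check, assuming the container name and port mapping above, might look like this:
docker ps --filter "name=dremio"
docker logs dremio
# the Dremio web interface should now be available at http://localhost:9047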
Build the ARP Connector
To use the CData JDBC Driver in Dremio, you need to build an Advanced Relational Pushdown (ARP) Connector. You can view the source code for the Connector on GitHub or download the ZIP file (GitHub.com) directly. Once you copy or extract the files, run the following command from the root directory of the connector (the directory containing the pom.xml file) to build the connector.
mvn clean install
NOTE: The CData ARP Connectors are built to be compiled with Java 11. Be sure to install Java 11 and use the correct version. You can update your Java version using a command similar to the following:
sudo update-alternatives --config java
Once the JAR file for the connector is built (in the target directory), you are ready to copy the ARP connector and JDBC Driver to your Dremio instance.
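Before copying anything, you can confirm the active Java version and locate the build output. The exact JAR file name depends on the connector and Dremio version, so the pattern below is only an illustration:
java -version
ls target/dremio-databricks-plugin-*.jar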
Installing the Connector and JDBC Driver
Install the ARP Connector to %DREMIO_HOME%/jars/ and the JDBC Driver for Databricks to %DREMIO_HOME%/jars/3rdparty. You can use commands similar to the following:
ARP Connector
docker cp PATH\TO\dremio-databricks-plugin-{DREMIO_VERSION}.jar dremio_container_name:/opt/dremio/jars/
JDBC Driver for Databricks
docker cp PATH\TO\cdata.jdbc.databricks.jar dremio_container_name:/opt/dremio/jars/3rdparty/
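After copying both JARs, Dremio typically needs to be restarted to load them. Assuming the container is named dremio as in the example above, the final steps might look like this:
docker exec dremio ls /opt/dremio/jars/3rdparty/
# confirm cdata.jdbc.databricks.jar is listed, then restart the container
docker restart dremio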
Connecting to Databricks
Databricks will now appear as an External Source option in Dremio. The ARP Connector you built uses a JDBC URL to connect to Databricks data. The JDBC Driver has a built-in connection string designer that you can use (see below).

Built-in Connection String Designer
For assistance in constructing the JDBC URL, use the connection string designer built into the Databricks JDBC Driver. Double-click the JAR file or execute it from the command line.
java -jar cdata.jdbc.databricks.jar
Fill in the connection properties and copy the connection string to the clipboard.
To connect to a Databricks cluster, set the properties as described below.
Note: The needed values can be found in your Databricks instance by navigating to Clusters, selecting the desired cluster, and selecting the JDBC/ODBC tab under Advanced Options.
- Server: Set to the Server Hostname of your Databricks cluster.
- HTTPPath: Set to the HTTP Path of your Databricks cluster.
- Token: Set to your personal access token (this value can be obtained by navigating to the User Settings page of your Databricks instance and selecting the Access Tokens tab).
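With those properties filled in, the designer produces a connection string along the following lines. The values shown here are placeholders only; substitute the hostname, HTTP path, and token from your own cluster:
jdbc:databricks:Server=myserver.cloud.databricks.com;HTTPPath=MyHTTPPath;Token=MyPersonalAccessToken;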

NOTE: To use the JDBC Driver in Dremio, you will need a license (full or trial) and a Runtime Key (RTK). For more information on obtaining this license (or a trial), contact our sales team.
Add the Runtime Key (RTK) to the JDBC URL. You will end up with a JDBC URL similar to the following:
jdbc:databricks:RTK=5246...;Server=127.0.0.1;Port=443;TransportMode=HTTP;HTTPPath=MyHTTPPath;UseSSL=True;User=MyUser;Password=MyPassword;
Access Databricks as an External Source
To add Databricks as an External Source, click to add a new source and select Databricks. Copy the JDBC URL and paste it into the New Databricks Source wizard.

Save the connection and you are ready to query live Databricks data in Dremio, easily incorporating Databricks data into your data lake.
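For example, once the source is saved you can run queries against it from the Dremio SQL editor or over Dremio's REST API. The sketch below assumes the defaults used earlier (web interface on port 9047) and a hypothetical table path Databricks.default.Customers; adjust the credentials, source, schema, and table names to match your environment (requires curl and jq):
# log in and capture a Dremio auth token
TOKEN=$(curl -s -X POST http://localhost:9047/apiv2/login \
  -H "Content-Type: application/json" \
  -d '{"userName":"myuser","password":"mypassword"}' | jq -r .token)
# submit a SQL query against the new Databricks source; the response contains a job id
curl -s -X POST http://localhost:9047/api/v3/sql \
  -H "Authorization: _dremio$TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"sql":"SELECT * FROM Databricks.default.Customers LIMIT 10"}'
# results can then be fetched from /api/v3/job/{id}/results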

More Information & Free Trial
Using the CData JDBC Driver for Databricks in Dremio, you can incorporate live Databricks data into your data lake. Check out our CData JDBC Driver for Databricks page for more information about connecting to Databricks. Download a free, 30-day trial of the CData JDBC Driver for Databricks and get started today.