How to connect PolyBase to Azure Data Lake Storage
Use CData drivers and PolyBase to create an external data source in SQL Server 2019 with access to live Azure Data Lake Storage data.
PolyBase for SQL Server allows you to query external data by using the same Transact-SQL syntax used to query a database table. When paired with the CData ODBC Driver for Azure Data Lake Storage, you get access to your Azure Data Lake Storage data directly alongside your SQL Server data. This article describes creating an external data source and external tables to grant access to live Azure Data Lake Storage data using T-SQL queries.
NOTE: PolyBase is only available on SQL Server 2019 and above, and only for Standard SQL Server.
The CData ODBC drivers offer unmatched performance for interacting with live Azure Data Lake Storage data using PolyBase due to optimized data processing built into the driver. When you issue complex SQL queries from SQL Server to Azure Data Lake Storage, the driver pushes down supported SQL operations, like filters and aggregations, directly to Azure Data Lake Storage and utilizes the embedded SQL engine to process unsupported operations (often SQL functions and JOIN operations) client-side. And with PolyBase, you can also join SQL Server data with Azure Data Lake Storage data, using a single query to pull data from distributed sources.
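For example, once the Resources external table from later in this article is in place, a filtered aggregate like the following hypothetical query (the path filter is a placeholder) lets the driver push the WHERE filter and the GROUP BY aggregation to Azure Data Lake Storage instead of processing them locally:
SELECT Permission, COUNT(*) AS ResourceCount
FROM Resources
WHERE FullPath LIKE '/raw/%'
GROUP BY Permission;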
Connect to Azure Data Lake Storage
If you have not already, first specify connection properties in an ODBC DSN (data source name). This is the last step of the driver installation. You can use the Microsoft ODBC Data Source Administrator to create and configure ODBC DSNs. To create an external data source in SQL Server using PolyBase, configure a System DSN (CData Azure Data Lake Storage Sys is created automatically).
Authenticating to a Gen 1 DataLakeStore Account
Gen 1 uses OAuth 2.0 in Azure AD for authentication.
For this, an Azure Active Directory web application is required; if you do not already have one, create and register it in the Azure portal before continuing.
To authenticate against a Gen 1 DataLakeStore account, the following properties are required (a sample connection string follows this list):
- Schema: Set this to ADLSGen1.
- Account: Set this to the name of the account.
- OAuthClientId: Set this to the application ID of the app you created.
- OAuthClientSecret: Set this to the key generated for the app you created.
- TenantId: Set this to the tenant ID. See the TenantId property in the driver's Help documentation for more information on how to acquire this.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
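For reference, here is how these Gen 1 properties might look assembled into an ODBC connection string. This is a hypothetical sketch with placeholder values; in this walkthrough, you enter the same properties in the DSN configuration instead:
Schema=ADLSGen1;Account=myAccount;OAuthClientId=myApplicationId;OAuthClientSecret=myKey;TenantId=myTenantId;Directory=myDirectory;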
Authenticating to a Gen 2 DataLakeStore Account
To authenticate against a Gen 2 DataLakeStore account, the following properties are required (a sample connection string follows this list):
- Schema: Set this to ADLSGen2.
- Account: Set this to the name of the account.
- FileSystem: Set this to the file system which will be used for this account.
- AccessKey: Set this to the access key used to authenticate calls to the API. See the AccessKey property in the driver's Help documentation for more information on how to acquire this.
- Directory: Set this to the path which will be used to store the replicated file. If not specified, the root directory will be used.
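Likewise, a Gen 2 configuration might look like the following hypothetical connection string, again with placeholder values:
Schema=ADLSGen2;Account=myAccount;FileSystem=myFileSystem;AccessKey=myAccessKey;Directory=myDirectory;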
Click "Test Connection" to ensure that the DSN is connected to Azure Data Lake Storage properly. Navigate to the Tables tab to review the table definitions for Azure Data Lake Storage.
Create an External Data Source for Azure Data Lake Storage Data
After configuring the connection, you need to create a master encryption key and a database scoped credential for the external data source.
Creating a Master Encryption Key
Execute the following SQL command to create a new database master key to encrypt the credentials for the external data source.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'password';
Creating a Database Scoped Credential
Execute the following SQL command to create credentials for the external data source connected to Azure Data Lake Storage data.
NOTE: Since Azure Data Lake Storage does not require a User or Password to authenticate, you may use whatever values you wish for IDENTITY and SECRET.
CREATE DATABASE SCOPED CREDENTIAL adls_creds WITH IDENTITY = 'username', SECRET = 'password';
Create an External Data Source for Azure Data Lake Storage
Execute a CREATE EXTERNAL DATA SOURCE SQL command to create an external data source for Azure Data Lake Storage with PolyBase:
- Set the LOCATION parameter, using the DSN and credentials configured earlier.
For Azure Data Lake Storage, set SERVERNAME in the LOCATION parameter ('odbc://SERVERNAME[:PORT]') to the URL or address for your server (e.g., 'localhost' or '127.0.0.1' for local servers; the remote URL for remote servers) and leave PORT empty. PUSHDOWN is set to ON by default, meaning the ODBC Driver can leverage server-side processing for complex queries.
CREATE EXTERNAL DATA SOURCE cdata_adls_source
WITH (
  LOCATION = 'odbc://SERVER_URL',
  CONNECTION_OPTIONS = 'DSN=CData Azure Data Lake Storage Sys',
  -- PUSHDOWN = ON | OFF,
  CREDENTIAL = adls_creds
);
Create External Tables for Azure Data Lake Storage
After creating the external data source, use CREATE EXTERNAL TABLE statements to link to Azure Data Lake Storage data from your SQL Server instance. The table column definitions must match those exposed by the CData ODBC Driver for Azure Data Lake Storage. You can refer to the Tables tab of the DSN Configuration Wizard to see the table definition.

Sample CREATE TABLE Statement
The statement to create an external table based on the Azure Data Lake Storage Resources table would look similar to the following:
CREATE EXTERNAL TABLE Resources (
  FullPath [nvarchar](255) NULL,
  Permission [nvarchar](255) NULL,
  ...
) WITH (
  LOCATION = 'Resources',
  DATA_SOURCE = cdata_adls_source
);
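With the external table in place, Azure Data Lake Storage data can be joined with local SQL Server tables in a single query. A minimal sketch, assuming a hypothetical local table dbo.FileAudit with a FullPath column:
SELECT audit.FullPath, files.Permission
FROM dbo.FileAudit audit
INNER JOIN Resources files
  ON files.FullPath = audit.FullPath;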
Having created external tables for Azure Data Lake Storage in your SQL Server instance, you are now able to query local and remote data simultaneously. Thanks to built-in query processing in the CData ODBC Driver, you know that as much query processing as possible is being pushed to Azure Data Lake Storage, freeing up local resources and computing power. Download a free, 30-day trial of the ODBC Driver for Azure Data Lake Storage and start working with live Azure Data Lake Storage data alongside your SQL Server data today.