How to run a stored procedure in Azure Data Warehouse using Databricks?


Use Case

We have a stored procedure written in Azure Data Warehouse, and we want to execute it from Databricks using PySpark.

We will use the ODBC driver to connect to the data warehouse. So let’s start with the demo.

Step 1: Install required libraries

Install the “pyodbc” library on the cluster you are working with, as shown below.
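As a minimal sketch, pyodbc can be installed for the current notebook session with %pip (or attached to the cluster as a PyPI library via the Libraries UI). Note that the Microsoft ODBC driver itself (for example msodbcsql17) must also be present on the cluster nodes, typically via a cluster init script.

```
# Databricks notebook cell: install pyodbc for this session.
# Alternatively, attach pyodbc as a PyPI library on the cluster.
%pip install pyodbc
```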

Step 2: Connect to Azure Data Warehouse using ODBC
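Here is a minimal sketch of the connection. The server, database, and credential values below are placeholders; in a real notebook, prefer pulling secrets from a Databricks secret scope rather than hard-coding them.

```python
import pyodbc

# Placeholder values -- replace with your own server, database, and credentials.
# Prefer dbutils.secrets.get(...) over hard-coded passwords in real notebooks.
server   = "yourserver.database.windows.net"
database = "yourdatawarehouse"
username = "your_sql_user"
password = "your_sql_password"

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER=tcp:{server},1433;"
    f"DATABASE={database};"
    f"UID={username};"
    f"PWD={password};"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

# autocommit=True so EXEC statements are committed without an explicit transaction.
conn = pyodbc.connect(conn_str, autocommit=True)
```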

Step 3: Execute the stored procedure from Databricks
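With the connection open, the procedure can be executed through a cursor. The procedure name dbo.usp_load_daily_sales and its parameter below are hypothetical; substitute your own.

```python
# Hypothetical stored procedure name and parameter -- replace with your own.
cursor = conn.cursor()
cursor.execute("EXEC dbo.usp_load_daily_sales @run_date = ?", "2023-01-01")

# If the procedure returns a result set, cursor.description is populated.
if cursor.description is not None:
    for row in cursor.fetchall():
        print(row)

cursor.close()
conn.close()
```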

And that is what we wanted to achieve: the stored procedure runs in the data warehouse, triggered directly from a Databricks notebook.

I hope this helps you start your journey with Databricks and Azure Data Warehouse connectivity over the ODBC driver. I’ll discuss more strategies in a future blog post!
