How To Send Parameters From ADF to Databricks and Receive Output From Databricks

Tech Zero
4 min read · Sep 6, 2021

This article explains how to send parameters from Azure Data Factory (ADF) to Databricks and receive output from Databricks back in ADF.

Quite often as a Data Engineer, I need to use Databricks as part of my Azure Data Factory pipeline. This involves configuring the pipeline to send parameters to Databricks and, in turn, receive output from Databricks. This article shows you a quick and easy way to do it through an example.

Use Case:
A country parameter with the value Canada needs to be sent from ADF to Databricks. Databricks will accept the parameter and send back to ADF an output called continent with the value North America.

Requirement:
The ADF pipeline should be able to send the parameter to Databricks and, in turn, receive the output from Databricks.

Assumption:
A Databricks Notebook is already available.
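For reference, a minimal sketch of what such a notebook might contain, assuming the parameter arrives as a notebook widget named country and the result is returned with dbutils.notebook.exit (the country-to-continent mapping here is purely illustrative):

```python
import json

# Read the 'country' parameter passed in from ADF as a notebook widget.
dbutils.widgets.text("country", "")
country = dbutils.widgets.get("country")

# Illustrative lookup; extend as needed for your own data.
continent_by_country = {"Canada": "North America"}
continent = continent_by_country.get(country, "Unknown")

# Return the result to ADF; exit accepts a single string, so JSON is a handy format.
dbutils.notebook.exit(json.dumps({"continent": continent}))
```

On the ADF side, this exit value later surfaces in the Databricks Notebook activity’s output (under runOutput), which is how the continent variable can eventually be populated.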

Step 1: Initialize a New Parameter and Variable in ADF

Open the ADF canvas and create a new pipeline. In the new pipeline, create a parameter called ‘country’ and a variable called ‘continent’. Your pipeline should look like this:

[Screenshot: pipeline canvas showing the ‘country’ parameter and the ‘continent’ variable]
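If you prefer to define the same thing programmatically rather than through the canvas, a sketch using the azure-mgmt-datafactory Python SDK might look like the following (the subscription, resource group, factory, and pipeline names are placeholders, and the authentication setup will vary by environment):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ParameterSpecification,
    PipelineResource,
    VariableSpecification,
)

# Placeholder identifiers; substitute your own.
adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

pipeline = PipelineResource(
    # Pipeline parameter 'country' that ADF will pass to Databricks.
    parameters={"country": ParameterSpecification(type="String", default_value="Canada")},
    # Pipeline variable 'continent' that will hold the value returned by Databricks.
    variables={"continent": VariableSpecification(type="String")},
    activities=[],  # the Databricks Notebook activity gets wired up in later steps
)

adf_client.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "SendReceiveDemoPipeline", pipeline
)
```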
