
Azure Data Factory: Handle Dynamic Column Names With Rule-Based Mapping In Data Flow

This article explains how to handle dynamic column names using an ADF Mapping Data Flow

Tech Zero
4 min read · Jul 4, 2021

As a Data Engineer, working with source file datasets often comes with a painful reality: changing column names. So how can you ensure that the mapping between source and destination stays intact even when source column names keep changing? The answer lies in Mapping Data Flow.

Let’s say you have a source dataset in the form of a CSV file. The source dataset looks something like the one below:

(Sample CSV dataset)
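
For illustration, here is a hypothetical version of that file; the headers and values below are placeholders standing in for the original screenshot, built from the columns described in the use case:

EmpName,Office,Country,EmpNo,Dept
John Smith,London,United Kingdom,1001,Finance
Priya Rao,Berlin,Germany,1002,Engineering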

Use Case:
You are tasked with ingesting data from a file like the one above into a SQL table. The file contains employee names, the offices they work from, the countries those offices are located in, employee numbers, and their respective departments.

Problem:
Some of the columns in the dataset continually change their names (for example, EmpNo in one delivery might arrive as Employee_Number in the next), so the fixed mapping between source and destination keeps failing. You need to come up with a dynamic mapping technique.

Step 1: Create a new Data Flow in your Azure Data Factory
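
To sketch where this is headed: inside the Data Flow, a Select transformation with rule-based mapping matches incoming columns by conditions on their name or type instead of by fixed headers. The snippet below is a minimal Data Flow Script sketch, not this article’s exact solution; the stream names (EmployeeSource, RuleBasedSelect), the substrings being matched, and the output column names are all assumptions for illustration.

source(allowSchemaDrift: true,
    validateSchema: false) ~> EmployeeSource
EmployeeSource select(mapColumn(
        each(match(instr(lower(name), 'name') > 0), 'EmployeeName' = $$),
        each(match(instr(lower(name), 'office') > 0), 'Office' = $$),
        each(match(instr(lower(name), 'country') > 0), 'Country' = $$),
        each(match(instr(lower(name), 'emp') > 0 && type == 'integer'), 'EmployeeNumber' = $$),
        each(match(instr(lower(name), 'dep') > 0), 'Department' = $$)
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> RuleBasedSelect

Because each rule keys on a pattern and a type rather than an exact header, a file that shows up with EmpName instead of Employee Name still lands in the same destination column, and the sink mapping no longer breaks when the source columns are renamed.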

Written by Tech Zero

Product Manager, Data & Governance | Azure, Databricks and Snowflake stack | Here to share my knowledge with everyone
