Migrating analytics workloads to the public cloud has been one of the most significant big data trends in recent years—and it shows no sign of slowing down:

  • In 2021, the public cloud infrastructure market will grow by 35 percent.
  • The public cloud service market will reach $623 billion worldwide by 2023.
  • Half of all enterprises spend more than $1.2 million on cloud services every year.

Of course, before you can process data in the public cloud, that data has to get there in the first place via data migration. Enterprises need a robust, mature data migration solution to deal with challenges such as data silos, increasing data volume and complexity, and security and compliance issues like GDPR, CCPA, and HIPAA.

Azure Data Factory is a data migration service from the Microsoft Azure cloud computing platform that helps Azure users build ETL pipelines for their enterprise data. But with multiple options and configurations available for Azure Data Factory, which approach is right for your business?

Understanding Microsoft ETL with Azure Data Factory can be tough. In this guide, we'll explain the benefits of this approach for handling analytics workloads in the public cloud, then introduce an alternative that streamlines the entire process.

Integrate.io is the No.1 ETL solution for data-driven teams. Extract, transform and load data without the fuss. Schedule a demo today to discover more. 

What is ETL?

ETL (which stands for extract, transform, load) is one of the most commonly used paradigms for the data integration process.

The three steps of ETL are:

  • Extract: Extracting data from a source location, such as a file or database.
  • Transform: Transforming the data from its source format to fit the target location's schema.
  • Load: Finally, loading the data into a target location such as a data warehouse for analytics and reporting.

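To make these three steps concrete, here is a minimal sketch in plain Python, independent of any particular ETL tool. The file, table, and column names are purely hypothetical.

```python
import csv
import sqlite3

# Extract: read raw records from a source file (hypothetical "orders.csv").
with open("orders.csv", newline="") as f:
    raw_rows = list(csv.DictReader(f))

# Transform: reshape each record to fit the target schema,
# dropping rows with a missing amount along the way.
transformed = [
    (row["order_id"], row["customer_email"].lower(), float(row["amount_usd"]))
    for row in raw_rows
    if row["amount_usd"]
]

# Load: write the cleaned records into the target table.
# sqlite3 stands in here for a real data warehouse connection.
conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS orders "
    "(order_id TEXT, customer_email TEXT, amount_usd REAL)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)
conn.commit()
conn.close()
```

An ETL tool automates, schedules, and monitors exactly this kind of flow across many sources at once, so you don't have to hand-maintain scripts like this one.
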
The data you need for your analytics workloads may exist in many disparate forms and locations, both internal and external to your organization. For maximum efficiency, this data needs to be stored in a centralized repository, such as a data warehouse, where it remains easy to access and analyze. ETL is a crucial part of the data migration process, making it easier and more efficient to integrate many data sources.

On a final note: ETL differs from ELT, another data integration paradigm. As you may be able to tell from the acronyms, ETL and ELT differ in the order in which they perform the "load" and "transform" stages. ELT transforms the data once loaded in the data warehouse. ELT allows data professionals to pick the data they want to transform, saving time when ingesting large quantities of unstructured information. 

What Are the Benefits of ETL in 2021?

As more businesses require real-time data intelligence, ETL has never been more valuable. Here are some benefits of incorporating ETL into your organization:

  • Speed up the time it takes to build data pipelines. ETL automates many of the processes associated with data transformation, so you can concentrate on other tasks.
  • ETL suits small- and medium-sized companies that lack a large data engineering team. The best ETL tools require no code, allowing users to create dynamic pipelines without much effort.
  • Reduce human error and other problems that arise from manual pipeline-building by automating the ETL process. ETL tools can improve compliance with data governance frameworks and prevent expensive penalties for data protection infractions. 
  • ETL tools let you validate data before moving it to another destination. This way, you can remove unwanted data, duplicated data sets, or data that doesn't comply with the law. 

What is Azure Data Factory?

Azure Data Factory is a fully managed data migration and integration service that enables Microsoft Azure users to bring together data from a variety of sources in the Azure public cloud. Companies like Adobe and Concentra use the tool to combine data from various locations and move it to a virtual environment. 

The Azure Data Factory service allows users to integrate both on-premises data in Microsoft SQL Server and cloud data in Azure SQL Database, Azure Blob Storage, and Azure Table Storage.

Once Azure Data Factory collects the relevant data, it can quickly be processed by tools like Azure HDInsight (Apache Hive and Apache Pig). Azure Data Factory automates and orchestrates the entire data integration process from end to end, giving users a single pane of glass over their ETL data pipelines.
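
In practice, Data Factory pipelines are usually authored in the visual authoring UI or as JSON definitions. For a rough idea of what that orchestration looks like programmatically, here is a sketch using the azure-mgmt-datafactory Python SDK to register a pipeline with a single Copy activity from Blob Storage to Azure SQL Database. The subscription, resource group, factory, and dataset names are placeholders, and exact model signatures vary by SDK version.

```python
# Sketch only: assumes the data factory and both datasets already exist.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSource, AzureSqlSink, CopyActivity, DatasetReference, PipelineResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One Copy activity: read from a Blob Storage dataset, write to an Azure SQL dataset.
copy_orders = CopyActivity(
    name="CopyOrdersBlobToSql",
    inputs=[DatasetReference(reference_name="BlobOrdersDataset")],
    outputs=[DatasetReference(reference_name="SqlOrdersDataset")],
    source=BlobSource(),
    sink=AzureSqlSink(),
)

# Register (or update) the pipeline in the target data factory.
adf.pipelines.create_or_update(
    "my-resource-group",
    "my-data-factory",
    "CopyOrdersPipeline",
    PipelineResource(activities=[copy_orders]),
)
```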

According to Microsoft, Azure Data Factory is "more of an Extract-and-Load (EL) and Transform-and-Load (TL) platform rather than a traditional Extract-Transform-and-Load (ETL) platform." Part of understanding ETL with Azure is knowing that Azure Data Factory is more focused on orchestrating and migrating the data itself, rather than performing complex data transformations during the data migration process.

In addition, Azure Data Factory is technically not a full ETL tool on its own: it defines control flows that execute various tasks, which may or may not act on a data source. Until recently, Azure Data Factory did not support data flows that actually move and transform information. That has since changed, broadening its appeal to users.

Microsoft regularly adds features to Azure Data Factory. In December 2020, the company introduced a new service called Azure Purview, which helps organizations adhere to data governance frameworks like HIPAA when processing and sharing sensitive data.

Related reading: Why Data Engineers Should Consider Microsoft Azure

What is SSIS?

Microsoft SSIS (SQL Server Integration Services) is an on-premises data migration and integration tool that is part of the Microsoft SQL Server database software.

SSIS first appeared in SQL Server 2005 as a replacement for Microsoft's Data Transformation Services (DTS) toolkit. Before Azure Data Factory arrived, SSIS was Microsoft's dominant tool for building data integration and transformation pipelines to and from SQL Server.

With a wide range of capabilities, SSIS includes some helpful features, such as:

  • Executing SQL statements
  • Collecting, cleansing, and merging data sources
  • Extracting data from sources such as databases (SQL Server, Oracle, Db2, etc.) and Excel spreadsheets
  • Defining ETL data sources and targets
  • User-friendly graphical tools and wizards

Despite the arrival of Azure Data Factory, SSIS isn’t going away anytime soon. You could even say that the two tools have a friendly rivalry right now. Newer versions of Azure Data Factory include the Integration Runtime, a feature that offers data integration capabilities across different network environments. (Microsoft released the latest edition of Integration Runtime in September 2021.) In particular, this feature allows Azure Data Factory to execute SSIS packages (automated import and export pipelines between different data sources).

What are Mapping Data Flows?

With the shift to the public cloud, Microsoft has had to rethink its ETL and data migration offerings. SSIS is suitable for on-premises and IaaS (infrastructure as a service) workloads, but not so much for the public cloud. 

Mapping Data Flows is a feature in Azure Data Factory that became available in October 2019. With Mapping Data Flows, Azure Data Factory becomes a more complete ETL solution, combining control flows and data flows to move information both into and out of data warehouses.

By using Mapping Data Flows, Azure customers can build data transformations with an easy-to-use visual interface, all without having to write lines of code. They can then execute these data flows as activities within Azure Data Factory pipelines.

In the words of Mike Flasko, partner director of product management at Microsoft: “Data Factory now empowers users with a code-free, serverless environment [Mapping Data Flows] that simplifies ETL in the cloud and scales to any data size, no infrastructure management required.”

Mapping Data Flows’ WYSIWYG environment gives Azure Data Factory users additional flexibility to develop big data pipelines that best fit their needs, whether that means code-first or no-code. This broadens the service's appeal to people who might be put off by tools that require advanced coding skills.

Related reading: What Is No-Code?

The ETL activities supported by Mapping Data Flows include:

  • Join
  • Aggregate
  • Pivot
  • Unpivot
  • Split
  • Lookup
  • Sort

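Mapping Data Flows are built in a visual canvas rather than in code, but for intuition, these operations correspond to familiar data-frame transformations. Here is a rough pandas analogue (the frames and columns are hypothetical):

```python
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_id": ["a", "a", "b"],
    "amount": [20.0, 35.0, 15.0],
})
customers = pd.DataFrame({"customer_id": ["a", "b"], "region": ["EMEA", "APAC"]})

# Join: enrich orders with customer attributes.
joined = orders.merge(customers, on="customer_id", how="left")

# Aggregate: total spend per region.
totals = joined.groupby("region", as_index=False)["amount"].sum()

# Pivot: one column per region, one row per customer.
pivoted = joined.pivot_table(index="customer_id", columns="region",
                             values="amount", aggfunc="sum")

# Sort: rank regions by total spend.
print(totals.sort_values("amount", ascending=False))
```
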
Mapping Data Flows are an important step in resolving the well-documented data scientist shortage that’s currently plaguing the tech industry. “Citizen data scientists”—non-technical employees who need access to data-driven insights—can use Mapping Data Flows to build ETL pipelines that simplify the data integration and transformation process.  

Microsoft ETL: SSIS or Azure Data Factory's Mapping Data Flows?

With all that said, what’s the best way to do ETL in Azure Data Factory?

Mapping Data Flows are the newest way to perform ETL in Azure Data Factory, but they’re far from the only way. Executing SSIS packages from within Azure Data Factory is certainly still a viable way to maintain your on-premises data workloads, thanks to Azure Data Factory's popular Integration Runtime feature.

Both Mapping Data Flows and SSIS dramatically simplify the process of constructing ETL data pipelines. SSIS can run on-premises, in the cloud, or in a hybrid cloud environment, which gives you plenty of flexibility. Note that Mapping Data Flows is currently available for cloud data migration workflows only.

So, should you combine SSIS with Azure Data Factory? The right answer depends on the specifics of your situation. Azure Data Factory is a robust tool that’s great for handling large volumes of data in the cloud, while SSIS is more lightweight and better suited to smaller jobs. Also consider whether running both technologies is worth the overhead of keeping them from stepping on each other's toes. If you lack experience with these tools, consider an automated ETL platform that does the hard work for you. (More on that in the next section.)

Related reading: Allow Integrate.io Access To My Data on Azure Blob Storage

Azure Data Factory Alternatives

Despite its full feature set and positive reception, Azure Data Factory has a few crucial limitations you should know about as you work on understanding ETL with Azure. Most obviously, Azure Data Factory is best suited to Azure customers who primarily need to integrate data from Microsoft and Azure sources quickly and easily.

Azure Data Factory has around 90 native connectors, including Amazon Redshift, MongoDB, and Salesforce. However, the tool's integrations are a lot less extensive than those of some other data integration solutions.

That's where Integrate.io comes in. Integrate.io is an ETL data integration platform that makes it easy to construct pipelines from all your data sources into a cloud data warehouse. With a simple drag-and-drop interface and over 100 pre-built integrations, Integrate.io enables you to build powerful, information-rich ETL workflows so that you can start getting smarter business insights right away.

In fact, according to the business software review website G2, Integrate.io has an average rating of 4.3 out of 5 stars. It remains one of the most popular data integration platforms out there. 

Here's what users say about the platform:

  • "I like that Integrate.io is user-friendly. You don't need to know code to connect systems so having a technical background is a plus but it's not required. I also really appreciate the support. If we run into an issue or are stumped on a workaround, the support staff guides us correctly to a solution." (An administrator in financial services.)
  • "Integrate.io support is top-notch. They are constantly striving to ensure that you are having the best experience possible with their product." (An administrator in marketing/advertising.)
  • "Integrate.io is very flexible which helps us meet our needs." (Neil A.)

Understanding Microsoft ETL with Azure Data Factory: How Integrate.io Can Help

Azure Data Factory is a robust and mature solution for integrating structured, semi-structured, and unstructured data from sources such as Microsoft SQL Server, Azure SQL Database, Azure Blob Storage, and Azure Table Storage. It also integrates well with Microsoft’s BI and analytics solutions, such as Power BI and Azure HDInsight.

However, if you’re looking for a cloud data integration solution with a greater range than Azure Data Factory, give Integrate.io a try. 

Integrate.io extracts data from multiple sources, transforms that data into the correct format and loads it to a final destination for analytics, providing users with the real-time business intelligence they require. Plus, Integrate.io users get world-class customer service, simple pricing, a powerful REST API, and over 100 out-of-the-box native connectors that simplify data pipeline-building.

The Integrate.io data integration platform includes many benefits that might appeal to Azure users, such as integrations with Azure SQL Database and Azure Blob Storage.

Want to find out how Integrate.io can help you build data-driven workflows and get innovative business insights? Schedule a demo today with our team of data integration experts for an understanding of Microsoft ETL with Azure Data Factory.