Do you want to learn how to manage and execute SSIS inside of Azure using “Lift and Shift”? In a recent webinar, Manuel Quintana discussed some of the potential issues you could encounter and how the approach compares to Azure Data Factory.
If you’re looking for a new and better deployment option, I’d like to tell you about a relatively new offering from Microsoft, Azure SQL Managed Instance. It has nearly 100% feature compatibility with the latest on-premises SQL Server Enterprise Edition database engine.
Confession: I put a lot of subtexts in this blog post in an attempt to catch how people may be describing their move from SSIS to ADF, from SQL DBs to SQL DWs, or from scheduled to event-based data ingestion. The purpose of this post is to give you a visual picture of how our well-loved “traditional” tools of on-prem SQL Databases, SSIS, SSAS and SSRS are being replaced by the Azure tool stack. If you are moving from “Traditional Microsoft” to “Azure Microsoft” and need a road map, this post is for you.
Summary of the Matter: If you only read one thing, please read this: transitioning to Azure is absolutely “doable”, but do not let anyone sell you “lift and shift”. Azure data architecture is a new way of thinking. Decide to think differently.
First Determine Added Value: Below are snippets from a slide deck I shared during Pragmatic Works’ 2018 Azure Data Week. (You can still sign up for the minimal cost of $29 and watch all 40 recorded sessions; just click here.) However, before we begin, let’s have a little chat. Why in the world would anyone take on an Azure migration if their on-prem SQL database(s) and SSIS packages are humming along with optimum efficiency? The first five reasons given below are my personal favorites.
Cost (scale up, scale down)
Event Based File Ingestion
File based history (SCD2 equivalent but in your Azure Data Lake)
Support for Near Real Time Requirements
Support for Unstructured Data
Large Data Volumes
Offset Limited Local IT Resources
Data Science Capabilities
Development Time to Production
Support for large audiences
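The “file-based history (SCD2 equivalent)” item above borrows the Slowly Changing Dimension Type 2 pattern from data warehousing: instead of overwriting a changed record, you close out the old version and open a new one, preserving history. As a minimal sketch of that core idea in plain Python (the field names and in-memory structure here are illustrative assumptions, not tied to any particular Azure service or file layout):

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Apply SCD Type 2 logic: close changed rows, append new current rows.

    dimension: list of dicts with keys: key, value, start_date, end_date
               (end_date of None marks the current row for a key)
    incoming:  dict mapping business key -> latest observed value
    Returns an updated copy of the dimension with history preserved.
    """
    updated = [dict(row) for row in dimension]  # work on a copy
    current = {row["key"]: row for row in updated if row["end_date"] is None}
    for key, value in incoming.items():
        row = current.get(key)
        if row is None:
            # Brand-new key: open a fresh current row
            updated.append({"key": key, "value": value,
                            "start_date": today, "end_date": None})
        elif row["value"] != value:
            # Changed value: close the old row and open a new current one
            row["end_date"] = today
            updated.append({"key": key, "value": value,
                            "start_date": today, "end_date": None})
    return updated
```

In the Azure Data Lake version of this idea, the “closed” versions simply remain as dated files in the lake rather than rows with end dates, but the history-preserving principle is the same.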
Each of the reasons given above is a minimum one-hour working session on its own, but I’m sharing my thoughts in brief to help you get started compiling your own list. Please also look at the following diagram (Figure 1) and note two things: a.) the coinciding “traditional” components and b.) the value add boxed in red.
In his Azure Data Week session, Azure Data Factory – Movement to and in the Cloud, Chris Seferlis takes us through a traditional SSIS package that ETLs the data and presents it for reporting, then compares it to the equivalent process in Azure Data Factory, with some great tips and a roadmap.
There were many questions he was unable to answer during his session, and we’re happy to share the answers with you now. If you missed Chris’ session or the entire week, you can still purchase access to the recordings by visiting azuredataweek.com.
If you’re new to using Integration Services within Azure Data Factory, you may notice that some packages take a bit longer to run than they would have on-prem. Today I’ll share a couple of simple and effective ways, drawn from our own experience, to improve their performance.
Companies across the globe are moving their data to the cloud. How does the ETL developer fit into that picture? What options do you have for loading and moving data located in Microsoft’s Azure cloud service? Can you continue to use SSIS, or do you have to use Azure Data Factory? In this webinar, Mitchell Pearson shows how you can use SSIS, a tool you’re already familiar with, to interact with common Azure resources like Azure Blob Storage and Azure SQL Database.
In yesterday’s post I introduced you to Azure Data Factory Version 2 (ADF V2) and the visual tools that were added. Today, I’d like to talk about developing and deploying SSIS packages in Azure Data Factory V2. This post will be quite brief because, if you’re using Visual Studio and SQL Server Data Tools to build your packages, not much changes.