Continuing our Azure Every Day mini-series on Azure Databricks, I'll be covering some key topics within Databricks such as Azure Key Vault, storage accounts, Power BI and DevOps. If you're just starting out with Databricks, you may want to check out our previous posts on Databricks 101 and Getting Started with Azure Databricks. Today's post is focused on accessing Azure Storage accounts.
Want to learn more about event-driven ELT? Extract, Load, Transform (ELT) is a process where data is extracted from the source, loaded into a staging table in the database, transformed where it sits in the database, and then loaded into the target database or data warehouse. In a recent webinar, Principal Consultant Michael French gives a practical demonstration of how to move data from Azure Blob Storage to an Azure SQL Database using Azure Data Factory and Logic Apps.
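To make the extract-load-then-transform ordering concrete, here is a minimal, self-contained sketch of the ELT pattern. It uses SQLite purely as a stand-in for the target database (in the webinar's scenario the source is Azure Blob Storage and the target is Azure SQL Database, orchestrated by Data Factory); the table and column names are illustrative only.

```python
import sqlite3

# Extract: raw rows pulled from the source (here, an in-memory sample
# standing in for files landed in Blob Storage).
raw_rows = [("2024-01-01", "widget", "19.99"),
            ("2024-01-02", "gadget", "5.00")]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load: land the data as-is in a staging table -- no transformation yet,
# which is what distinguishes ELT from ETL.
cur.execute("CREATE TABLE stg_sales (sale_date TEXT, product TEXT, amount TEXT)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", raw_rows)

# Transform: shape the data where it sits, using the database engine,
# then insert the result into the target table.
cur.execute("CREATE TABLE fact_sales (sale_date TEXT, product TEXT, amount REAL)")
cur.execute("""
    INSERT INTO fact_sales
    SELECT sale_date, UPPER(product), CAST(amount AS REAL)
    FROM stg_sales
""")
conn.commit()

print(cur.execute("SELECT * FROM fact_sales").fetchall())
```

The key point the sketch illustrates is that the heavy lifting (type casting, cleansing) happens inside the database after loading, rather than in a separate transformation engine before it.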
I’m excited to tell you more about the preview of the second generation of Azure Data Lake Store. One reason I’m excited about this preview is that we often get asked whether to use Data Lake Store or Blob Storage for storing files – for instance, in a data warehouse load scenario where file storage is part of the pattern.
In today’s post I’d like to talk about what WORM (Write Once, Read Many) storage is and how it can help with compliance and security. With the recently added WORM storage in Azure, Microsoft supports immutable storage in their blob storage accounts, allowing various regulated industries and legal situations to be properly supported in Azure.
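As a rough sketch of what this looks like in practice, a time-based retention (WORM) policy can be applied to a blob container with the Azure CLI. The resource group, account, and container names below are placeholders, and the commands assume you are already logged in with `az login`:

```shell
# Illustrative only: resource names are placeholders.
# Create a time-based retention policy on a container; blobs in it
# cannot be modified or deleted for 30 days after they are written.
az storage container immutability-policy create \
    --resource-group my-rg \
    --account-name mystorageacct \
    --container-name audit-logs \
    --period 30

# Once validated, lock the policy. Locking is irreversible: the
# retention period can then only be extended, never shortened.
az storage container immutability-policy lock \
    --resource-group my-rg \
    --account-name mystorageacct \
    --container-name audit-logs \
    --if-match "<etag from the create step>"
```

The unlocked/locked distinction is what gives the compliance guarantee: an unlocked policy is for testing, while a locked policy is what regulators and legal holds rely on.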
In today’s post I’ll look at some considerations for choosing between Azure Blob Storage and Azure Data Lake Store when processing data to be loaded into a data warehouse. My basis here is a reference architecture published by Microsoft (see the diagram below).
There’s a lot of talk about storage options. When working with customers, I often introduce them to Azure through storage, since it’s a great way to leverage cloud assets in a positive, non-threatening way.