38 years behind the keyboard: I started with PC-DOS and BASIC at age 12, when not solving the Rubik's cube. Lead Consultant, architect, online instructor, and husband; I enjoy nature and our golden retrievers.
My post today in our Azure Every Day Databricks mini-series is about Databricks Change Data Capture (CDC). A common use case is capturing changes from one or many source systems and merging them into a set of Databricks Delta tables.
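In Databricks this merge is typically expressed with Delta Lake's `MERGE INTO` statement. As a minimal, runnable sketch of the same upsert/delete semantics (the table, keys, and operation codes here are purely illustrative), the logic looks like:

```python
# Sketch of CDC merge semantics, mirroring a Delta MERGE INTO:
#   WHEN MATCHED AND op = 'delete' THEN DELETE
#   WHEN MATCHED THEN UPDATE SET *
#   WHEN NOT MATCHED THEN INSERT *
# `target` stands in for the Delta table, keyed by primary key.

def apply_cdc_batch(target, changes):
    """Apply a batch of change events to the target; the latest event wins."""
    for change in changes:
        key = change["key"]
        if change["op"] == "delete":
            target.pop(key, None)        # matched + delete flag -> DELETE
        else:                            # 'insert' or 'update'
            target[key] = change["row"]  # matched -> UPDATE, unmatched -> INSERT
    return target

customers = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
batch = [
    {"op": "update", "key": 1, "row": {"name": "Ada L."}},
    {"op": "delete", "key": 2},
    {"op": "insert", "key": 3, "row": {"name": "Edsger"}},
]
print(apply_cdc_batch(customers, batch))
# → {1: {'name': 'Ada L.'}, 3: {'name': 'Edsger'}}
```

On a real cluster the same three branches become the `WHEN MATCHED` / `WHEN NOT MATCHED` clauses of a single `MERGE INTO` against the Delta table.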
In our ongoing Azure Databricks series within Azure Every Day, I'd like to discuss connecting Databricks to Azure Key Vault. If you're unfamiliar, Azure Key Vault lets you manage secrets, keys, and certificates, along with other sensitive information, all stored within the Azure infrastructure.
In this post in our Databricks mini-series, I'd like to talk about integrating Azure DevOps with Azure Databricks. Databricks connects easily with DevOps and requires two primary things. The first is a Git repository, where we store our notebooks so we can look back and see how they have changed. The second is the DevOps pipeline, which lets you deploy notebooks to different environments.
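One way a pipeline step can push a notebook from the repository into a workspace is the Databricks REST API's workspace import endpoint (`POST /api/2.0/workspace/import`), which expects base64-encoded source. A minimal sketch that only builds the request payload; the workspace path and host are hypothetical, and the actual HTTP call (which needs a token) is left as a comment:

```python
import base64
import json

def build_notebook_import(path, source_code, language="PYTHON", overwrite=True):
    """Build the JSON body for the Databricks workspace import endpoint."""
    return {
        "path": path,           # target path in the workspace
        "format": "SOURCE",     # import raw source, not DBC/HTML
        "language": language,
        "overwrite": overwrite, # replace the existing notebook on redeploy
        "content": base64.b64encode(source_code.encode("utf-8")).decode("ascii"),
    }

payload = build_notebook_import("/Shared/etl/load_customers", "print('hello')")
print(json.dumps(payload, indent=2))

# In the pipeline step, something like (token from a DevOps secret variable):
# requests.post(f"{host}/api/2.0/workspace/import",
#               headers={"Authorization": f"Bearer {token}"}, json=payload)
```

Setting `overwrite` to true is what makes the step idempotent, so the same pipeline can promote the notebook through dev, test, and production workspaces.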
Continuing our Azure Every Day mini-series on Azure Databricks, I will be covering some key topics within Databricks, such as Azure Key Vault, storage accounts, PowerPoint, and DevOps. If you're just starting out with Databricks, you may want to check out our previous posts, Databricks 101 and Getting Started with Azure Databricks. Today's post focuses on accessing Azure Storage accounts.
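For ADLS Gen2 access with an account key, a common pattern is to set a Spark config of the form `fs.azure.account.key.<account>.dfs.core.windows.net` and then read from an `abfss://` URI. A small sketch that just assembles those strings; the account, container, and path names are made up, and the Spark calls are shown as comments since they need a cluster:

```python
def account_key_conf(account):
    """Spark config key that carries the storage account key."""
    return f"fs.azure.account.key.{account}.dfs.core.windows.net"

def abfss_uri(container, account, path):
    """ABFSS URI for a path inside a container."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

conf_key = account_key_conf("mystorageacct")
uri = abfss_uri("raw", "mystorageacct", "sales/2020/")
print(conf_key)  # → fs.azure.account.key.mystorageacct.dfs.core.windows.net
print(uri)       # → abfss://raw@mystorageacct.dfs.core.windows.net/sales/2020/

# On a cluster (the key should come from a secret scope, not plain text):
# spark.conf.set(conf_key, dbutils.secrets.get("kv-scope", "storage-key"))
# df = spark.read.parquet(uri)
```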
Data security is of utmost importance for every organization, and proper data governance policies help ensure it. There's a lot to cover in an introduction to data governance, so I've split this Azure Every Day blog/video into two parts. Let's start with the basics.
Did you know that the Azure Portal has a feature that lets you connect using PowerShell in the cloud? With it, there's no need for a local PowerShell session: simply connect, authenticate, and start running commands against your Azure environment. In this post, I'll demo how to use this handy feature.
If you’re working with Azure, hopefully you’re already taking advantage of Azure Analysis Services. It offers many benefits: you can scale resources to match your business needs, easily visualize your data in your favorite data visualization tool (such as Power BI), and govern, deploy, test, and deliver your BI solution with confidence.