Azure Databricks

How to Upload and Query a CSV File in Databricks

Welcome to another post in our Azure Every Day mini-series covering Databricks. Are you just starting out with Databricks and need to learn how to upload a CSV? In this post, I’ll show you how to upload and query a CSV file in Databricks. For a more detailed, step-by-step view, check out my video at the end of the post. Let’s get started!
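Once a CSV has been uploaded, querying it from a notebook is a few lines of PySpark. This is a minimal sketch: the file path is the hypothetical location Databricks shows after an upload, and `spark` and `display` are provided by the notebook environment, so this only runs inside Databricks.

```python
# Minimal sketch: read an uploaded CSV from DBFS and query it with Spark SQL.
# The path below is a placeholder -- use the path shown after your own upload.
df = (spark.read
      .option("header", "true")       # first row contains column names
      .option("inferSchema", "true")  # let Spark guess column types
      .csv("/FileStore/tables/sales.csv"))

# Register a temporary view so the data can be queried with plain SQL
df.createOrReplaceTempView("sales")
display(spark.sql("SELECT COUNT(*) AS row_count FROM sales"))
```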

Andie Letourneau

How to Merge Data Using Change Data Capture in Databricks

My post today in our Azure Every Day Databricks mini-series is about Databricks Change Data Capture (CDC). A common use case is for customers looking to capture changes from one or many source systems into a set of Databricks Delta tables. The goal is to merge those changes into Databricks Delta.
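The merge itself is typically expressed with Delta Lake's `MERGE INTO` statement. A hedged sketch follows; the table names (`customers`, `updates`) and columns (`id`, `_deleted`) are illustrative, and `spark` is the Databricks notebook session.

```python
# Sketch: merge captured changes into a Delta table with MERGE INTO.
# Rows flagged as deleted in the source are removed, matches are updated,
# and new rows are inserted. Names here are placeholders.
spark.sql("""
  MERGE INTO customers AS target
  USING updates AS source
    ON target.id = source.id
  WHEN MATCHED AND source._deleted = true THEN DELETE
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *
""")
```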

Jon Bloom

Databricks and Azure Key Vault

In our ongoing Azure Databricks series within Azure Every Day, I’d like to discuss connecting Databricks to Azure Key Vault. If you’re unfamiliar, Azure Key Vault allows you to maintain and manage secrets, keys, certificates, and other sensitive information, all stored within the Azure infrastructure.
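From a notebook, a Key Vault-backed secret is read through a Databricks secret scope with `dbutils.secrets.get`. The scope and key names below are placeholders; `dbutils` only exists inside the Databricks environment.

```python
# Sketch: read a secret via a Databricks secret scope backed by Azure Key Vault.
# "my-keyvault-scope" and "sql-password" are placeholder names.
password = dbutils.secrets.get(scope="my-keyvault-scope", key="sql-password")
# Note: Databricks redacts secret values if you try to print them in a notebook.
```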

Jon Bloom

Custom Libraries in Databricks

This week’s Databricks post in our mini-series focuses on adding custom code libraries in Databricks. Databricks ships with many curated libraries built into the runtime, so you don’t have to pull them in yourself. These pre-installed Python, R, Java, and Scala libraries are listed in the System Environment section of the Databricks release notes.
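For a library that isn't in the runtime, one common approach (on recent Databricks runtimes) is the `%pip` magic in a notebook cell, which installs the package for the notebook's session. The package name here is just an example.

```
# Notebook cell sketch: install an extra Python library for this session.
# "great-expectations" is an example package name.
%pip install great-expectations
```

Libraries can also be attached to a cluster through the workspace UI if they need to be available to every notebook on that cluster.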

Jeff Burns

How to Integrate Azure DevOps within Azure Databricks

In this post in our Databricks mini-series, I’d like to talk about integrating Azure DevOps within Azure Databricks. Databricks connects easily with DevOps and requires two primary things. The first is a Git repository, which is where we store our notebooks so we can look back and see how things have changed. The second is the DevOps pipeline, which allows you to deploy notebooks to different environments.
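Under the hood, a deployment step in such a pipeline can push a notebook into a target workspace with the Databricks REST API. This is a hypothetical sketch, not the post's exact method: the workspace URL, token, and paths are placeholders, and in a real pipeline the token would come from a secret variable.

```python
import base64
import requests

# Hypothetical sketch of the deployment step a DevOps pipeline performs:
# importing a notebook into a workspace via the Databricks REST API.
host = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"  # in a pipeline, read this from a secret

# Notebook source must be base64-encoded for the workspace import endpoint
with open("etl_notebook.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Shared/etl_notebook",  # placeholder target path
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```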

Jon Bloom

How to Create an Azure Key Vault in Databricks

Welcome to another edition of our Azure Every Day mini-series on Databricks. In this post, I’ll walk you through creating a Key Vault and setting it up to work with Databricks. I’ve created a video demo where I will show you how to: set up a Key Vault, create a notebook, connect to a database, and run a query.
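The notebook side of those steps can be sketched as below: pull the database password from a Key Vault-backed secret scope, then query over JDBC. Every name here (scope, key, server, table, user) is a placeholder, and `spark`, `dbutils`, and `display` are Databricks notebook globals.

```python
# Sketch: use a Key Vault-backed secret to query an Azure SQL database.
# Scope, key, server, table, and user names are all placeholders.
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"
password = dbutils.secrets.get(scope="my-keyvault-scope", key="sql-password")

df = (spark.read
      .format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "dbo.Orders")
      .option("user", "dbadmin")
      .option("password", password)
      .load())
display(df.limit(10))
```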

Leslie Andrews

Real-time Structured Streaming in Azure Databricks

Do you want to learn real-time Structured Streaming in Azure Databricks? In this recent webinar with Principal Consultant Brian Steele, you’ll learn all about Structured Streaming, the main model for handling streaming datasets in Azure Databricks.
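As a taste of what Structured Streaming looks like, here is a minimal hedged sketch: read JSON events from a landing directory as a stream, aggregate, and continuously write the result out as Delta. The paths and schema are placeholders, and `spark` is the Databricks session.

```python
# Minimal Structured Streaming sketch: stream JSON files from a directory,
# count events per device, and write the running counts to a Delta path.
# Paths and the schema are placeholders.
events = (spark.readStream
          .format("json")
          .schema("device STRING, temperature DOUBLE, ts TIMESTAMP")
          .load("/mnt/landing/events/"))

counts = events.groupBy("device").count()

(counts.writeStream
  .format("delta")
  .outputMode("complete")  # rewrite the full aggregate on each trigger
  .option("checkpointLocation", "/mnt/checkpoints/device_counts")
  .start("/mnt/delta/device_counts"))
```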

3Cloud

How to Connect Azure Databricks to an Azure Storage Account

Continuing our Azure Every Day mini-series on Azure Databricks, I’ll be covering some key topics within Databricks, such as Azure Key Vault, storage accounts, PowerPoint, and DevOps. If you’re just starting out with Databricks, you may want to check out our previous posts on Databricks 101 and Getting Started with Azure Databricks. Today’s post is focused on accessing Azure Storage accounts.
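One common pattern for reaching an ADLS Gen2 storage account is to set the account key in the Spark configuration, pulling the key itself from a secret scope rather than hard-coding it. The account, scope, key, and container names below are placeholders, and `spark`/`dbutils` are Databricks notebook globals.

```python
# Sketch: give the session access to an ADLS Gen2 account using an account
# key stored in a secret scope. Account/scope/key/container names are
# placeholders.
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-keyvault-scope", key="storage-account-key"),
)

# Read directly from the storage account over the abfss:// protocol
df = spark.read.parquet("abfss://data@mystorageacct.dfs.core.windows.net/sales/")
```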

Jon Bloom