Jon Bloom

38 years behind the keyboard, starting with PC-DOS and BASIC at age 12, when not solving the Rubik’s cube. Lead Consultant, architect, online instructor, and husband who enjoys nature and our golden retrievers.

Azure SQL Edge

What do you know about Azure SQL Edge? This relational database engine has all the goodness we’re used to in SQL Server but is geared for IoT and IoT Edge deployments. It gives you the capability to create a high-performance data storage and processing layer for your IoT applications.
Data is constantly being captured by IoT devices. Many industries, including oil and gas, mining, and even farming, are adopting massive, complex machines with tons of sensors that collect data. That data is collected, sorted through, and used in a variety of ways to assist those businesses in their operations. This is all IoT in action.

However, there are challenges: intermittent network access means communication must be asynchronous; IoT time series data often has gaps in time and missing values; and data retention and management are constrained because small devices have little storage.
In this video, I dive into more about Azure SQL Edge and how it can help with common challenges. Topics I cover are:
• Typical Azure SQL Edge Workloads – Data, Machine Learning, and Analytics
• Data Lifecycle in the Edge
• Benefits of SQL Edge including near real-time response and reduced IoT costs
• I’ll also discuss how Azure SQL Edge is NOT SQL Server – I point out the things you’re used to in SQL Server that you won’t get, as well as tell you what you do get from SQL Edge for your IoT data.
The last part of my video is spent on a demo of Azure SQL Edge, so be sure to check that out.

In summary, Azure SQL Edge is built for IoT workloads directly on the device, and it runs on Linux. There are many good resources out there to help you get started, and although it’s not the same as SQL Server, it is very comfortable and familiar to those of us who have worked with SQL Server. I suggest you give it a try for your IoT solutions and apps.
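To make that familiarity concrete, here’s a minimal sketch of working with SQL Edge from Python. The server address, credentials, table, and retention period are placeholders I’ve made up, but the pattern shows how a data retention policy can address the limited-storage challenge mentioned earlier.

```python
# Minimal sketch: connect to a local Azure SQL Edge instance and create a
# sensor table with a data retention policy. Server, database, credentials,
# and table names are placeholders -- adjust for your own deployment.
import os

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost,1433;DATABASE=IoTDemo;"
    f"UID=sa;PWD={os.environ['SQL_EDGE_PASSWORD']}",
    autocommit=True,
)
cursor = conn.cursor()

# Retention policy: rows older than 7 days (based on event_time) are cleaned
# up automatically, so the device's limited storage isn't exhausted.
cursor.execute("""
    CREATE TABLE dbo.SensorReadings (
        sensor_id   INT       NOT NULL,
        event_time  DATETIME2 NOT NULL,
        temperature FLOAT     NULL
    )
    WITH (DATA_DELETION = ON (FILTER_COLUMN = event_time,
                              RETENTION_PERIOD = 7 DAYS))
""")

# Inserts work the same way they would against any SQL Server database.
cursor.execute(
    "INSERT INTO dbo.SensorReadings (sensor_id, event_time, temperature) "
    "VALUES (?, SYSUTCDATETIME(), ?)",
    1, 21.5,
)
```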

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected]


Azure Data Modern Platform

Are you using your data as an asset to make predictions about your business? Many businesses face common challenges: large volumes of incoming data they are unable to process and derive insights from; data silos that make it difficult to find and share data spread across multiple sources within the organization; and slow adoption of AI/ML and real-time analytics. A Modern Data Platform in Azure can help overcome these challenges.

First, let’s look at how we got here from the Microsoft business intelligence we’ve had for over 20 years. In the past, most products were on-premises: SQL Server handled our storage and ETL, master data, and data quality, and static reporting was done with SQL Server Reporting Services (SSRS). Security was in multiple places, and change management was manual with much room for error. Our data dictionary consisted of tables and field types, typically in Excel; it got outdated quickly and was hard to find because it was buried somewhere on the network.

Now, in the cloud, the transition is surely for the better:
• Storage – we have a variety of options including Azure SQL Database, Azure Synapse, blob storage and Cosmos DB.
• ETL – we now use Azure Data Factory or Databricks.
• Delivery – reports are in Power BI in the form of dashboards and SSRS reports are being converted to paginated reports.
• Master Data and Data Quality – we have a variety of 3rd party vendors we are partnered with.
• Security – we can leverage Azure Key Vault, Azure RBAC (roles assigned to people and groups), and service principals for some of our automation (see the sketch after this list).
• Change Management – Azure DevOps Boards enable us to organize tasks into sprints, bite off a chunk of work, then update the ticket when it’s complete, test, and move to production.
• Source Code Repository – Azure DevOps Git.
• Data Dictionary, Data Lineage, and Data Sensitivity – we now have Azure Purview, which fills in a lot of the gaps we had in the past.
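To illustrate the security piece called out above, here’s a minimal sketch of automation reading a secret from Azure Key Vault with a service principal, so connection strings never live in source control. The vault URL and secret name are placeholders, not from a real environment.

```python
# Minimal sketch: automation fetching a secret from Azure Key Vault.
# DefaultAzureCredential picks up a service principal from the environment
# (AZURE_CLIENT_ID, AZURE_TENANT_ID, AZURE_CLIENT_SECRET) when running
# unattended. Vault URL and secret name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://my-data-platform-kv.vault.azure.net",
    credential=credential,
)

# The secret stays in Key Vault; the service principal's RBAC rights
# determine who can read it.
sql_connection_string = client.get_secret("sql-connection-string").value
```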

If we look at the analytical maturity curve (please see the chart outlining this in my video), we began with basic, static reports, limited ad hoc analysis, and a single version of the truth, with reports mainly created by IT. Now we’ve moved to self-service reporting with extensive ad hoc analysis, and we’re no longer dependent on IT for report creation. With Power BI we have empowered users to build their own reports to investigate the data and gain insights to run the business, increase sales, streamline processes, and lower costs.
Predictive analytics is where we build models and let machine learning process the data and look for patterns over time. From there we move along the analytical maturity curve to real-time analytics and, finally, real-time predictive modeling and mining, which is event-based and allows for data discovery.

As you can see, we’ve come a long way over the years, and a Modern Data Platform in Azure gives you a solid platform on which to run your business in the cloud and to better manage your data. Throw in data governance and Azure can handle most, if not all, of your data needs.

 

If you want to learn more about an Azure Modern Data Platform for your business, our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected].


Azure Spring Cloud

Have you heard of Azure Spring Cloud? Azure Spring Cloud is a fully managed service for Spring Boot apps that allows you to focus on building the apps that run your business without having to manage infrastructure. It lets you bring Spring Boot applications, which are Java-based, to the cloud as microservices.

In this post, I’ll walk through how to get started with Spring Cloud to deploy a simple Azure Spring Cloud microservices application. You can also check out the video I’ve included to see my demo in action or follow the quick start document found in Microsoft Docs.

  • First, be sure to check out the list of prerequisites you’ll need to get started.
  • You’ll start in the Spring Initializr, where you’ll enter your parameters and generate the code. It downloads a zip file which you’ll use later in the project. You can find several cloud examples for Spring Cloud on the GitHub site.
  • Next, we must install a couple of things: the Spring Framework and the Azure Toolkit for IntelliJ. IntelliJ is the IDE where you’ll write the code. There is a Community Edition and an Ultimate Edition; I used the Community Edition, but the Ultimate Edition has a 30-day trial. We also have to install the Azure CLI and Apache Groovy.
  • Now, let’s talk about the IDE environment. IntelliJ is where you can create new projects, hook them up to Git, and do many other things.
  • For my project, I started in IntelliJ and created the shell by clicking File > New to create a project and its structure. I then imported the zip file I got earlier from the Spring Initializr, and we can see the source files it generated for us, including the main Java file.
  • Configuration settings: There is an icon at the top right, and clicking it is where you’ll specify different settings. One is the project SDK, for which you’ll need a version of Java. After trying a few different versions, I had the best luck with version 16. Remember, you also must set your environment variables, classpath, and so on so the tooling can talk to Java.

  • Next, we set our modules. You can set the source and resources, run a test, and set files that you want to exclude.
  • We can also add libraries. If I go back to the Spring Initializr screen, you can see I set this up to be a Maven project using the Java language. In the project settings, we have a tab for artifacts, and in platform settings, we have a tab for SDKs and global libraries.

  • Back to our application build. The next thing we do is build the application, and after a few tries, we were able to compile our Spring Cloud app.
  • Another feature we have is the ability to log into Azure and select our subscription – in my case, I used my Visual Studio MPN subscription. The next step is deploying my app.
  • To do this, I click Run and then create a configuration. It pre-populated the artifact for me, but you need to specify your subscription and then run it. It only took a few minutes, and mine ran without errors.
  • So, we created an Azure Spring Cloud instance, with Log Analytics and Application Insights added by default. If I click on the Azure Spring Cloud instance I created, it takes me to another page where I can click on Apps, and there’s the app I created. I can click on that to see it’s up and running, so it was successful end to end.

Why would you want to use Azure Spring Cloud? You can use Java to build microservices that connect to a variety of different sources on the Azure side – for example, Cosmos DB, HDInsight, SQL Server Big Data Clusters, and Azure Synapse.

The nice thing here is that some of this code is pre-written for you. So, if you’re familiar with Java, it does much of the heavy lifting with just a few lines of code. I highly recommend it, and I’m looking forward to using this on an upcoming project.


Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].


How to Merge Data Using Change Data Capture in Databricks

My post today in our Azure Every Day Databricks mini-series is about Databricks Change Data Capture (CDC). A common use case is customers looking to perform CDC from one or many sources into a set of Databricks Delta tables. The goal here is to merge those changes into Databricks Delta.
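To make the idea concrete, here’s a minimal sketch of that merge using the Delta Lake Python API in a Databricks notebook (where spark is already provided). The paths, column names, and operation flag are placeholders I’m assuming, not details from the post.

```python
# Minimal sketch of merging CDC changes into a Delta table.
# Paths, columns, and the "operation" flag are assumed placeholders;
# spark is predefined inside a Databricks notebook.
from delta.tables import DeltaTable

# Incoming change feed: one row per change, with an operation column
# indicating whether the row was inserted, updated, or deleted upstream.
changes = spark.read.format("delta").load("/mnt/raw/customer_changes")

target = DeltaTable.forPath(spark, "/mnt/delta/customers")

(target.alias("t")
    .merge(changes.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedDelete(condition="s.operation = 'DELETE'")  # remove deleted rows
    .whenMatchedUpdateAll()                                 # apply updates to matches
    .whenNotMatchedInsertAll()                              # bring in brand-new rows
    .execute())
```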


Databricks and Azure Key Vault

In our ongoing Azure Databricks series within Azure Every Day, I’d like to discuss connecting Databricks to Azure Key Vault. If you’re unfamiliar, Azure Key Vault allows you to maintain and manage secrets, keys, certificates, and other sensitive information, all stored within the Azure infrastructure.
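As a small preview, here’s a minimal sketch of pulling a secret from a Key Vault-backed secret scope inside a Databricks notebook, so credentials never appear in the notebook itself. The scope, key, and storage account names are placeholders.

```python
# Minimal sketch: read a secret from a Key Vault-backed secret scope inside
# a Databricks notebook. Scope, key, and account names are placeholders;
# dbutils and spark are provided by the notebook environment.
storage_key = dbutils.secrets.get(scope="keyvault-scope", key="storage-account-key")

# Use the secret, for example, to let Spark authenticate to a storage account
# without hard-coding or printing the value.
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.blob.core.windows.net",
    storage_key,
)
```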


How to Integrate Azure DevOps within Azure Databricks

In this post in our Databricks mini-series, I’d like to talk about integrating Azure DevOps within Azure Databricks. Databricks connects easily with DevOps and requires two primary things. The first is a Git repository, which is how we store our notebooks so we can look back and see how things have changed. The next important piece is the DevOps pipeline, which allows you to deploy notebooks to different environments.
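To give a feel for what a deployment step can look like, here’s a minimal sketch a pipeline task could run to push a notebook from the Git repo into a target Databricks workspace via the Workspace API. The workspace URL, token variable, and paths are assumptions on my part, not from the post.

```python
# Minimal sketch: deploy a notebook from the repo to a Databricks workspace,
# the kind of step a DevOps pipeline task might run. The workspace URL,
# token environment variable, and paths are placeholders.
import base64
import os

import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder URL
token = os.environ["DATABRICKS_TOKEN"]  # supplied as a pipeline secret

# Read the notebook source from the repo and base64-encode it for the API.
with open("notebooks/etl_load.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Shared/etl_load",  # target path in the workspace
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```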


How to Connect Azure Databricks to an Azure Storage Account

In continuation with our Azure Every Day mini-series on Azure Databricks, I will be covering some key topics within Databricks such as Azure Key Vault, storage accounts, PowerPoint and DevOps. If you’re just starting out with Databricks, you may want to check out our previous posts on Databricks 101 and Getting Started with Azure Databricks. Today’s post is focused on accessing Azure Storage accounts.
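As a preview, here’s a minimal sketch of mounting a blob container from a Databricks notebook, with the storage key pulled from a secret scope. The container, storage account, mount point, and secret names are placeholders.

```python
# Minimal sketch: mount an Azure Blob Storage container into the Databricks
# file system. Container, account, mount point, and secret names are
# placeholders; dbutils and spark are provided inside a Databricks notebook.
dbutils.fs.mount(
    source="wasbs://rawdata@mystorageaccount.blob.core.windows.net",
    mount_point="/mnt/rawdata",
    extra_configs={
        "fs.azure.account.key.mystorageaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="keyvault-scope", key="storage-account-key")
    },
)

# Once mounted, files can be read like any other path.
df = spark.read.csv("/mnt/rawdata/sales/2021/*.csv", header=True)
```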


An Introduction to Data Governance (Part 1 of 2)

Data security is of utmost importance for all organizations and is supported by proper data governance policies. There’s so much to cover in an introduction to data governance, so I’ve split this Azure Every Day blog/video into two parts. Let’s start with the basics.
