Hi, my name is Kevin Webb and I’m a Solutions Architect in the Data and Analytics practice here at 3Cloud. In our video today, we’re going to cover the event streaming feature found in the Real-Time Analytics workload inside Microsoft Fabric.
In this demo, we’re going to highlight a no-code experience in Fabric: creating an event stream, capturing event data from an Azure Event Hub, processing that data, and landing it in a lakehouse delta table. Before we jump in, a few items have already been created to save time for the demo: first, an Azure Event Hub that will serve as the source for our event stream, and second, a Fabric lakehouse that will be the destination for the data.
Please check out our other videos for more details on how to create lakehouses on Fabric.
Okay, so I’ve switched over to the Fabric environment and positioned myself in a workspace that I prepared for this demonstration. In this workspace we already have a lakehouse that we’re going to use for our weather data demo, and if you’ve seen some of our other videos, you know that when you create a lakehouse, you also get a SQL endpoint and a default Power BI dataset that are automatically provisioned. Now, the other item I’ve created is an Event Hub, so I’ve switched over to the Azure portal here just to show that we have this Event Hub created, and we’ll be referring to it later in the demonstration when we need to generate some test data for our event stream.
So the first thing we need to do is create the event stream, and we can do that a couple of different ways. From my workspace view here, I can click on the New button, then click on event stream and create one that way. I can also use the workload switcher here and switch over to Real-Time Analytics, and it’ll take me to a Real-Time Analytics home where I can click on event stream to create one. Alternatively, I can use the Create hub. The Create hub shows me all the different workloads and items that we can create within Fabric, and if I scroll down and find Real-Time Analytics, I can find event stream in there.
So I’m going to go ahead and click on event stream to create one. We’re going to call this ES Weather Data Demo. We’ll click Create, and then it’ll start provisioning our event stream, which will take a few minutes.
Okay, so once our event stream is done provisioning, we’re on the event stream design surface, and we’re given a default view with a new source that we need to create as the source of our stream, and a destination, the place where the stream data is going to go. This is where we do our work to set up those two components. Now, the first thing we need to do is define the source for our event stream. We click on New source and get a dropdown with a couple of different options, and we’re going to pick Azure Event Hubs because we already have one set up. We have a few properties to fill in here: a source name, a connection to our Event Hub, a data format, and a consumer group from our Event Hub. We’re going to call this Weather Data to give it a name. Now, I already have a connection to my Weather Data Event Hub set up, and the components of that connection are either a service principal or some other kind of credential that will give you access to the Event Hub. My data format is JSON, which is what I’m going to use for the Event Hub, and the consumer group from my Event Hub, which is filled in here for me, is the default consumer group. I’m going to go ahead and create that.
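Since the source’s data format is set to JSON, each event arriving from the Event Hub is a JSON document. As a rough illustration (the field names here are hypothetical, not the exact schema the sample data uses), a weather event payload might be built like this:

```python
import json

# Build a hypothetical weather event as a JSON string, illustrating
# the kind of payload the event stream source expects to receive.
def make_weather_event(temperature_c: float, humidity_pct: int, wind_kph: float) -> str:
    event = {
        "temperature": temperature_c,
        "humidity": humidity_pct,
        "windSpeed": wind_kph,
    }
    return json.dumps(event)

payload = make_weather_event(21.5, 60, 12.3)
print(payload)
```

Whatever the real schema looks like, the important part is that the format selected here matches what the Event Hub producer actually sends.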
It takes a few seconds, and now I have my source. Once I create the source, it takes a few minutes for it to provision the connection and the communication with the Event Hub. Eventually you will see an icon here that says streaming is configured, and the status is successful for the source.
So the next thing we’re going to do is set up our destination. I’m going to click on New destination, where I have a couple of options, and I’m going to pick Lakehouse. We have to fill in some properties for our destination. We’re going to call this LH Weather Data, and that’s just a name for our destination. I’m going to select the workspace that I’m in, which is web_demo, and that fills in the lakehouse options from that workspace. This is the lakehouse I’m going to use, LH Weather Demo, and then we need to specify a delta table. I don’t have any existing tables yet, so I’m going to use the new table option here: I type in weather data, and then I have to click this to add weather data as a new delta table. Now, this data format may be a little confusing. It’s not the data format of the table in the lakehouse, because that’s going to be a delta table; this is the format of what’s coming in, so it has to match what the Event Hub is producing. In this case, it’s JSON, and we’re going to go ahead and create that.
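To make that source/destination distinction concrete: the incoming events are JSON documents, but what lands in the lakehouse is rows in a delta table. A rough local simulation of that flattening step, using made-up events and hypothetical column names:

```python
import json

# Hypothetical incoming JSON events, as they might arrive from the Event Hub.
raw_events = [
    '{"temperature": 21.5, "humidity": 60, "windSpeed": 12.3}',
    '{"temperature": 18.0, "humidity": 72, "windSpeed": 8.9}',
]

# Conceptually, ingestion parses each JSON event into a row; in Fabric,
# those rows are then appended to the delta table in the lakehouse.
rows = [json.loads(e) for e in raw_events]
columns = sorted(rows[0].keys())

print(columns)
for row in rows:
    print([row[c] for c in columns])
```

This is only a sketch of the idea; the actual parsing and delta writes are handled by the event stream itself, with no code required.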
That takes a minute or two to provision, and we’ll eventually see that it’s ingesting into our lakehouse and our status is successful. Now that I have the event stream components set up, we need to generate some data to test our event stream. This is where the Event Hub we created earlier comes in. I’m going to use a cool feature of Event Hubs called Generate Data, where I can choose a predefined data set, Weather Data here, which creates a sample payload that I can then send as messages to my Event Hub. I click Send, and now I’ve got ten messages sent to my Event Hub that I can consume in my event stream.
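The Generate Data feature handles this from the portal UI, but the same kind of test messages could be sent programmatically with the azure-eventhub Python SDK. This is a sketch, not the portal’s implementation: the payload fields are invented, and the connection string and hub name are placeholders you would supply yourself.

```python
import json
import random

def generate_weather_messages(count: int = 10) -> list:
    """Build sample JSON payloads, loosely imitating the portal's
    Generate Data feature (field names here are illustrative)."""
    return [
        json.dumps({
            "temperature": round(random.uniform(-10, 35), 1),
            "humidity": random.randint(20, 100),
            "windSpeed": round(random.uniform(0, 50), 1),
        })
        for _ in range(count)
    ]

def send_to_event_hub(messages, conn_str, hub_name):
    """Send messages to an Event Hub as one batch.
    Requires `pip install azure-eventhub`; conn_str and hub_name
    are placeholders for your own Event Hub's values."""
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        conn_str, eventhub_name=hub_name
    )
    with producer:
        batch = producer.create_batch()
        for msg in messages:
            batch.add(EventData(msg))
        producer.send_batch(batch)

messages = generate_weather_messages(10)
# send_to_event_hub(messages, "<connection-string>", "<event-hub-name>")
```

Either way, the result is the same: ten JSON messages land on the Event Hub for the event stream to consume.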
I’m going to switch back over now to my Fabric view and see what happened on my event stream after I sent those messages. If I click on my source and go to Data preview, I can see the incoming data that I put on my Event Hub. Then, if I look over at my destination and go to the same view, Data preview, here’s my data as it sits in my delta table. Now I just want to verify that it’s in the delta table in my lakehouse, so I’m going to go over to my lakehouse and open the SQL endpoint here so I can take a look at my table. There’s my table, weather data, so my process created it. Here’s my data preview of that table, and I can see that this data has been successfully landed in my lakehouse table. And that’s it. I hope you’ve found this video useful, and if you need any assistance at all with Fabric and all things Azure, please reach out to us today.