Hi, my name is Kevin Webb and I'm a Solutions Architect in the Data and Analytics practice here at 3Cloud. In this video, we're going to take a look at the Real-Time Analytics feature of Microsoft Fabric, using event streaming with KQL. Okay, so let's jump right in.
I've positioned myself here in Microsoft Fabric on the Real-Time Analytics workload, and the first thing we're going to do is create a KQL database that will store our data. We'll give this a name. For our example data today, we're using some sample data from Microsoft around New York City taxi trips. The database takes a minute or so to provision, and once it's done, we get a screen that looks like this.
Our database is created, and one additional thing we have to do is make this database available in OneLake. We do that by flipping this switch here, and we're all set with our database. The next thing we need to do is create an event stream, so we're going to set the database aside for a second, switch over to this screen here, and create an event stream. I'll give this a name and click Create, and that will take a few minutes to fully provision as well. Now that that's complete, we're presented with our event stream design surface, and the first thing we need to do is create a source. Our source for this is going to be sample data from Microsoft around the taxi trips. You can see we have some other options here, such as Azure Event Hubs, which is actually something I covered in one of my other videos, so feel free to check that out as well.
So we're going to choose sample data, give this a name, and use the Yellow Taxi sample data, and we can see it immediately starts streaming. If we click on the data preview, we can see data is coming in already. This is part of the service, some of the sample data that's provided with Fabric.
Now we're going to turn our attention to the destination, and in this scenario we're going to choose KQL database and point it at the KQL database that we've already created. We'll call this destination taxi trips DB, and I'm going to choose the workspace where I created that KQL database; that's in web demo. Once I choose that, I'll see any of the KQL databases that I have in that workspace, so I'll choose ours and click Create and configure. Since I don't have any tables yet in my KQL database, we're going to ingest this data into a new table.
We're going to give this a table name, then click on the source and review the defaults here, such as the data connection name. If we have some naming standards or something like that, we can change it, but otherwise I'm going to leave it alone. The first thing we notice when some preview data starts flowing in is that it defaults to a text format, but we can clearly see this data is in JSON format, so we're going to fix the data format first. We'll set it to JSON, and we can see that it's formatted correctly now, and we can do things like clean up some data types if we need to. Let's say we wanted to change VendorID from a long to an int; we can do that here. We'll click on the summary, and then we can see that we're immediately ingesting into our database. If we click on the destination, click the data preview here, and then refresh, we'll be able to see the data that's coming into the KQL database now.
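Behind the scenes, the table and JSON-to-column mapping that the UI configures here correspond roughly to Kusto management commands like the following. This is a hedged sketch, not what the demo actually runs; the table name TaxiTrips, the mapping name, and the column list are illustrative assumptions based on the NYC yellow taxi schema.

```kusto
// Illustrative sketch only: create a table and a JSON ingestion mapping.
// Table, mapping, and column names are assumptions, not taken from the demo.
.create table TaxiTrips (VendorID: int, tpep_pickup_datetime: datetime, trip_distance: real)

// Map incoming JSON fields to columns, casting VendorID to int as in the UI step.
.create table TaxiTrips ingestion json mapping "TaxiTripsMapping" '[{"column":"VendorID","path":"$.VendorID","datatype":"int"},{"column":"tpep_pickup_datetime","path":"$.tpep_pickup_datetime","datatype":"datetime"},{"column":"trip_distance","path":"$.trip_distance","datatype":"real"}]'
```

The data type change made in the wizard (long to int) is the same idea as the `"datatype"` override in the mapping above.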
Now that we have our event stream set up, we're going to go back to our KQL database and refresh, and we can see we've got our table here now. One of the things we can do is start taking a look at some of our data. We can choose Query table, and we get some suggestions for sample queries we can run. We can hit this "Show any 100 rows" and see that we definitely have data in our table, so we've verified that our streaming is working.
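The "Show any 100 rows" suggestion generates a simple KQL query along these lines, assuming the table was named TaxiTrips during ingestion:

```kusto
// Return an arbitrary 100 rows from the table (table name is an assumption).
TaxiTrips
| take 100
```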
We've got data in our table now, so let's take a quick look at how many rows we have at the moment. We can show the total count of rows, and we can see we've already got about 103,000 rows streamed into the table. Now we're going to take a look at how to create something called a query set in our KQL database. We do that by navigating back to the home page for the database and clicking New related item > KQL Queryset. Give it a quick name, and then we're put into an editor. I'm going to blank out the template and paste in some code that I already had prepared.
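The total row count check is a one-liner in KQL, again assuming the illustrative TaxiTrips table name:

```kusto
// Count all rows currently in the table; the count grows as the stream ingests.
TaxiTrips
| count
```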
We can have multiple statements in here, so we can click on one query, hit Run, and it will run just that statement. That one quickly does a summarization of our trip count by location ID. We can also use this to create a chart based on our data and look at it as a graph or a table, so it's a pretty flexible environment for doing some data analysis. We're going to finish up by showing how easy it is to take the results of a KQL query and create a Power BI report out of it. I'm going to run the query that I have prepared here, and we can see what it's returning: our trip count by location. We're going to build a report off of this, so I'm going to click Build Power BI report while I'm positioned here, and that brings up a preview report for me so I can build it. I'm going to use a map visual, and I'm going to pick some columns to put into my report.
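The summarization described above would look something like this in KQL. This is a sketch of the kind of query the demo pastes in, not the exact prepared code; TaxiTrips and PULocationID (the pickup location ID column in the NYC yellow taxi schema) are assumptions.

```kusto
// Sketch: count trips per pickup location and render the result as a chart.
// Table and column names are assumed from the NYC yellow taxi schema.
TaxiTrips
| summarize TripCount = count() by PULocationID
| order by TripCount desc
| render columnchart
```

Dropping the `render` line returns the same results as a plain table, which is what feeds the Power BI report in the next step.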
We'll put the count in there for the bubble size, and we'll fix latitude and longitude because we don't want them to be summarized. Once we do that, we can start to see our report develop. I'm also going to throw a stacked bar chart on there, and we'll cross-reference the counts against the borough, and there's our report, pretty quickly put together. I'm going to go ahead and save this in the workspace that I'm working in.
So now I've got a report; I'll go into my workspace and navigate to it, and there we have our full-blown report. That's it for the demo. I hope you found this useful, and if you have any questions on Microsoft Fabric or anything at all with Azure, please contact us. We'd love to hear from you, and thanks for listening.