Azure Every Day

How to Import Power BI Datasets

I’m here to talk about Power BI datasets. Currently, when you connect to a Power BI dataset in the Power BI Service, it will be in live connect mode, which limits what you can do with that data model. I’d like to tell you about leveraging existing Power BI datasets in a way that is not supported out of the box at this time.

In that live connect mode, you’ll be able to add additional measures, but you can’t add calculated columns or external data sources like when using a traditional Analysis Services model. Below is a workaround that you can use to meet ad hoc reporting needs, where you can leverage existing data models for users that do not have edit permissions to the underlying dataset itself or to the data source.

  • The first step in leveraging an existing Power BI dataset is to pull it into Excel. This allows us to flatten out the data, de-normalize it within an Excel Pivot Table and use that table as a data source for Power BI.
  • If we want to pull that data back into Power BI and add additional data, we start by choosing Analyze in Excel in the Power BI Service, which downloads a connected Excel file for us.
  • Next, we can open our data connected Excel file. As I open this, it will be live connected to our data model. I can then generate a Pivot Table to flatten out the data that I’m interested in. In my demo, I bring in the total sales and a couple dimensions from the sales territory like sales territory country and sales territory group.
  • I also want to restrict this by a period of time, so I’ll bring in the calendar year. For this dataset, the last year of data is 2008 but that is only a partial year. I’ll bring in all of 2007 and I need to be sure I bring in enough granularity so I can see that 2008 is only a partial year. I also bring in the month.
  • At this point I have a dataset that is de-normalized but not Power BI friendly yet. I will need to flatten it out into a simple table.
  • To flatten it, I’ll go into the Pivot Table design and turn off my subtotals first and then my grand totals. Under Report Layout, I click Show in Tabular Form and then Repeat All Item Labels.
  • Now, I have a table that is Power BI friendly. This way I don’t have to do a lot of manipulation to work with it.
  • The final step, back in Power BI Desktop, is to connect to and import the Pivot Table I created by using the Excel connector. By pointing to the file I just saved, I’m able to access the Pivot Table that is connected to our Power BI dataset.
  • The way that I formatted the Pivot Table allows us to see the data in a nice, clean table. But keep in mind that as I’m loading this data, it is not pulling in a fresh copy of information from the Power BI Service directly. Instead, it’s pulling in the static copy that exists in that Power BI Pivot Table.
  • So, I can’t refresh the data from here. If I want to get an updated set of data from the Power BI Service, I will have to go back to the Excel Pivot Table that is connected to the Power BI dataset and refresh it from there. Once I hit refresh, it will bring in an updated copy of that information via the Excel Pivot Table.
  • However, the main reason I’ve done this is to do some ad hoc data modeling and reporting off that existing Power BI dataset. I can also import additional data elements from SQL Server or bring in additional Excel Pivot Tables and join that data to this ad hoc model to meet my immediate reporting needs.
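To see what Repeat All Item Labels buys us, here is a minimal Python sketch (the data and field names are made up for illustration) of turning a grouped, pivot-style hierarchy into the one-row-per-combination table that Power BI expects:

```python
def flatten_pivot(nested):
    """Flatten {country: {group: {year: sales}}} into flat rows,
    repeating the outer labels on every row, the way Excel's
    tabular layout with 'Repeat All Item Labels' does."""
    rows = []
    for country, groups in nested.items():
        for group, years in groups.items():
            for year, sales in years.items():
                rows.append({"Country": country, "Group": group,
                             "Year": year, "TotalSales": sales})
    return rows

# Toy pivot-style data, not real AdventureWorks numbers.
pivot = {
    "United States": {"North America": {2007: 100, 2008: 40}},
    "France": {"Europe": {2007: 80}},
}
flat = flatten_pivot(pivot)
```

Each flat row carries every dimension label, so no further shaping is needed when the table lands in Power BI.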

This simple workaround gives you some flexibility in leveraging your existing Power BI datasets. In the future, I believe this type of workaround will be unnecessary, as we’ll be able to import existing Power BI datasets and add other elements to them natively, out of the box. But until then, this workaround will allow you to get more out of what you have already built.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected].

By Jeremy Black

A Success Story Using Auto ML in Power BI Premium

Azure Machine Learning and Power BI are two of the most influential tools out there. I’d like to share a story about how a colleague and I recently completed a project for a public school district in Georgia using these powerful tools.

The purpose of this project was to integrate machine learning and predictive analytics: to take inputs that existed in their data about students, such as classes taken, attendance, and grades, and use this information to predict the likelihood of high school graduation. The district could then use this to proactively guide students and give them the help they needed.

We used the Auto ML feature in Power BI Premium. My colleague, Brian Custer, oversaw the machine learning model design. Auto ML is a straightforward process in Power BI. All that was needed was to assemble a dataset from the information we wanted to examine that might have an impact on graduation, and then run it through the Auto ML wizard. Once we did that, out came a model that was trained and ready to go for prediction.

Machine learning models exist in several different forms and leverage the capabilities that are built into Azure – we have Azure Machine Learning, Python and R, integrations with SQL Server, as well as Auto ML and other features within Power BI.

So, what exactly is Auto ML in Power BI? It’s a wizard-driven machine learning interface that works off dataflows. You start with a dataflow, which must contain your training and prediction entities. This is passed to an Auto ML model that you choose, and then it automatically trains the model for you and delivers predictions.

This is a very simple pipeline to set up. In our case we simply hooked up to the data that was in Azure Data Lake Gen2 storage and brought it into some SQL tables.
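As a very loose illustration of what the wizard does behind the scenes (training several candidate models and keeping the one that scores best on held-out data), here is a toy sketch in plain Python; the feature, labels, and thresholds are all invented:

```python
def pick_best_model(candidates, validation):
    """Score each candidate model on held-out data and keep the
    winner -- a very loose sketch of the model selection Auto ML
    performs behind the wizard."""
    def accuracy(model):
        return sum(model(x) == y for x, y in validation) / len(validation)
    best = max(candidates, key=accuracy)
    return best, accuracy(best)

# Toy feature: attendance rate; label: graduated (True/False).
validation = [(0.90, True), (0.55, False), (0.80, True)]

# Candidate "models": simple attendance thresholds, already "trained".
candidates = [lambda x, t=t: x >= t for t in (0.5, 0.7, 0.9)]
best, score = pick_best_model(candidates, validation)
```

The real feature evaluates many model types and hyperparameters, but the select-the-best-on-validation-data idea is the same.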

The school district we created this for couldn’t have been happier with the results. They are currently using this machine learning model to help predict the probability of high school graduation and to ensure that students at risk of not graduating get the help they need to succeed.


By Paul Turley

A Look at Azure Synapse Studio (in Preview)

Azure Synapse, formerly SQL DW, is an analytics service that gives you the ability to query data on your terms and at scale. It brings together enterprise data warehousing and big data analytics. Azure Synapse brings together serverless on-demand or provisioned resources with a unified experience so you can ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs.

In this post I’ll show you a new Azure Synapse feature in preview called Azure Synapse Studio that I’m excited about. The Studio is a one-stop shop for working with data of any size, but particularly big data. You can ingest data, you can explore and analyze data that you already have in the workspace, as well as visualize data using Power BI.

In my video demo included in this post, I’ll walk you through Azure Synapse Studio and show you what you can do with it. On the Home page, you can click the New button and see everything you can do: create a new SQL script, which can be executed against your SQL on-demand pool or another SQL database connected to your Studio workspace; create a Jupyter notebook, which you can run in either a SQL or Spark context; or create dataflows, pipelines, and Power BI reports.

In my demo I’ll dig into the Studio and show you how to:

  • create a notebook, ingest data, and explore your data by working in either a notebook or a pipeline;
  • connect to external data such as an Azure Cosmos DB or Azure Data Lake Storage Gen2 instance;
  • create a new database with the Manage feature, in which you can create a new SQL pool or an Apache Spark pool;
  • create Linked Services to connect to your Data Factory pipelines and instances to work within your databases.

There are so many great things you can do with the Azure Synapse Studio and I think it will be very helpful to many people from data engineers to data scientists and business analysts, allowing them to work together in a single workspace without having to flip back and forth between various tools. I highly suggest you sign up for this preview feature and give it a try.



By Brian Custer

How to Add Menus to Screens in Your Power App

Power Apps is an incredible Microsoft tool with which citizen developers can quickly build their own custom business applications. If you’re just starting out with Power Apps, I’d like to share some knowledge about adding Power Apps navigation menus in order to help you make the best use of available real estate on the screen when developing a Power App.

In my video included at the end of this post, I’ll demo using the on-demand navigation drop-down menu; be sure to watch it for a more detailed view.

  • I’ve created an app to showcase my child’s artwork and I want to separate the art by medium (clay, paint, pencil), so I’ll have multiple screens.
  • On each screen, I want to add a menu. There are 4 components on this screen that make up these menus:
    • The Hamburger Icon controls the visual properties of the other items.
    • Arrow icons act as links to the other screens.
    • Labels to indicate what the links are for.
    • The rectangle icon acts as a background and ties it all together.
  • You access these components in the Icons tab/dropdown menu at the top. I’ll begin by adding a Hamburger Icon by clicking on Hamburger Icon from the dropdown menu.
  • Once we add it and see it on our screen, we can change the properties of our Hamburger, like position and size, in the Icon menu on the right side of the screen. We can also change the color and the hover fill, so we get the highlight when we hover over the icon.
  • We can also add a rectangle icon which acts as our menu on the page and can be used to hold the links to the other pages. We may not always want this icon/menu to show and we can control that with a variable. Check out my demo for more detail on this.
  • To add the links to the other screens, we first want to click on the Label tab to add a label. The first label I want is a link to the main screen. Use the Label menu on the right side to set properties such as the text (I entered “Main” in my demo to indicate that this link goes back to the main page) and to change the position and size. For this label’s link, I want enough room next to my text to show an arrow icon on the left. I do this with the Padding property: I set padding on the left of the text, which moves my text to the center and leaves room to add an arrow icon.
  • We need both the text and the icon, because the label will not change your mouse cursor when you hover over it, but the icon will. The text simply tells the user where the arrow will take them.
  • We want the mouse not only to change into a pointing finger when we hover, but we also want the link to be highlighted, so that as we move through the links on the menu each one is highlighted. We do this by using the hover fill for the arrow icon.
  • To make this link go to the main screen, we change the OnSelect property for that arrow icon; we’ll need to change the code to navigate to the main screen. Also, when we leave that rectangle icon screen (or pencil menu), we want to tidy up by hiding that menu. We do this by updating the context of our pencil menu and changing its Visible property. (See the code in my demo.)
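The code referenced in that last step is a short Power Fx formula on the arrow icon’s OnSelect property. A minimal sketch, assuming a target screen named MainScreen and a menu-visibility context variable named ShowPencilMenu (both names are mine, not from the demo):

```
// OnSelect for the arrow icon: hide the pencil menu, then navigate
UpdateContext({ShowPencilMenu: false});
Navigate(MainScreen, ScreenTransition.Fade)
```

The rectangle icon’s Visible property would then be set to ShowPencilMenu, and the hamburger icon’s OnSelect would toggle it with UpdateContext({ShowPencilMenu: !ShowPencilMenu}).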

I hope this post was helpful in learning how to add menus to multiple screens in your app, as well as how to hide and show them to maximize the real estate on your screen.




By Jim Benton

Testing Row-Level Security in the Power BI Service

In my last post, I showed how to test row-level security in Power BI Desktop. In this follow up, you’ll learn how to test row-level security in the Power BI Service once you’ve published your file there. Also, please check out my demo at the end of this post.

  • After navigating to the workspace where I published my file, I need to find the dataset and select Security from the options menu, as row-level security resides at the dataset level.
  • On the next screen, you should see the role that was created in Power BI Desktop, along with a field to enter your role membership, in other words, people or groups who belong to this role.
  • You will see the functionality to test row-level security only when you hover the mouse over the role and get the ellipsis. Click on this ellipsis and you’ll get the option that says, ‘test as role’.
  • This will take us to the report page, where it appears that row-level security is immediately applied. I know this because I see that Power BI is taking my login credentials and applying them to the report, which matches what I saw when I was testing this in Power BI Desktop.
  • If I want to change the user that I’m testing, I can enter a new email by clicking on the menu at the top.
  • One important thing to clarify:
    • Once a PBI file is published to the Service, there are other considerations that can impact whether row-level security works as expected, such as workspace membership roles, dataset permissions, and dataset security role memberships.
    • For example, if we look at our workspace membership, we see that we have 3 report users with 3 different roles. Row-level security only applies to users in a viewer role, not to members, admins, or contributors.
    • In my example, I am an admin of the report, so if I return to the report page and type in my email, I have access to everything and all the data is returned. This also applies to contributors: if I type in their email, they will get the same result, with access to all the data, just as I did as an admin.
    • If I type in the email of a viewer, I expect to see his information filtered on the page, but instead I get a bunch of errors. This is because row-level security does apply to viewers, and since he does not yet have a role membership, Power BI won’t let him see the data.
    • To give a viewer access, you’ll need to go back to the row-level security page and with the role selected, we can enter the viewer’s email in the Members field, then click add and save.
    • A best practice here is to only use security groups when you’re creating role membership; it’s easier than explicitly entering names and is much easier to maintain.
    • To test if this worked, I simply click on test as role, go back to my report page, and enter the name of the viewer whom I just added; the page should filter to what the viewer can now see.
  • What happens when role testing doesn’t work correctly? Check out my graphic below which gives 3 examples of errors returned and the reasons why role testing may not work.
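The interaction between workspace roles and role membership described above can be summed up in a small, purely illustrative Python sketch. The user emails and the email-based filter are hypothetical; in a real report the role filter would be a DAX rule such as [Email] = USERPRINCIPALNAME():

```python
def visible_rows(user, workspace_role, role_members, rows):
    """Sketch of how workspace roles interact with row-level security:
    Admins, Members, and Contributors bypass RLS entirely, while
    Viewers must belong to a security role and then see only the rows
    the role's filter allows."""
    if workspace_role in {"Admin", "Member", "Contributor"}:
        return rows  # RLS is not applied to these roles
    if user not in role_members:
        raise PermissionError(f"{user} has no role membership")
    # The role filter: each viewer sees only their own rows.
    return [r for r in rows if r["Email"] == user]

# Hypothetical report rows and users, for illustration only.
rows = [{"Email": "alice@contoso.com", "Sales": 100},
        {"Email": "bob@contoso.com", "Sales": 250}]
```

An admin gets every row back unfiltered; a viewer with membership gets a filtered view; a viewer without membership gets an error, which matches the behavior described above.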

I hope this post on testing row-level security in the Power BI Service was helpful. If you have questions or want to know more about row-level security in Power BI, about Power BI in general, or any Azure product or service, let us know – our team of experts is here to help.




By Steve Wise

How to Upload and Query a CSV File in Databricks

Welcome to another post in our Azure Every Day mini-series covering Databricks. Are you just starting out with Databricks and need to learn how to upload a CSV? In this post I’ll show you how to upload and query a file in Databricks. For a more detailed, step by step view, check out my video at the end of the post. Let’s get started!

By Andie Letourneau

How to Merge Data Using Change Data Capture in Databricks

My post today in our Azure Every Day Databricks mini-series is about Databricks Change Data Capture (CDC). A common use case is customers looking to perform CDC from one or many sources into a set of Databricks Delta tables. The goal here is to merge these changes into Databricks Delta.
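In Databricks this merge is typically done with a Delta MERGE INTO statement. As a rough stand-in using only the Python standard library, SQLite’s upsert shows the same merge semantics; the table and column names here are invented for illustration:

```python
import sqlite3

def merge_changes(conn, changes):
    """Merge a batch of CDC rows into a target table: update when the
    key already exists, insert when it doesn't. Delta's MERGE INTO
    behaves similarly, matching source to target on a key column."""
    conn.executemany(
        "INSERT INTO customers (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        changes,
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Ada')")

# One change batch: an update to id 1 and a brand-new id 2.
merge_changes(conn, [(1, "Ada Lovelace"), (2, "Grace")])
```

Existing keys are updated in place and new keys are inserted, which is exactly the when-matched/when-not-matched split a Delta merge performs.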

By Jon Bloom

Databricks and Azure Key Vault

In our ongoing Azure Databricks series within Azure Every Day, I’d like to discuss connecting Databricks to Azure Key Vault. If you’re unfamiliar, Azure Key Vault allows you to maintain and manage secrets, keys, certificates, and other sensitive information, all stored within the Azure infrastructure.

By Jon Bloom

Custom Libraries in Databricks

This week’s Databricks post in our mini-series is focused on adding custom code libraries in Databricks. Databricks comes with many curated libraries added into the runtime, so you don’t have to pull them in yourself. There are installed libraries for Python, R, Java, and Scala, which you can find in the release notes, in the System Environment section.

By Jeff Burns