In the world of modern technology, we can all understand the need for and importance of data in an organization. Data drives everything in a company: payroll, production orders, sales and marketing key performance indicators (KPIs), customer contact information, and much more.
The Challenge: Balancing Data Security with Quick Access to Data Sources
It’s because data is so important that most organizations tightly restrict access to it, and usually for good reason: keeping personal, proprietary, or otherwise sensitive data safe; maintaining compliance with regulations such as GDPR, CCPA, or HIPAA; or minimizing the chance of incorrect or redundant data being introduced into the environment. Almost every company limits direct access to its data to the members of its data storage/warehouse teams. While this is sound in principle, it can create pain points for other teams – and for none more so than the Business Intelligence (BI) team.
How does the BI team retrieve, manipulate, extend, analyze, and even delete data without compromising any of the security put in place by the warehouse team?
The BI team works closely with the data storage/warehouse teams and often has reader access to at least some of the organization’s data. That means they can see the data but not manipulate it – and therefore pose no risk of altering it. This becomes problematic for modern BI teams because they often need less restrictive access to data sources in order to create impactful reports and analyses. BI teams are problem solvers who run what-if scenarios, gather feedback or supplementary information to combine with internal data sources, and process the entirety of this corpus to deliver relevant insights while eliminating outliers. As noted above, data is everything to an organization, and the costs of shoddy data can be high. So how does a company empower its BI team without giving away the keys to the kingdom?
The Solution: A Dedicated BI Data Storage and ETL Environment
This is where Power BI and Power Automate come in. Both tools are part of the Microsoft Power Platform, and while each is powerful on its own, they are designed to work together. Although Power BI and Power Automate have great features in their desktop tools, for the remainder of this article we’ll assume that the company has procured a Premium license for both, which lets a BI team take full advantage of each tool’s cloud capabilities.
To start, let’s restate the main issue one more time: how does the BI team retrieve, manipulate, extend, analyze, and even delete data without compromising any of the security put in place by the warehouse team? The answer, simply stated, is by taking the data out of the corporate data structure and creating a separate BI data storage and ETL (Extract-Transform-Load) environment. This way, the BI team can host data that is refreshed and kept concurrent with the organizational data, yet manipulate it without fear of altering the “source of truth” that lives in the company’s data warehouse. In this solution, Power BI Datasets act as the data storage, Power Automate flows act as the Extract and Load processes, and the two together allow for secure Transformation of the data.
Starting with data storage: Power BI Datasets are highly compressed data models that follow relational structures, just like those in a data warehouse. They can be created in Power BI Desktop, or with tabular tools such as SQL Server Data Tools (SSDT) in Visual Studio or Tabular Editor. These Datasets read from a company’s data sources and extract the data into a new location, creating a compressed copy without altering or compromising the source. They can then be deployed to a Power BI Premium capacity in your organization’s tenant, where they sit as highly compressed Analysis Services models that users across the organization can access from the Premium workspace. This creates a basic data storage and sharing solution for a BI team.
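To make the “Analysis Services model” point concrete: Premium workspaces expose an XMLA endpoint with a predictable URL pattern, which is what lets XMLA-aware clients (SQL Server Management Studio, Tabular Editor, and so on) connect to deployed Datasets. The minimal sketch below just builds that connection URL; the workspace name is a hypothetical placeholder.

```python
def xmla_endpoint(workspace_name: str) -> str:
    """Return the XMLA connection URL for a Power BI Premium workspace.

    A Premium capacity exposes each workspace as an Analysis Services
    server, so any XMLA-aware client can read the Datasets inside it.
    """
    return f"powerbi://api.powerbi.com/v1.0/myorg/{workspace_name}"

# Example (workspace name is hypothetical):
#   xmla_endpoint("BI Team Workspace")
```

Pointing a tabular tool at this URL, authenticated with an organizational account, is how a BI team reads the deployed models without ever touching the warehouse itself.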
Extract and Load:
Now, how do we keep our data up to date? Power BI Premium offers a scheduled refresh capability that can run multiple times per day or week on a timer. Many business cases, however, require more granular control over refreshes. For these, you can build Power Automate flows that act on your Power BI models. Power Automate has a plethora of connectors, triggers, and capabilities for acting on your data: flows can be started by data-driven alerts, advanced time schedules, or manual triggers. Power Automate can even listen for updates in your data sources and kick off refreshes whenever it notices changes. This is very powerful for keeping your Power BI Datasets current without over-querying your source data or burning capacity on constantly refreshing stale data. Power Automate thus covers the Extract and Load parts of the ETL solution referenced above.
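Under the hood, this kind of on-demand refresh goes through the Power BI REST API’s refreshes endpoint, which is also available to teams scripting outside the Power Automate designer. The stdlib-only sketch below builds (but does not send) the POST request; the dataset ID and access token are hypothetical placeholders, and an Azure AD token would be acquired separately.

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(dataset_id: str, token: str) -> urllib.request.Request:
    """Build (but do not send) the POST that queues an on-demand
    refresh of a Power BI Dataset via the REST API."""
    return urllib.request.Request(
        f"{API_ROOT}/datasets/{dataset_id}/refreshes",
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",  # Azure AD access token
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` queues the refresh; the service responds 202 Accepted when the refresh has been accepted into the queue.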
Last, we have the Transformation of the data. For this, it’s best to use Power BI and Power Automate together. Power BI can perform great data transformation through its Power Query capabilities, or through calculated columns and measures within your Datasets. These are great ways to systematically update the data inside the Power BI Datasets. Power BI, however, offers little for collecting or gathering new data. Power Automate, on the other hand, can add to and delete from Power BI Datasets based on much more than hard calculations. It can send surveys to clients and team members and collect their feedback; it can check external data sources on the internet, in partner applications, or even on social media and write the results back to the Datasets; and it can perform conditional analysis that removes irrelevant or redundant data based on its findings. This means the BI team can now generate new data for their Datasets that was never in the source data.
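As a hedged illustration of the write-back idea: push-style Datasets expose REST endpoints for adding rows to, and clearing rows from, a table. The stdlib-only helpers below build those requests without sending them; the dataset ID, table name, rows, and token are hypothetical placeholders.

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_add_rows_request(dataset_id, table, rows, token):
    """Build the POST that appends new rows (e.g. survey feedback)
    to a table in a push Dataset."""
    return urllib.request.Request(
        f"{API_ROOT}/datasets/{dataset_id}/tables/{table}/rows",
        data=json.dumps({"rows": rows}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def build_clear_rows_request(dataset_id, table, token):
    """Build the DELETE that clears all rows from the table, e.g.
    after conditional analysis flags the data as stale or redundant."""
    return urllib.request.Request(
        f"{API_ROOT}/datasets/{dataset_id}/tables/{table}/rows",
        headers={"Authorization": f"Bearer {token}"},
        method="DELETE",
    )
```

A Power Automate flow wires the same add/delete semantics together with its built-in Power BI connector, so no hand-written HTTP is required in practice; the sketch simply shows what the connector is doing on the team’s behalf.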
With Power BI and Power Automate working in tandem, BI teams can maintain, manipulate, and update their own data without needing hands-on assistance from the data storage/warehouse teams: a secure copy of organizational data, their own ETL processes, and supplementary data that isn’t captured in the source. The result is that even if an organization chooses to keep its data tightly secured, that security won’t hinder or halt today’s BI teams from excelling at any task given to them.
BlueGranite has strong experience deploying a mix of both Power BI and Power Automate in organizations of all sizes. If you’re wondering how Microsoft’s Power Platform could fit into your modern analytics journey, contact us today!