This Is How 1 Person Can Build A Power BI Setup That Runs On Autopilot
Save Time, Improve Quality & Empower Business
You don’t need an army of analytics professionals to run Power BI.
With the correct setup, you can compound the analytics knowledge in your organization and empower business users to solve their own problems without compromising security or quality. All while keeping the setup clean and controlled. In the best case, it takes only one person to build the whole setup.
The key to everything is reusability. In a nutshell, everything runs on top of a reusable golden data model and the report creation process is outsourced to skilled business users with the help of best practice templates and other helpful support materials.
If you can check all the boxes described in this article, your analytics platform will be mostly running on autopilot. I know because mine is.
Let’s dive in.
Build the solution on top of a reusable golden data model
The foundation is to build the whole setup on top of one or more golden data models 🥇.
A golden data model is a centralized semantic model that contains all the core data the business might need to solve a problem. The point is to reuse the same model as the source for all the reports you create, using a live connection.
But why is this a good idea?
Once you’ve set up a high-quality golden data model, maintaining it mostly means adding a new measure or column every now and then. You get back all the time that would otherwise go into building half-baked data models that only support a single report.
“If you can build a Pivot table in Excel, you’re skilled enough to use a golden data model to build a report”
Using a golden model also opens the door for a new layer of business users to take advantage of data. There’s no need for them to know SQL, Power Query, DAX or any of that stuff. Instead, business users can simply explore a well-organized & documented measure folder and drag and drop prebuilt measures onto the canvas to build visualizations.
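To make this concrete, here is a minimal sketch of what a couple of prebuilt measures might look like behind the scenes (all table, column and measure names here are hypothetical):

-- Hypothetical prebuilt measures in the golden data model.
-- Report creators never see this DAX; they only see the friendly
-- measure names, organized into display folders in the model's
-- properties, and drag them straight onto the canvas.
Total Sales = SUM ( 'Sales'[Sales Amount] )

Sales per Customer =
DIVIDE ( [Total Sales], DISTINCTCOUNT ( 'Sales'[Customer ID] ) )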
A golden data model usually contains a lot of tables, which leads us to the next point.
Use dataflows as source for the golden data model
No matter which source you use, the data probably needs some transformations before it’s ready to be used.
But don’t do any Power Query transformations in the golden data model itself. Because you have a lot of tables & intermediate tables, doing so would make the refresh of the final model slow and complex.
We don’t like slow and complex 😵💫.
Instead, push all transformations to dataflows in the Power BI service. A dataflow is nothing but a reusable table you can use as a data source instead of connecting directly to, for example, a database. For maximum simplicity, I like to have just one table per dataflow. In doing this, you get more control over scheduled refresh settings and better visibility into refresh statistics for individual tables.
I’ve found this is the best way to make sure each source table arrives in exactly the shape the final golden data model needs, with no further transformation required.
Create 2 templates for report creation
If you want to make high-quality report creation a breeze for business users, you need 2 pbix templates.
The first one is an empty report template file where all possible recurring housekeeping tasks are already taken care of 🧹. Here are a few examples:
Already live connected to the golden data model
Your organization’s color theme applied
Typical filters added to filter pane
Default background image applied
Report settings configured
Model view layouts created
Report creation best practice checklist in the first report page
I also recommend helping users by creating a second template: a visualization examples file. This guarantees report creators never have to start from a blank page again.
The visualization examples file should contain best practice examples of how you might use the golden data model to build visualizations, which is especially useful for new users. Simply search for a visualization that might fit your requirement, copy-paste that visual into the empty report template, and adjust as needed.
This also helps a lot in making sure formatting best practices are followed, all while saving a lot of time for everybody.
Add professional touch by using a background image
Consistency is king 👑 when it comes to the visual look of reports.
One key element of this is the report background image. In the previous part I already mentioned applying a default background image to the empty report template, but obviously different reports require different background layouts.
For this reason you should create a reusable report background file. I do it in PowerPoint. In the first slide, include a simple checklist of how to use the file (Ctrl + A to select the layout, right-click to save as SVG, steps to apply the background in Power BI Desktop). All the following slides should contain different examples of best practice layouts.
Using this simple file guarantees a professional look with consistent spacing in all reports, all while making sure the colors of the background image match the color theme of the empty report template.
Consolidate all golden data model DAX measures in one file
Sooner or later users will bump into the need to extend the golden data model with their own measures.
The only problem is they don’t know DAX. And they cannot see the DAX behind an existing measure when using a pbix that is live connected to a semantic model (hopefully this changes at some point in the future).
That’s a problem 🤔.
You could add the DAX formula to the description box in the model view, but I prefer to reserve that space for a clear business description of what the measure does.
Another alternative is to maintain a central file that contains all of the measures. Because business people don’t use GitHub, at least not on the planet where I live, a simple txt file will work just fine. Not optimal, but it gets the job done.
To use the txt file, simply search for a measure that you know does something similar to what you want to achieve. Then just copy-paste the DAX and adjust as needed. For example, a user might not know how to use the DATEADD function to create a previous-year version of a measure, but copying a DATEADD template from the txt file takes about 5 seconds, and a five-year-old can understand how to adjust it to make it work.
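Here is a minimal sketch of what a template entry in the txt file might look like (measure, table and column names are hypothetical, and DATEADD assumes the model has a proper contiguous date table):

-- Base measure:
Total Sales = SUM ( 'Sales'[Sales Amount] )

-- Previous-year template: swap [Total Sales] for any base measure
-- and point DATEADD at your own date column.
Total Sales PY =
CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )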
But where do you store this file?
Run the process from a centralized hub
All of the files and templates above are useless if people don’t know where to find them and how to use them 😭.
So to make it all come together, you need a place where all of these files can be found, and you need to make sure the correct people have easy access to them.
Since you’re reading an article about Power BI, you probably have access to Microsoft Teams. A great option is to set up a Team for all the users who are creating reports. Even if you don’t use Teams, you probably have a clear idea of what works best in your organization.
But the support files are not the only thing that makes the platform run smoothly. Here’s what I think the central hub should contain:
Support files — empty report template, visualization examples file, report background generator and the DAX measures txt file
Instructions — how to use the support files & publish reports.
FAQ — answers to frequently arising questions
Improvement requests — a process for requesting new tables, columns, or measures to be added to the golden data model.
Created reports — to share great report templates between users.
Concerned about saving all reports in a central location? 🙀
Since the golden data model contains correctly set up security rules and all reports are live connected, there is no risk of people seeing data they are not supposed to. At the same time, anyone can copy the structure of a report, since everybody is using the same data source. On top of that, the file size of live connected pbix files is next to nothing. Magic 🪄.
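As a rough sketch of what such a security rule can look like (table and column names are hypothetical), a row-level security role in the golden data model might filter a dimension table with a single DAX expression:

-- Each user only sees the rows tied to their own sign-in account.
'Employee'[Email] = USERPRINCIPALNAME()

Because every report is live connected to the same model, one rule like this protects all of them at once.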
But how do you set up the workspaces and sharing?
Design optimal workspace & app structure
One of the biggest mistakes you could make is creating workspace and report chaos.
For that reason, I recommend a setup that is as simple as possible. Less is definitely more when it comes to workspace management. Here’s the workspace structure I use:
Golden data model(s) — Only the admin(s) of the data model should have access to this workspace. Do NOT give access to business report creators ⛔️.
Dataflows — Same as above, only the admin(s) of the data model have access to this workspace.
Test — You’ll want to run some tests, so have a designated workspace for that. It’s up to you whether you also want to give some business users access to this workspace.
Shared Reports — This workspace contains all the reports shared with end users. Report creators should have member access to this workspace.
As you can see, the access setup is rather restrictive.
Only the admin(s) of the solution will have access to the golden data model and dataflow workspaces. While report creators won’t have access to those workspaces, they still have build access to the data model. For end users it’s enough to have just read access through a proper security role.
Report creators will only have access to the Shared Reports (and Test) workspaces, while end users don’t have access to any workspace. They don’t need it, since in this setup all report sharing happens by creating an App out of the Shared Reports workspace.
Having all reports in this single workspace enables the effective use of dashboards later, when you want an overview spanning multiple reports. If possible, give access to App audiences using, for example, Active Directory groups in order to avoid admin work every time you add new end users.
Hope this article managed to give you a clear overview of a tried and tested Power BI setup.
If you’re interested to know more details on any of the parts, leave a comment and I’ll be sure to write about it in a future post.
Until next time!