Setting up Power BI Version Control with Azure DevOps
In this blog post I will show you how to set up version control for Power BI semantic models (and reports) using the PBIP (Power BI Project) format, Azure DevOps (Azure Repos), and VS Code. This approach treats your semantic model as readable text files (JSON/TMDL), enabling proper Git diffing, branching, merging, and collaboration, something binary .pbix files don't support well. Prerequisites Power BI…
How to create a Power BI Semantic Model online (No need for PBI Desktop)
This feature has been in the service for quite a while, so I thought I would blog about how you can create a Power BI semantic model using just the web interface. This means you no longer need Power BI Desktop, or a Windows PC, to get going. This is quite a significant change because at times…
Restore Microsoft Fabric Items from Backup using Python & Fabric CLI
In my previous blog post I showed you how to back up your Microsoft Fabric Items: Backing Up Your Microsoft Fabric Workspace: A Notebook-Driven Approach to Disaster Recovery – FourMoo | Microsoft Fabric | Power BI. The next natural question is what happens when you want to restore one of the items that were previously backed up. In the steps…
Microsoft Fabric Lakehouse vs Warehouse: 1 Billion Rows, Which Will Be Faster?
My previous blog post got a lot of comments and suggestions, which is great. You can view it here for reference: Microsoft Fabric: Why Warehouse Beats Lakehouse by 233% in Speed and 278% in Capacity Savings – FourMoo | Microsoft Fabric | Power BI. I learnt a lot, and based on the feedback, people asked me to compare the…
Microsoft Fabric: Why Warehouse Beats Lakehouse by 233% in Speed and 278% in Capacity Savings
After my previous blog post on the different semantic model options, and while working with a Fabric customer at the same time, it got me thinking: which is faster, and which consumes less capacity, when ingesting data into Power BI, via the SQL endpoint on a Lakehouse or via a query from the Warehouse? Below you will find the information which…
How Much of Your Fabric Capacity Is Really Being Eaten by Background Jobs? (The 24-Hour Smoothing Trick Explained)
I was recently working with a customer, and one of the questions they had was: "We are going to be running an ingestion process, and we want to know how much Fabric capacity it will consume." The challenge with this question is that in Fabric, a background operation's capacity usage gets smoothed over 24 hours. For example, when looking at the Capacity…
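The smoothing behaviour described above can be sketched with some simple arithmetic. This is a minimal illustration with made-up numbers (the CU-seconds figure and the F64 SKU are assumptions for the example, not values from the post): a background job's total capacity units are spread evenly across the 86,400 seconds of a day, which is why a heavy job shows up as a small, flat utilisation percentage.

```python
# Illustrative sketch of Fabric's 24-hour background smoothing.
# Numbers below are hypothetical examples, not measurements.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

def smoothed_cu_per_second(total_cu_seconds: float) -> float:
    """Spread a background job's total CU-seconds evenly over 24 hours."""
    return total_cu_seconds / SECONDS_PER_DAY

def capacity_utilisation_pct(total_cu_seconds: float, capacity_cus: float) -> float:
    """Percentage of the capacity's CUs the smoothed job occupies."""
    return 100 * smoothed_cu_per_second(total_cu_seconds) / capacity_cus

# A job that consumed 172,800 CU-seconds smooths to 2 CU/s,
# i.e. 3.125% of a 64-CU capacity (an F64) for the next 24 hours.
print(smoothed_cu_per_second(172_800))        # 2.0
print(capacity_utilisation_pct(172_800, 64))  # 3.125
```

This is why a short, intense ingestion run barely moves the utilisation chart: its cost is paid down in small, even instalments over the following day.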
Direct Lake or Import: Which Semantic Model Has the Best Performance?
In this blog post I am going to show you how I completed the automated testing, and then the results, where I compare Direct Lake, Import and DirectQuery to see which one appears to be the best. As always, your testing may vary or be different to my tests below. I would highly recommend that you use the…
Comparing Microsoft Direct Lake vs Import – Which Semantic Model Performs Best?
I was recently part of a discussion (one I have heard multiple times) about which semantic model to use in Microsoft Fabric. That discussion was the source for this blog post, where I am going to compare Microsoft Direct Lake (DL) to an Import semantic model. The goal is to first explain how I set up and configured the…
Backing Up Your Microsoft Fabric Workspace: A Notebook-Driven Approach to Disaster Recovery
In the high-stakes world of data architecture, where downtime can cascade into real business disruptions, I’ve learned that even the most robust platforms have their blind spots. Just last month, while collaborating with a client’s Architecture team on their disaster recovery strategy, we uncovered a subtle but critical gap in Microsoft Fabric: while OneLake thoughtfully mirrors data across multiple regions…
How to Access a Former Employee’s Power BI “My Workspace” and Recover Reports
One of the common challenges I’ve seen in organizations is when a team member leaves and their Power BI reports are stored in their personal My Workspace. These reports often contain valuable datasets and dashboards that are still in use or need to be maintained. So, how do you access and recover these reports? In this blog post, I’ll walk…