Restore Microsoft Items from Backup using Python & Fabric CLI
In my previous blog post I showed you how to back up your Microsoft Fabric items: Backing Up Your Microsoft Fabric Workspace: A Notebook-Driven Approach to Disaster Recovery – FourMoo | Microsoft Fabric | Power BI. The next natural question is what happens when you want to restore one of the items that were previously backed up. In the steps…
Microsoft Fabric Lakehouse vs Warehouse: 1 Billion Rows, Which Will Be Faster?
My previous blog post got a lot of comments and suggestions, which is great. You can view it here for reference: Microsoft Fabric: Why Warehouse Beats Lakehouse by 233% in Speed and 278% in Capacity Savings – FourMoo | Microsoft Fabric | Power BI. I learnt a lot, and based on the feedback people asked me to compare the…
Backing Up Your Microsoft Fabric Workspace: A Notebook-Driven Approach to Disaster Recovery
In the high-stakes world of data architecture, where downtime can cascade into real business disruptions, I’ve learned that even the most robust platforms have their blind spots. Just last month, while collaborating with a client’s Architecture team on their disaster recovery strategy, we uncovered a subtle but critical gap in Microsoft Fabric: while OneLake thoughtfully mirrors data across multiple regions…
Running and viewing Automated Load Testing Results – Part 3
This blog post is going to detail how I run the load test and then view the load testing results to determine how the capacity has coped when I increase the number of users, along with demonstrating how I automated the load testing without having to run it manually! Please find below the previous posts in this series in case you missed them. This is the…
Automating Load Testing: Setting Up Your Fabric Lakehouse and Notebooks – Part 2
In today’s blog post I am going to show you how to set up the Lakehouse and Fabric notebooks so that you can configure them to work with the JSON file we created in the previous blog post. Series Details Part 1: Capturing Real Queries with Performance Analyzer Part 2 (This blog post): Setting Up…
Comparing Fabric Capacity Consumption – Notebook vs Warehouse SQL
I saw that there was an update making it possible to use the Microsoft Fabric Warehouse to copy data directly from OneLake into the Warehouse. This got me thinking: which option would consume more capacity to get the data into the Warehouse table, and which one would be faster? To do this I am going to be…
How to use the Tabular Object Model using Semantic Link Labs in a Microsoft Fabric Notebook
In this blog post I am going to show you how to use the powerful Semantic Link Labs library, which wraps the Tabular Object Model (TOM) for semantic model manipulation. The goal of this blog post is to give you an understanding of how to connect using TOM, and then, based on the documentation, use one of its functions. Don’t get me wrong…
Using a Python Notebook to Loop Through a Data Frame and Write Once to a Lakehouse Table
In this blog post I am going to explain how to loop through a data frame to query data and write once to a Lakehouse table. The example I will use is to loop through a list of dates from my date table, query an API for each date, append the results to an existing data frame, and finally write once…
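The pattern that post describes, collecting each date's API result and writing only once at the end, can be sketched as follows. This is a minimal sketch: `fetch_api_data` is a hypothetical stand-in for the real API call, and the single Lakehouse write is shown only as a comment.

```python
import pandas as pd

def fetch_api_data(date):
    # Hypothetical stand-in for the real per-date API call.
    return pd.DataFrame({"date": [date], "value": [len(str(date))]})

# Dates would come from the date table; here a small generated range.
dates = pd.date_range("2024-01-01", periods=3, freq="D")

frames = []
for d in dates:
    frames.append(fetch_api_data(d.date()))

# Concatenate all per-date results and write once,
# instead of writing to the Lakehouse on every iteration.
result = pd.concat(frames, ignore_index=True)
# In a Fabric Python notebook the final step would be a single write, e.g.:
# result.to_parquet("/lakehouse/default/Files/staging/api_results.parquet")
```

Accumulating frames in a list and concatenating once is also much cheaper than calling `pd.concat` inside the loop.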
Using a Python Notebook to Query Multiple Lakehouse Tables into a Data Frame
In this blog post I am going to explain how to query multiple Lakehouse tables into a data frame. The example I am going to use is when you want to load new data into your staging tables, but you need to know the max date from your previous data load. I am going to build on my previous blog…
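The idea of reading several Lakehouse tables and collecting the max date from each previous load can be sketched like this. It is a minimal sketch: `read_lakehouse_table` is a hypothetical stand-in (in a Fabric Python notebook this might be `pd.read_parquet` or a DuckDB query against the Tables folder), and the sample data is invented.

```python
import pandas as pd

def read_lakehouse_table(name):
    # Hypothetical stand-in for reading a Lakehouse table by name.
    data = {
        "sales": pd.DataFrame({"LoadDate": pd.to_datetime(["2024-01-05", "2024-01-09"])}),
        "orders": pd.DataFrame({"LoadDate": pd.to_datetime(["2024-01-07"])}),
    }
    return data[name]

table_names = ["sales", "orders"]
rows = []
for name in table_names:
    df = read_lakehouse_table(name)
    # One watermark row per table: the max date of the previous load.
    rows.append({"table": name, "max_load_date": df["LoadDate"].max()})

watermarks = pd.DataFrame(rows)
```

The resulting `watermarks` frame can then drive the incremental load, so each staging table only pulls rows newer than its own max date.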
Using a Python Notebook with Semantic Link Labs to Write a DAX Query Output to a Lakehouse Table
In this blog post I am going to explain how to use a Python notebook with the Semantic Link module to run a DAX query and write the output to a Lakehouse table. I will show you how to install a Python library and then use it within my Python notebook. If you are interested in more details on why…
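The overall flow, run a DAX query, tidy the result, then do a single Lakehouse write, can be sketched like this. In a Fabric notebook the real entry point would be Semantic Link's `fabric.evaluate_dax`; here the evaluator is injected as a parameter so a stub can stand in for it outside Fabric, and the final write is shown only as a comment.

```python
import pandas as pd

def dax_to_lakehouse_frame(evaluate_dax, dataset, dax_query):
    """Run a DAX query and return a frame ready for a single Lakehouse write.

    `evaluate_dax` is injected so the same logic works with Semantic Link's
    fabric.evaluate_dax inside Fabric, or with a local stub for testing.
    """
    df = evaluate_dax(dataset, dax_query)
    # Measure columns typically come back named like "[Total Sales]";
    # strip the brackets so they are friendlier Lakehouse column names.
    df.columns = [c.strip("[]") for c in df.columns]
    return df

def fake_evaluate_dax(dataset, dax_query):
    # Stub with invented data, standing in for the real DAX evaluation.
    return pd.DataFrame({"[Total Sales]": [123.45]})

dax = 'EVALUATE ROW("Total Sales", [Total Sales])'
result = dax_to_lakehouse_frame(fake_evaluate_dax, "Sales Model", dax)
# In Fabric the final step would be a single write, e.g.:
# result.to_parquet("/lakehouse/default/Files/dax_output.parquet")
```

The dataset name, measure, and file path above are all placeholders for illustration.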