Understanding what is consuming your Fabric Storage
This blog post will show you how to understand what is consuming your Fabric Storage.
If you want to know how I got this data, please read my previous blog post: View all your Storage consumed in Microsoft Fabric – Lakehouse Files, Tables and Warehouses – FourMoo.
With the Semantic Model below, I could also create alerts based on certain thresholds. For example, if the total storage in a single App Workspace is more than 100GB, send me an alert (this could be done using Power Automate). Alerts could also be based on too many files being stored, or on Parquet files that are too small and would need to be optimised (for better performance).
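To make the idea concrete, below is a minimal sketch of the kind of threshold check an alert could be built on. It assumes the pyodbc package and the ODBC Driver 18 for SQL Server, and the table and column names (StorageFiles, WorkspaceName, FileSizeInBytes) are hypothetical, so you would swap in the names from your own collection tables. In practice the alert itself would be raised by something like a Power Automate flow rather than a script.

```python
# Minimal sketch (not part of the template): flag workspaces whose total storage
# exceeds a threshold, by querying the Lakehouse SQL analytics endpoint.
# The table and column names below (StorageFiles, WorkspaceName, FileSizeInBytes)
# are assumptions - replace them with the names used by your own collection tables.
import pyodbc

SQL_ENDPOINT = "<your SQL analytics endpoint connection string>"
LAKEHOUSE = "<your Lakehouse name>"
THRESHOLD_GB = 100

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={SQL_ENDPOINT};Database={LAKEHOUSE};"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)

query = """
    SELECT WorkspaceName,
           SUM(FileSizeInBytes) / 1024.0 / 1024 / 1024 AS TotalGB
    FROM StorageFiles
    GROUP BY WorkspaceName
    HAVING SUM(FileSizeInBytes) / 1024.0 / 1024 / 1024 > ?
"""

for workspace_name, total_gb in conn.execute(query, THRESHOLD_GB):
    # This is where an alert (for example a Power Automate flow or an email) would fire.
    print(f"ALERT: {workspace_name} is using {total_gb:.1f} GB (threshold {THRESHOLD_GB} GB)")
```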
UPDATE: 2024-10-31
Thanks to a comment from Marco, I realised that I had not updated a table in my subsequent blog post, which queries the tables. Long story short, I was missing the table called “WorkspaceDetails”.
Please re-download the PBIT at the end of this blog post; it has the updated connection string details (which now include the WorkspaceDetails table).
Please refer to my previous blog post (linked above) for how to get the WorkspaceDetails table.
This is what my overview report looks like.
NOTE: When you delete tables, files or warehouses, it can take a few days before the storage is actually released. This is because the initial delete is a soft delete (which allows deleted files to be restored).
What that means is that if I deleted 10GB worth of tables, it could take 7 days before I would see that drop in the storage report.
Getting your data into the Semantic Model
Download the PBIT from the link at the end of this blog post. When you open the PBIT, you will need to put in the Connection String and Lakehouse Name.
To get the Connection String, go to the Lakehouse where you have stored your data.
- Click on the settings button.
- Click on SQL analytics endpoint to get to the connection string.
- Copy the connection string.
Also make a note of your Lakehouse Name.
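If you want to double-check these details before opening the PBIT, here is an optional sanity-check sketch that connects to the SQL analytics endpoint with pyodbc and lists the tables in the Lakehouse, so you can confirm the connection string and Lakehouse Name are correct and that the WorkspaceDetails table is there. It assumes the ODBC Driver 18 for SQL Server is installed.

```python
# Optional sanity check: confirm the connection string and Lakehouse Name work,
# and that the expected tables (including WorkspaceDetails) are present.
import pyodbc

connection_string = "<connection string copied from the SQL analytics endpoint>"
lakehouse_name = "<your Lakehouse name>"

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={connection_string};Database={lakehouse_name};"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)

# List the user tables in the Lakehouse - WorkspaceDetails should appear in this list.
for (table_name,) in conn.execute(
    "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
):
    print(table_name)
```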
Now when you open the PBIT you will need to put in the details as shown below.
- Put in the Lakehouse Name
- Put in the SQL Connection String.
In my report I have also created additional pages showing each of the areas within Fabric where there is storage: Lakehouse Tables, Files and Warehouses.
Below is an overview of what it looks like.
Lakehouse Tables
Files
Warehouses
And here is an additional view where I can see how the storage is changing over time.
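For reference, the aggregation behind a storage-over-time view is simply the total size grouped by the collection date. The sketch below shows that as a SQL query run through pyodbc; the table and column names (StorageFiles, CollectionDate, FileSizeInBytes) are assumptions, and in the report itself this is done with measures in the Semantic Model rather than SQL.

```python
# Illustrative only: total storage (GB) per collection date - the aggregation behind
# a storage-over-time view. Table and column names are assumptions; adjust them to
# match your own collection tables.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your SQL analytics endpoint>;Database=<your Lakehouse name>;"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)

trend_query = """
    SELECT CollectionDate,
           SUM(FileSizeInBytes) / 1024.0 / 1024 / 1024 AS TotalGB
    FROM StorageFiles
    GROUP BY CollectionDate
    ORDER BY CollectionDate
"""

for collection_date, total_gb in conn.execute(trend_query):
    print(f"{collection_date}: {total_gb:.1f} GB")
```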
Summary
In this blog post I have shown you how I took my data stored in the Lakehouse, put it into a Power BI Semantic Model and created the report.
Here is a link to the Power BI Template, which you can use to connect to your Lakehouse data.
Fabric – Storage Collection.pbit
Thanks for reading, any comments or questions please leave them in the comments section below.