Scalability – Part 2 Migrating AAS to PPU
Thanks for reading, and welcome to part 2 of my series on migrating AAS (Azure Analysis Services) to PPU (Power BI Premium Per User).
If you missed the first part of the series, here is the link: Query Performance – Part 1 Migrating Azure Analysis Services to Power BI Premium Per User – Reporting/Analytics Made easy with FourMoo and Power BI
In this blog post I am going to investigate how well PPU scales when compared to AAS.
When comparing AAS to PPU, I need to find the AAS instance size that matches what we get with PPU.
In the image below this would be an instance size of S4, because I am comparing the amount of memory that can be consumed.
Likewise, when I look at the sizing for PPU in the Microsoft docs, I can see that the model size limit is 100 GB.
What this means to me is that with PPU I get very similar, if not better, capabilities compared to an AAS instance size of S4.
PPU using Gen2 Architecture
There is some additional functionality that is only available in PPU, which I will highlight below.
- When using AAS, a dataset refresh consumes additional memory. As shown below, this adds to the total memory consumption allocated to the AAS instance.
When using AAS, all the datasets are loaded into memory, and their combined size cannot exceed 100 GB when using an S4 or Power BI Premium P3.
- Now when I compare this to a dataset refresh in PPU, this is what it looks like below.
In PPU and Premium Gen2, each INDIVIDUAL dataset cannot exceed 100 GB on a PPU or P3 capacity.
What this means is that when a dataset is refreshing, it has its own 100 GB allocation.
- My Sales dataset could consume 95 GB of memory and my Orders dataset could consume 25 GB of memory.
For example, if my Sales dataset was consuming 95 GB and I started an incremental refresh that consumed an additional 7 GB of memory, the refresh would fail because my Sales dataset would exceed its 100 GB allocation.
- My Orders dataset would remain unaffected by the refreshing of the Sales dataset.
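To make the difference concrete, here is a minimal Python sketch of the two memory models described above. The function names and limit checks are my own illustration of the behaviour, not a real Microsoft API: AAS checks a refresh against one shared pool for all datasets, while PPU/Gen2 checks only the refreshing dataset against its own allocation.

```python
# Illustrative model only (not a real API): how the two services account
# for memory during a dataset refresh.

AAS_S4_LIMIT_GB = 100      # shared pool: all datasets plus refresh overhead
PPU_PER_DATASET_GB = 100   # Gen2: each dataset gets its own allocation

def aas_refresh_ok(dataset_sizes_gb, refresh_overhead_gb):
    """All datasets PLUS the refresh overhead must fit in the shared pool."""
    return sum(dataset_sizes_gb) + refresh_overhead_gb <= AAS_S4_LIMIT_GB

def ppu_refresh_ok(dataset_size_gb, refresh_overhead_gb):
    """Only the refreshing dataset is checked against its own allocation."""
    return dataset_size_gb + refresh_overhead_gb <= PPU_PER_DATASET_GB

# Sales = 95 GB, Orders = 25 GB, incremental refresh needs an extra 7 GB.
sales, orders, overhead = 95, 25, 7

print(aas_refresh_ok([sales, orders], overhead))  # False: 95 + 25 + 7 > 100
print(ppu_refresh_ok(sales, overhead))            # False: 95 + 7 > 100
print(ppu_refresh_ok(orders, overhead))           # True: 25 + 7 <= 100
```

Note how the Orders refresh succeeds in the PPU model even while Sales is near its limit, which matches the point above that the Orders dataset is unaffected by the Sales refresh.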
Memory allocation for DAX Queries
Another thing to consider is that when a DAX query is run, if the DAX measure is complex or poorly written, it will require additional computation and will consume memory to do this. I like to think of it as creating a temporary table.
- When this happens in AAS, this additional memory counts towards the total memory consumption.
With PPU, this will not count towards the total memory consumption, BUT there is a caveat.
- The amount of memory that a DAX measure can consume is limited to 10 GB.
- To be totally honest, if I write a DAX measure that consumes 10 GB of memory, I have got bigger problems to worry about (such as the dataset modelling or the DAX measure itself).
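The per-query cap can be sketched the same way. Again, this is my own illustration rather than a real API: in PPU, the temporary memory a single DAX query materializes is checked against its own 10 GB limit, separately from the dataset's memory.

```python
# Illustrative only: PPU limits the temporary memory a single DAX query can
# consume to 10 GB, independent of the dataset's 100 GB allocation.

QUERY_MEMORY_LIMIT_GB = 10

def query_allowed(query_working_set_gb):
    """A query's intermediate results must stay under the per-query cap."""
    return query_working_set_gb <= QUERY_MEMORY_LIMIT_GB

print(query_allowed(2))   # True: a typical measure stays well under the cap
print(query_allowed(12))  # False: this query would exceed the PPU limit
```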
With regards to users querying the model, both AAS and PPU use the built-in caching options, both on the engine and in the browser (if enabled in the settings), to cache common queries. This ensures that performance is super-fast and the visuals are responsive.
While this applies to both AAS and PPU, Microsoft manages the underlying hardware and will ensure that it uses hardware that is optimized for Tabular models.
And I am fairly confident that they are consistently looking at upgrading the hardware to make everything run that much faster and smoother!
When comparing the different versions, AAS and PPU use the same underlying engine, but AAS uses a different SKU where some features are disabled.
With that being said, the DAX functionality, query performance, and refresh performance are very closely aligned and work in the same fashion.
PPU is a superset of Analysis Services: not only can it scale very well, but it also has additional features like aggregations, composite DirectQuery models, incremental refresh, and more.
This tells me that PPU will be able to scale as the model size or the number of users increases.
Thanks for reading and any comments or suggestions are most welcome.
Finally, I would like to thank the people on the Power BI team for assisting me in sharing the right information on how things work.