This blog post is about using the DAX Performance Tuner MCP server to tune a DAX measure, and then using the automated load testing to see whether it reduces capacity usage. I originally did not plan for this post, but after viewing the details from Justin Martin – DAX Performance Tuner | LinkedIn I had to give it a go. It was then easy for me to test whether the performance improved using the automated load testing.

Please find below the previous posts in the series, in case this is the first time you have come across this blog series and want to understand what has been done previously.

Series Details

Part 1: Capturing Real Queries with Performance Analyzer

Part 2: Setting Up Your Fabric Lakehouse and Notebooks

Part 3: Running, Analyzing Load Tests and Automation of Load Testing

Part 4 (This blog post): Using the DAX Tuner MCP to fix slow DAX measures

Using the DAX Performance Tuner MCP Server

You can watch Justin’s video here, or follow along with the steps below, which are a quick run-through.

I downloaded the files from the link below and extracted them into a folder.

fabric-toolbox/tools/DAXPerformanceTunerMCPServer at main · microsoft/fabric-toolbox

Next, I ran setup.bat to make sure all the Python libraries and .NET installations were complete.

I then opened Visual Studio Code.

To get the MCP server running, I completed the following steps.

  1. I opened the file called “mcp.json”
  2. I then verified that the location of the files matches the directory I extracted the files to.
  3. The button can be quite small; to start the MCP server I clicked on Start.
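For reference, a VS Code mcp.json entry for a local server generally looks something like the sketch below. The actual file is generated by setup.bat, so the server name, command, and path here are illustrative only and will differ from what the toolbox produces:

```json
{
  "servers": {
    "dax-performance-tuner": {
      "type": "stdio",
      "command": "python",
      "args": ["C:/path/to/extracted/folder/server.py"]
    }
  }
}
```

The key things to verify are that the command and path point at the folder you extracted the files into.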

I could then see the MCP Server running

I could then verify that I could see some of the MCP commands by going into GitHub Copilot chat and using the #dax prefix.


Connecting to my Semantic Model in the Power BI Service

Because my semantic model is using DirectLake I could connect to it directly in the Power BI Service.

NOTE: I would highly recommend always working on a local copy of a semantic model, even if it is in DirectLake mode, to ensure that you have a working copy of the file.

To connect to the semantic model in the Power BI Service, I completed the commands below.

  • I changed the agent to use GPT-5
  • connect_to_dataset workspace_name “Fabric_FourMoo” dataset_name “WWI_DL”
  • I was prompted to run the tool
  • Clicked Allow
  • Authenticated
  • Closed the window
  • Successfully connected

Optimizing a single DAX measure

The steps below show how to tell the tuner which DAX measure to optimize.

I had a measure I had created called “BadMeasure”

  • To begin the optimization, I put the following into the chat:
  • optimize measure called “BadMeasure”
  • I was then prompted to run the query to begin the analysis.
  • I clicked Allow.
  • The MCP server then starts to run the measure and begins its optimization.
  • There were a few times when I was prompted to allow the optimization to run.
  • Each time I clicked on Allow until it completed.
  • Once it was completed, I got the performance comparison.
  • Because I was using DirectLake mode, it automatically updated the measure.
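To give a sense of the kind of rewrite the tuner typically produces, here is a hypothetical before/after. This is not the actual “BadMeasure” from my model, and the table and column names ('Fact Sale'[Quantity], [Total Including Tax]) are illustrative guesses based on the Wide World Importers sample, not the real definitions:

```dax
-- Hypothetical slow pattern: FILTER over the whole table forces
-- an expensive row-by-row evaluation of every column in the table.
BadMeasure =
CALCULATE (
    SUM ( 'Fact Sale'[Total Including Tax] ),
    FILTER ( 'Fact Sale', 'Fact Sale'[Quantity] > 10 )
)

-- Typical optimized pattern: filter a single column so the
-- storage engine can apply it efficiently.
BadMeasure Optimized =
CALCULATE (
    SUM ( 'Fact Sale'[Total Including Tax] ),
    KEEPFILTERS ( 'Fact Sale'[Quantity] > 10 )
)
```

Replacing a table filter with a column filter is one of the most common optimizations these tools suggest, because it keeps the work in the storage engine instead of the formula engine.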

Now that the measure had been optimized it was time to re-run the load testing.

Re-running the automated load testing

I went back to the pipeline I had created in a previous blog post: Part 3: Running, Analyzing Load Tests and Automation of Load Testing


I then re-ran this pipeline and waited for it to finish. This is where the true value of the automated load testing happens.

I refreshed the DirectLake model and viewed the results.

The results showed that the DAX measure called “BadMeasure” was optimized and the query response time decreased, meaning it was using fewer capacity units. This was the last test in the load test results.
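To quantify the improvement outside of a report, a quick sketch like the one below could compare the average query duration between the before and after load test runs. The column names (RunId, DurationMs) and the sample numbers are assumptions for illustration, not the actual schema or results produced by the load testing notebook in this series:

```python
import pandas as pd

# Hypothetical load test output: durations (ms) for the same query
# captured before and after the measure was optimized.
results = pd.DataFrame(
    {
        "RunId": ["before", "before", "before", "after", "after", "after"],
        "DurationMs": [1850, 1920, 1790, 240, 260, 250],
    }
)

# Average duration per run, then the percentage improvement.
avg = results.groupby("RunId")["DurationMs"].mean()
improvement_pct = (1 - avg["after"] / avg["before"]) * 100
print(f"Before: {avg['before']:.0f} ms, After: {avg['after']:.0f} ms")
print(f"Improvement: {improvement_pct:.1f}%")
```

The same calculation could be done in the Lakehouse notebook against the real load test table to track improvements over time.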

Summary

In this blog post I connected to the DAX Performance Tuner MCP server, which optimized the DAX measure for me. I then re-used my previous load test to see if it did make the measure quicker, WHICH IT DID!

Thanks for following along with this series; I hope you found it useful and informative.

Any questions or comments are most welcome.