In this blog post I am going to explain how to use the Semantic Link Labs library in a Python notebook to run a DAX query and write the output to a Lakehouse table.

I will show you how to install a Python library and then use it within my Python notebook.

If you are interested in more details on why I am using a Python Notebook please read this blog post: What are the advantages of using Python Notebook vs a Spark Notebook in Microsoft Fabric – FourMoo | Microsoft Fabric | Power BI

In my previous blog post I explained how to use SQL code to read from Delta tables in Microsoft Fabric which can be found here: How to use SQL Code to read from Delta Tables in a Python Notebook – FourMoo | Microsoft Fabric | Power BI

This blog post details how to write to a Lakehouse Table: How to use SQL Code to write to Delta Tables in a Python Notebook – FourMoo | Microsoft Fabric | Power BI

At the end of this blog post I will have a link to the Python notebook if you want to try this code out in your own environment.

What are Semantic Link Labs?

Semantic Link Labs is a Python library that enables you to use Microsoft Fabric functions directly from a notebook. It has a lot of amazing features, and new features are continually being added. Below is the link where you can find more details.

microsoft/semantic-link-labs: Early access to new features for Microsoft Fabric’s Semantic Link.

In my example for this blog post I am going to use the function called “evaluate_dax”, which runs a DAX query against a semantic model.

How to install and use Semantic Link Labs

Because we are using a Python notebook, we can install the library directly in a notebook cell.

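A minimal install cell, assuming the package is published on PyPI under the name semantic-link-labs (the %pip line magic only works inside a notebook cell, not in a plain Python script):

```python
# Install Semantic Link Labs for the current notebook session.
%pip install semantic-link-labs
```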

How to write a DAX query output to a Lakehouse table

In my working example I am going to get all the data from my Customer table in my semantic model.


First, I make sure that I have installed Semantic Link Labs, and I set the workspace name and the semantic model name against which I want to run the query.

Next, I run the DAX query.

In my example I am getting all the data from the Customer table.

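The query step can be sketched as follows. This is an illustration, not the exact code from my notebook: the workspace and model names are placeholders, and I am assuming the evaluate_dax function is exposed on the sempy_labs module (the import is deferred so the sketch reads outside Fabric too).

```python
def run_dax_query(workspace: str, dataset: str, dax_string: str):
    """Run a DAX query against a semantic model and return the result
    as a DataFrame.

    Assumption: evaluate_dax is available on sempy_labs, which is
    importable after installing the semantic-link-labs package.
    """
    import sempy_labs as labs  # deferred: only available in Fabric
    return labs.evaluate_dax(
        dataset=dataset,
        dax_string=dax_string,
        workspace=workspace,
    )

# Example call (placeholder names; run inside a Fabric notebook):
# df = run_dax_query("My Workspace", "My Semantic Model", "EVALUATE 'Customer'")
```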

This is the output that I get when running the DAX query. Because it is a DAX query, the result includes the table name and the column names in square brackets (for example, Customer[Customer Name]).

To fix this, I then used code to remove the table name and the square brackets, as well as to replace any spaces in the column names with underscores.
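That clean-up can be sketched like this on a small sample frame (the sample column names and values are made up for illustration; my actual notebook code was shown in the screenshot):

```python
import re

import pandas as pd

# Sample frame shaped like a DAX result: DAX returns columns as
# TableName[Column Name].
df = pd.DataFrame({
    "Customer[Customer Key]": [1, 2],
    "Customer[Customer Name]": ["Alice", "Bob"],
})

# Drop the table-name prefix and the square brackets, then replace
# spaces with underscores so the names are Lakehouse-friendly.
df.columns = [
    re.sub(r"^.*\[|\]$", "", col).replace(" ", "_") for col in df.columns
]

print(list(df.columns))  # ['Customer_Key', 'Customer_Name']
```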

Writing the data frame to my Lakehouse table

As highlighted in my previous blog post, I am then going to write my data frame to my Lakehouse table.

When I first ran the code, I got an error.

When I went back and looked at my data frame, I saw that it had DateTime columns that were not in a format the Delta table writer would accept.

I then updated the previous code cell and re-ran it to get the dates in the right format.

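A sketch of the date fix and the write, under two assumptions: that the write uses write_deltalake from the deltalake package (as in my previous post), and that the error came from pandas' default nanosecond-precision timestamps, which the Delta writer rejects; casting them down to microseconds is one common fix. The column names, dates, and table path are all placeholders.

```python
import pandas as pd
# from deltalake import write_deltalake  # available after: %pip install deltalake

# Sample frame standing in for the cleaned DAX output.
df = pd.DataFrame({
    "Customer_Key": [1, 2],
    "First_Purchase_Date": pd.to_datetime(["2024-01-15", "2024-03-02"]),
})

# pandas datetimes default to nanosecond precision; cast them down to
# microseconds so the Delta writer accepts them (requires pandas 2.x).
for col in df.select_dtypes(include=["datetime64[ns]"]).columns:
    df[col] = df[col].astype("datetime64[us]")

# Write to the Lakehouse table (the table path is a placeholder):
# write_deltalake("/lakehouse/default/Tables/Customer", df, mode="overwrite")
```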

This time it ran successfully.

I could then see the table in my Lakehouse.

And here is the data on the Lakehouse table.

Summary

In this blog post I have shown you an example of querying a semantic model using DAX and writing the results to a Lakehouse table.

In the next blog post I will show you how to query data from multiple Lakehouse tables. (The example will be to get the last loaded date, which can then be used to query the data to be loaded.)

You can find the notebook I used here: Fabric/Blog – Python – Run DAX Query and write to LH Table.ipynb at main · GilbertQue/Fabric

Thanks for reading, any comments or suggestions please let me know!