How I Built a Full Power BI Semantic Model in Minutes Using Agentic AI and GitHub Copilot
In this second part of my blog series, I will show you how I edited the required files and then created my semantic model by using the Agentic Tools.
To be transparent, I did use Claude to assist me with the initial setup steps below.
This is quite a long blog post, so maybe make yourself something to drink: a cup of coffee, tea, beer, or kombucha 😊
Here is a link to the previous blog post where I completed the setup steps: How to Set Up Agentic Semantic Model Development for Power BI Using GitHub Copilot on Windows
Visual Studio Code
I first had to open the folder where all my files are stored.
In my example I created a folder, C:\my-fabric-folder, which contains the setup files I completed previously.
At the end of the blog post I will include a link to my GitHub repository where all the files are stored.
As shown below, all my files are in this folder.

- In VS Code, I clicked on Explorer and then Open Folder

- I then browsed to my folder location and clicked on Select Folder
- I was then prompted whether I trusted the files in this folder; I clicked on “Yes, I trust the authors”

Source Data
In my working example I downloaded the sample data after I had created a new Lakehouse in Microsoft Fabric.
NOTE: The source data has already been created as a star schema with fact and dimension tables, and has the relevant key columns in each table.
It is critical that your source data already has dimensional modeling applied, with dimension tables, fact tables and key columns, for this to work. This is because the Power BI semantic model, as well as all of the skills, is designed to work with a star schema.

Once this was loaded, I could see the sample data in my Lakehouse.

Setting up the Requirements
The first step is to set up the requirements by opening the REQUIREMENTS.md file
Having the outcome requirements already known
It is critical that, before starting on the requirement details below, the outcome requirements are already defined and documented, and that you know what the expected outputs should be.
What this means is that I still had to understand what the business use case is, what my source data looks like, what my key columns are, and what my relationships, measures, etc. are before going ahead.
There was still work to be done and understood before completing the requirements.
Fabric Environment
Here is where I will set up the Fabric Environment requirements.
- This is my Workspace Name in Microsoft Fabric
- This is the workspace GUID, which I got from the URL; it is the value after the /groups/
- This is the Lakehouse Name
- I got the Lakehouse GUID by navigating to the Lakehouse, where I could then find it in the URL after the /lakehouse/
- This is the SQL Analytics Endpoint for my lakehouse
- The name I want to give for my semantic model.
- Below is an example of what I completed.
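To give a sense of the format, here is a rough sketch of how that section of REQUIREMENTS.md could be filled in. All names and GUIDs below are placeholders, not my real values:

```markdown
## Fabric Environment
- Workspace Name: My Fabric Workspace
- Workspace GUID: 00000000-0000-0000-0000-000000000000
- Lakehouse Name: MyLakehouse
- Lakehouse GUID: 11111111-1111-1111-1111-111111111111
- SQL Analytics Endpoint: xxxxxxxxxxxx.datawarehouse.fabric.microsoft.com
- Semantic Model Name: WWI Sales Model
```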

Defining the Source tables in the Lakehouse
The next section is where I then defined my source tables in the lakehouse.
This is where I had to explain what the tables were, the type of table and grain of the table.
- The first row is the fact table.
- I gave it the table name ‘fact_sale’
- NOTE: It should match the Lakehouse table name.
- The type of table is a ‘fact’ table
- The grain of the table is ‘One sales transaction’ for every row
- The notes were that it is the main fact table.
- The rest of the rows were dimension tables as explained below
- The table name was the Lakehouse table name, e.g. ‘dimension_city’
- The type of table is a ‘dimension’ table
- The grain was one row per item, e.g. ‘One City’
- Then, in the notes for each dimension table, I recorded the key column for that table
- Below is an example of what I completed.
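As a sketch, the table definitions could be laid out like this. The table names come from this post; the key columns for the remaining dimensions are illustrative:

```markdown
| Table Name         | Type      | Grain                 | Notes                   |
|--------------------|-----------|-----------------------|-------------------------|
| fact_sale          | fact      | One sales transaction | Main fact table         |
| dimension_city     | dimension | One city              | Key column: CityKey     |
| dimension_customer | dimension | One customer          | Key column: CustomerKey |
| dimension_date     | dimension | One date              | Key column: Date        |
```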

Relationships
Next, I had to define the relationships in my star schema.
As shown below, I created the relationship mapping between my Fact table and my dimension table.
I also put in the columns for each relationship so it would map correctly.
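A sketch of such a relationship mapping, with the column pairs spelled out (the exact column names here are assumptions based on the sample data):

```markdown
| Fact Column               | Dimension Column                | Cardinality |
|---------------------------|---------------------------------|-------------|
| fact_sale[CityKey]        | dimension_city[CityKey]         | Many-to-one |
| fact_sale[CustomerKey]    | dimension_customer[CustomerKey] | Many-to-one |
| fact_sale[InvoiceDateKey] | dimension_date[Date]            | Many-to-one |
```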

Key Business Measures
This section is where I first defined what I wanted the DAX measures to contain in terms of descriptions, formatting, etc.
I also gave it a working example of what I wanted the TMDL syntax to look like.

The second part of my business measures is where I defined which measures I wanted, along with a description and the formula for each.
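For example, a minimal TMDL measure sketch along the lines of what I provided could look like this (the measure name, column and format string are illustrative, not my exact file):

```tmdl
/// Total sales amount including tax
measure 'Total Sales' = SUM(fact_sale[TotalIncludingTax])
    formatString: #,0.00
    displayFolder: Sales
```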

Business Questions to Answer
Based on the business analyst details, I included some questions that I wanted this semantic model to be able to answer.

Dimension Details
I defined how I wanted my dimension tables to work within my semantic model.
I only had very specific requirements for my date table as shown below.

I then set some overall requirements for the rest of the tables, specifying how I wanted them to be renamed in the semantic model.
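As an illustration, renaming requirements of this kind could be written as follows (my actual wording differed):

```markdown
- Remove the 'dimension_' prefix, e.g. dimension_city becomes City
- Replace underscores with spaces and use Title Case, e.g. dimension_stock_item becomes Stock Item
- Hide all key columns from report view
```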

Modeling Preferences
In this section is where I defined the overall semantic model preferences.
This is what I would typically do for each of the measures where applicable (formatting, etc.). But by doing it once here, it is applied across the entire semantic model.
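A sketch of the kinds of model-wide preferences this section could contain (illustrative, not my exact list):

```markdown
- Format all currency measures as #,0.00
- Format all count measures as whole numbers (#,0)
- Mark the date dimension as the official date table
- Hide all key columns used only for relationships
- Give every measure a description
```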

Development Style
This is where I gave some additional details as to how I want this to be developed.
I first wanted it to create a specification (spec) document, which I could review to make sure that it implemented what I expected before deploying it to Microsoft Fabric.

Additional Context and Notes
There is a bit of repetition here, but it is better to be safe than sorry.
I also made sure to define that I want this to be created in Direct Lake as highlighted below.

Acceptance
The last step is to define the acceptance criteria for how I want this to be created.
If this is what I get working at the end, then it is considered a success.
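For illustration, acceptance criteria of this kind could be phrased as (assumed wording, not my exact file):

```markdown
- The semantic model deploys to the target workspace in Direct Lake mode
- All relationships are created and resolve without ambiguity
- All measures evaluate without errors
- The model reframes/refreshes successfully
- The business questions can be answered from the model
```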

Authentication Setup
Before starting with the GitHub Copilot chat, I had to make sure I could authenticate and have a valid connection (a token to the Fabric Service) to my tenant.
- To run the commands, I ran them in the VS Code Terminal
- I first ran the following to connect to Azure (Microsoft Fabric Service)
- az login

- I clicked on “Work or school account” and clicked Continue
- I then got the Microsoft Azure login screen
- NOTE: Make sure you find the screen below to log in and authenticate.
- NOTE II: If you are already logged in you might not get this screen below, instead you would get the Password screen or how you would normally log in.

- Then I followed the prompts to log in successfully.
- Because I belong to multiple tenants and have a few subscriptions, I got the following below
- NOTE: Yours might look similar or only have 1 option

- I put in 1
- Once completed I was then successfully authenticated and logged in.
- I needed to get the Microsoft Fabric token by running the following command.
- az account get-access-token --resource https://api.fabric.microsoft.com
- I now had the access token to Microsoft Fabric.
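For reference, the az account get-access-token command returns JSON shaped roughly like this (values truncated/redacted placeholders; the exact fields can vary by Azure CLI version):

```json
{
  "accessToken": "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIs...",
  "expiresOn": "2025-01-01 12:34:56.000000",
  "subscription": "00000000-0000-0000-0000-000000000000",
  "tenant": "11111111-1111-1111-1111-111111111111",
  "tokenType": "Bearer"
}
```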
Using GitHub Copilot Chat
Finally, I am at the point where I can now start using GitHub Copilot Chat to start doing some of the work for me. It feels like I have had to do all the work so far!
I clicked to open the chat
I then configured my Chat settings as shown below.
- I made sure it was in agent mode.
- I then changed the model to be “Claude Sonnet 4.6 Medium”
- I clicked on the Auto mode, then more settings.
- Then Claude Sonnet 4.6, then the “>”
- And selected “Medium”

- I could then see my chat settings as shown below
01-connect.md
This is the first set of commands, which will read the REQUIREMENTS.md file and the skills, and complete some of the other tasks.
- I opened this file and copied all the content into the chat.

- I then clicked on the up arrow to start the chat.
- It then went and started completing the tasks in the chat.
- I had to keep watching the chat, because at times it would prompt me to run tasks, as in the example shown below.
- I clicked Allow
- Once it completed, I got a summary confirming that it could connect, find the tables, and read the requirements, along with other useful information.


02-build-spec.md
This is the file that will define how to build the spec document. Once this has been run it should then create a spec markdown file for me to look at.
- I copied all the contents from the file, put it into the chat and started the chat.
- Once it was finished, I could see that it had created a spec document
- It also prompted me if I wanted to keep the file it created

- I clicked on Keep
- At the end of the chat it asked me to review the spec and, once I was happy or had made the required changes, to say “Proceed” or “GO”
- In the Explorer, under the folder called “specs” I opened the file “semantic-model-spec.md”
- This file has a lot of content, and I made sure to review the entire file.
- This is because it defines exactly what is going to be created.
- I would like to highlight that I misspelt “SalesPersonKey”, which it picked up and corrected.

- I was happy with the specs, so I went back to the chat and put in “GO”
- While it was running, I could see a Todos list, which I expanded to see what it had to complete
- While it was creating the files, I was clicking on “Keep”

- NOTE: I did not have to do this for each file, as it will continue to create the files.
- Once the chat completed, I could then see what it had completed

- This was very impressive and very quick!
03-implement.md
This file will now implement what is defined in the spec markdown document and deploy it to Microsoft Fabric.
- I copied all the contents and put it into the chat.
- Once again, I could see what it was planning to do
- I was then prompted to allow MCP operations to run, when prompted I clicked on “Allow in this session”
- While this was running, I could keep an eye on the context window to see if it was starting to get too full.

- Once completed it had created the PBIP with the TMDL files.

- The next step was to prompt it to deploy it to the Fabric Workspace.
- I was prompted to allow once again
- After a few minutes I got confirmation that the semantic model had been successfully deployed.

- I then validated this in the Fabric Service.

04-validation.md
The next step was to run some validation checks to make sure that the measures, tables and relationships are working as expected.
- I copied the contents from the 04-Validation.md
- I got a prompt asking whether to check BPA (Best Practice Analyzer) rules and fix the findings; I typed Yes, although I could also have clicked on Yes and then Submit.

- It then ran for some time doing the checks and completed with the details below.
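To give a flavour of this kind of check, a DAX validation query against the deployed model could look like the sketch below. The table, column and measure names are assumptions carried over from the earlier requirement sections, not confirmed output from my run:

```dax
EVALUATE
SUMMARIZECOLUMNS(
    'City'[City],                  // assumes dimension_city was renamed to City
    "Total Sales", [Total Sales]   // assumes a measure named Total Sales exists
)
```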

Testing the semantic model
I now wanted to make sure the semantic model worked successfully.
Reframing / Refreshing the semantic model
- I tried to refresh and got an error

- I went back to the GitHub Copilot chat and asked the following to fix the error.
- It then ran some commands, found the error in the _measure table and fixed it.
- It was great to see that it not only reviewed what was in the Fabric Service but also fixed the issue.
- I then reframed/refreshed the semantic model, and it worked.
Creating report
I then used Copilot in the Power BI Service to create a report.
I clicked on Create Report, opened Copilot and put in the following prompt below.


It then created the following report.










