How to complete granular deployment of Power BI Desktop changes to the Power BI Service (Using PPU)
Now that we have Power BI Premium Per User (PPU), a whole host of new capabilities opens up.
In this blog post I am going to demonstrate how to make a granular deployment, where I will create a new column in my City table and deploy only that change.
This means that by deploying only the column change to my PPU dataset, I am updating just that column in the table.
This saves me from the tasks I previously had to do:
- Refreshing the PBIX file so that the data is up to date.
- Re-uploading my PBIX file.
- Re-creating the incremental refresh policy, if one was configured.
- Uploading and then waiting for the dataset refresh.
The result is quick updates to my dataset, which saves me a lot of time and effort.
Separating dataset and report
The first thing I want to mention is that it is always recommended to separate the dataset and the report.
You might be thinking why the separation?
Some of the reasons are:
- People curating and updating the dataset can do this independently.
- People creating reports and dashboards can do this independently.
- Quicker and easier deployment of changes to datasets without affecting the reports.
- Reports can be updated without having to rely on dataset updates.
Updating my existing PBIX file by adding a new column
In this example I am now going to add a new column called “City-State Province” to my City table.
As you can see below, I have my existing table.
I then went into Power Query and added the new column City-State Province
I could then see the column in my table.
- I then saved my PBIX file.
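The Power Query step above can be sketched in M. This is a minimal sketch, not my actual query: the source column names ([City] and [State Province]) and the previous step name (#"Changed Type") are assumptions and may differ in your model.

```
// Hypothetical step (assumed source columns and previous step name):
// concatenate City and State Province into a single text column.
#"Added City-State Province" = Table.AddColumn(
    #"Changed Type",
    "City-State Province",
    each [City] & "-" & [State Province],
    type text
)
```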
Using the ALM Toolkit to compare and deploy the granular changes to my PPU Dataset
I am now at the point where I can deploy only the changes I want, using the ALM Toolkit.
Changing the default Options for ALM Toolkit
After installing ALM Toolkit, what I like to do is change the default options to what I think works best.
What I do is ensure I have the following options enabled, as shown below.
The main option here is the “Processing Option”, which I set to “Do Not Process”. This ensures that when I complete a deployment it will deploy the changes without processing the data; I will then have to go in manually and process the affected tables.
I prefer this option because it allows for a much quicker deployment and allows me to process the affected tables when I am ready to.
NOTE: You can download the ALM Toolkit here: ALM Toolkit
Deploying the changes
- I had already installed the ALM Toolkit.
In Power BI Desktop I went to External Tools and selected ALM Toolkit
I am then presented with options for what I want to compare.
- In my example I am comparing my PBIX file to my dataset in the PPU App Workspace as shown below.
- NOTE: When I click on the Dataset it then prompts me to log in and authenticate.
- I then clicked Ok, and the comparison began.
By default, after the comparison is complete it will show all actions.
- The ACTIONS options are Update, to update the object, or Skip, to skip the action.
What I typically do is click on the drop down under Select Actions and select “Hide Skip Objects”; this makes it easier for me to see what update actions there are.
Now, I can see that my City table is left, and it has got the action set to Update.
Looking at the bottom of the screen, there is a comparison pane that shows me where the changes are.
If you look at the bottom right-hand side, you will see that I have highlighted a green section next to a red section.
- This shows where there is a difference.
Scrolling down to the end, I can see what those differences are in the Power Query for the table; as shown below, I can see my new column that was added.
Next, I clicked on Validate Selection, which validates everything to make sure it can be deployed.
Based on the options I selected, warnings and informational messages are displayed, as shown below.
The option to Update is now available.
- There are other things I could do, such as Generate script or Report differences, but I will leave those for another blog post.
I clicked on Update, and then got a prompt asking if I was sure I wanted to update the target.
- Of course I did! So I clicked Yes.
Once the deployment is complete, I can then see it was successfully deployed.
I was then prompted if I want to refresh the comparison.
- I clicked on No because I know what has been deployed.
I then checked in the Power BI Service and I could see the updated column.
Processing the affected table
The final step was to process the affected table, which would populate the new [City-State Province] column with data.
To do this I can refresh the table using SQL Server Management Studio (SSMS) or by refreshing the dataset via the Power BI Service.
I am going to show you how I did this using SSMS.
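SSMS connects to a PPU workspace through its XMLA endpoint, used as the Analysis Services server name. The workspace name below is a placeholder; substitute your own:

```
powerbi://api.powerbi.com/v1.0/myorg/My PPU Workspace
```

In SSMS, choose Connect > Analysis Services, paste the address above as the server name, and authenticate with your Power BI account.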
- I went into SSMS, found my database, then table.
I right clicked on the table and selected Process Table.
I then changed Mode to “Process Full” and made sure that City was selected as shown below.
I then clicked on Script, which scripts it into an SSMS query window.
- I do this because it allows me to view the entire processing details, and if there is an error I will be able to see it in the text.
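The generated script is a TMSL refresh command. A minimal sketch of what it looks like for a full process of the City table is below; the database name here is a placeholder for your dataset's actual name.

```
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyDataset",
        "table": "City"
      }
    ]
  }
}
```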
I then clicked on Execute.
When the processing completed, I saw the following in the messages below.
- NOTE: Currently there is a bug when using SSMS, shown in the screenshot below; the processing of the table will still complete successfully.
Once completed, this is what the Results look like.
- I then go back and validate my data and I can see the updated column with values as shown below.
In this blog post I have demonstrated how to complete a granular deployment of a new column to a PPU dataset.
I hope that you found this useful.
I am thinking of creating a similar guide on how to create or update an existing measure in a PPU dataset without having to publish the entire dataset. If this is something you feel would be of value, please let me know.
Thanks for reading