Power BI InfoGraphic Update – Sep 2018

It has been a few months since I last updated my Power BI InfoGraphic, and with the speed at which everything changes, keeping it up to date is always a challenge.

The following items have been added with links to more details:

Please find below the latest version as of September 2018. I also always keep the latest version of the InfoGraphic here: Power BI InfoGraphic Overview

You are welcome to share this infographic with anyone, as long as the author is given credit for the creation.

If there is anything missing or needs to be updated, please leave it in the comments section below.

BI-RoundUp – Power BI (Sep Update for Power BI Desktop – Dashboard Comments – Community Contribution to PowerShell Cmdlets – Power BI PowerShell Cmdlets available in Azure Cloud Shell)

Welcome to another week of my Power BI Round Up. And as expected there is another massive Power BI Desktop Update.

Power BI – September Update for Power BI Desktop

As always, I am going to go over the new features and highlight what I think is really relevant in terms of what has been released.

Reporting

With the added support for categorical fields on the X-axis, scatter charts are now more flexible.


As shown above, you can now copy a single value, which copies the value in its unformatted form so it can be pasted into a search in another application.

When using the Copy selection feature, it copies everything you have highlighted in your table or matrix. When pasting, it keeps the column headers as well as the formatting, and you can paste it into Excel or another program. This is a really great piece of functionality which makes it easier when certain data needs to be copied into another system.

There are now some built-in report themes, which are great for people who just want to move away from the default colours but do not want to create their own custom themes.

Report page tooltips are now generally available, meaning you no longer have to enable them as a preview feature.

Along with that, as shown above, report page tooltips are now available for cards.

There have also been accessibility improvements in the formatting pane, making it easier to navigate, as well as additional screen reader support.

Analytics

This is a really big, BIG one: the ability to have aggregated tables in your data model. You can create a highly aggregated table of your data which might answer 60–70% of your queries. When those queries run, they use the data in the aggregated table, which results in exceptionally fast query response times. If a query does not hit the aggregated table, it goes down to the source table, which could be imported or in DirectQuery mode. The result is a data model that is very small but very fast.

There is a lot more to this, so I suggest reading the documentation linked in the blog post. This is something I have been waiting for, and I cannot wait to try it out and get it working.

Finally, there is now support for RLS with Q&A. Previously, if a dataset had Row-Level Security (RLS), you could not use Q&A. This has now been fixed, and Q&A is available, which is awesome.

Data Connectivity

As shown above, there is a new connector that lets you connect to PDF files, and it will attempt to extract table data from them.

There is now support for SAP BW connector measure properties.

There is a new connector called dataflows, which will soon be in limited preview and will allow users to connect to an existing dataflow.

Data Preparation

As per my first image, and above, there is now Intellisense in the Advanced Editor in the Power Query Editor. This is so awesome. There have been so many times in the past when I have not known which M function to use, or have had syntax errors, and a lot of that pain is now gone with Intellisense.

Add Column from Examples now supports text padding: as shown above, you can pad text with zeros where you want all the values to have the same length.

It is interesting to see that right at the end of the blog post they highlighted that they are working on being able to copy visuals between PBIX files. How cool is that??

You can find all the blog post details here: Power BI Desktop September 2018 Feature Summary

Power BI – Dashboard Comments

This was announced at the Microsoft Business Applications Summit, and it is great to see that you can now add comments to Power BI dashboards, as well as to individual tiles on a dashboard.

It is also great that you can tag people in the comments, which will then send an email or a push notification to the Power BI Android and iOS mobile apps.

At the end of the blog post, they also indicated that this will be coming to reports as well, so stay tuned.

You can find all the details here: Announcing Dashboard Comments in Power BI: Your new hub to discuss data and collaborate with others

Power BI – Community Contribution to the PowerShell Cmdlets

This is the first contribution from the Power BI community to the existing Power BI PowerShell management cmdlets: Kenichiro Nakamura has added functionality to work with datasets, tables, and columns.

You can find a lot more detail in the blog post: Celebrating our first community contribution to the Power BI management cmdlets

Power BI – PowerShell Cmdlets available in Azure Cloud Shell

What this means is that you can now use the Azure Cloud Shell to run the Power BI PowerShell cmdlets. There is no need to install and configure the required PowerShell modules, because this is all maintained in the Azure Cloud Shell.

I would suggest looking into the storage options; depending on how much storage you need, this could be a very good option if you have an existing Azure subscription.

You can find more details on how the Azure Cloud Shell works, as well as the blog post details here: Power BI Management cmdlets in Azure Cloud Shell

What runs under the cover when I open Power BI Desktop

I am always interested in what happens under the covers when I open Power BI Desktop. So I did a little digging, inspired by Marco Russo's webinar, which he also presented at the Queensland Power BI User Group, on "My Power BI report is slow: what should I do?"

What happens when I click on Power BI Desktop.exe?

This is the list of associated processes that are opened when Power BI Desktop is started, and I explain what each of them is below.

This is what it looks like for me running Windows 10, looking in Task Manager under Processes.

  • CefSharp.BrowserSubprocess
    • Power BI Desktop renders its visuals in an embedded browser (CefSharp is a wrapper around the Chromium browser engine), because the Power BI Service is essentially a website where all the visuals are rendered in a browser.
    • My understanding is that within Power BI Desktop it is simulating how the report will run in the Power BI Service.
    • As per Marco's webinar, if this process consumes a lot of memory or CPU, it is potentially why your Power BI report is slow.
  • Console Windows Host
    • UPDATE (06 Sep 2018) – I got a reply from Amanda Cofsky from the Microsoft Power BI team, who said that the Console Windows Host is the "Analysis Services Engine Console Output", which is generally used by the Microsoft engineers for debugging purposes.
  • Microsoft Mashup Evaluation Container
    • This is the Power Query engine.
    • It is responsible for processing all the steps in the Power Query Editor, which gets data from my sources, transforms it, and then loads it into my data model.
    • When I look at a server where I have the On-Premise Data Gateway installed, I see a lot of instances of the Microsoft Mashup Evaluation Container running. This is because this is where my data gets loaded and transformed into tables before being sent to the Power BI Service.
  • PBIDesktop
    • This is the executable which is the starting point and container for all the processes that are run within Power BI Desktop.
  • Microsoft SQL Server Analysis Services
    • This is where all the magic happens: it is an analytical data engine which leverages in-memory technology to achieve incredible compression using the xVelocity engine, and blazing-fast query response times by loading all the data into memory.
    • This is where all the data gets loaded from Power Query into the data model.
    • This process can have the highest memory usage.
    • If I have an expensive DAX measure which must get most of its data from the storage engine, I will see an increase in memory and CPU utilisation while the measure is evaluated. Once again, as per Marco's webinar, this is a great indicator as to why my Power BI report is slow.

I hope this has given you some insight into what runs under the covers in Power BI Desktop, and shown that there are quite a few moving parts working together to make the report creation and development experience seamless and fast.

As always if there are any questions or you have more details and insights into the details above, please let me know and I will happily update the details in this blog post.

Thanks for reading!

Review of new Features coming to Power BI by Oct 2018 – From Business Applications Summit

I was fortunate to attend my first ever Business Applications Summit in Seattle.

I had the pleasure of sharing an Airbnb with Matt Allington, Phil Seamark and Miguel Escobar, it was great to spend time with these Power BI Legends. I also did meet a lot of people and got to chat with people in the Microsoft Power BI team, which was something I will always remember.

I was also fortunate to present two sessions at the Business Applications Summit (unfortunately Reza and Leila could not make it); both sessions went well, and the feedback was positive.

Ok, so enough about me, let me get onto ALL the new features that are planned to come to Power BI through October 2018.

One thing I can say is that I am SUPER excited about the proposed features; they will most definitely make Power BI the go-to BI tool going forward.

Along with this, it is also growing up, and by that I mean more enterprise features are coming, which means it will soon be able to be implemented in anything from a smaller organisation to a Fortune 500 company.

Personally, I cannot wait to learn all the new features and start to implement them at customers. One caveat is I am certain that some features might take a bit longer to get into the service or could possibly change.

NOTE: This might be a bit of a longer post, so buckle up, here is the link to where I got a lot of the information: Overview of Business Intelligence October ’18 release

Other Features not mentioned in any of the notes

Below are some of the other features that I did not find in the release notes, but for which there were demos or pictures.

Print to PDF

As you can see above, coming to Power BI will be the ability to print to PDF which will look exactly like your Power BI report.

Display Folders – Multiple selection settings

As you can see above, there is the ability to set multiple measures or columns into a display folder.

Not only that, but you will also be able to apply the settings across multiple columns.

And something not shown in this picture is the capability to have multiple views of the data model when you have hundreds of tables, to make the data modelling experience easier.

Python Support

As you can see above there is Python Support coming to Power BI!

Personal Bookmarks

As shown above, there will be personal bookmarks coming to Power BI.

Power BI Desktop

Below are all the upcoming Power BI Desktop features.

Ad-Hoc Data Exploration

What this means is that a user who does not have edit access to a report will be able to view a chart on a different axis or change the chart type to something more meaningful to them.

It will be used via an option where you choose a visual and select "Explore from here".

Aggregations

This is a big change for Power BI for when the underlying data is a really large dataset stored in Spark or a SQL Server database. When connecting with DirectQuery, you will be able to define aggregations which cache just the aggregated data in memory in your model.

This will allow a dual mode: if a query can be answered from the aggregated cache, the cache is used; if not, the query goes to the underlying DirectQuery source.

Composite Models (Available Now)

Composite models allow you to combine DirectQuery and imported data sources in one Power BI Desktop file.

This is an amazing feature and I know something that a lot of people have been asking for.

Along with this also comes support for many-to-many relationships.

As always, it is suggested to ensure that your DirectQuery source is tuned and has the capacity to answer your users' queries, so that they get a super-fast reporting experience.

Currently, composite models do not support LiveConnection sources, that is, SQL Server Analysis Services (SSAS) Multidimensional or Tabular.

Copy data from table and matrix visuals

Coming to Power BI Desktop and the Power BI Service, once implemented, will be the ability to copy data from a matrix or table into another application.

Custom Fonts

You will soon be able to use any fonts you want in your Power BI reports. The only requirement is that the same font needs to be installed on each viewer's computer in order for them to see it. If they do not have it installed, it will fall back to the default font.

Expand / Collapse in Matrix Visual

As shown below, you can see the upcoming option to expand or collapse rows in a matrix visual. There was also an indication that they want to bring more pivot table features from Excel to Power BI.

Expression-based formatting

With expression-based formatting, you will be able to use DAX to format almost anything in your Power BI report. Potential uses include the following, and possibly more that I cannot think of:

  • Titles of Visuals
  • Line widths of visuals
  • KPI-based colours

From what I saw, there will be an fx button next to almost everything in the visual properties and elsewhere.

Q&A Telemetry Pipeline

This will give you access to Q&A telemetry to see what users are asking Q&A, which will allow you to further customise your Q&A linguistic settings. The data will first be scrubbed of PII.

Dashboard and Report wallpapers

Coming to both Power BI dashboards and reports will be the ability to use wallpapers to cover the grey area behind your reports.

Show measures as clickable URLS

As you can see above, you will be able to create a measure that acts as a clickable URL, so it can dynamically link from a Power BI report to any other application you can access via a URL.

Theming over report level and visual container styles

There will be a theming update coming to both report level and visual containers in Power BI. From what I understood, it will be similar to the theming you can currently do in PowerPoint.

New Power Query Capabilities

There is a whole host of Power Query updates, as detailed below.

Intellisense support for the M formula language

Intellisense will be coming to the Advanced Editor in Power Query. This is something that I know a lot of people have been asking for. Not only that, but Power Query will also be coming to Microsoft Flow.

Smart Data Prep

Smart data preparation is coming to Power Query, with the following initial features.

  • Data extraction from semi-structured sources like PDF files.
    • This is something a lot of people have been asking for and I have seen it in action and it is awesome, where it can take data out of tables in a PDF and extract it into Power Query.
  • HTML pages.
    • A smarter experience to understand what details you want from an HTML Page.
  • Fuzzy-matching algorithms to rationalize and normalize data based on similarity patterns.
    • This is where it will try to match data based on your columns, to guess what a value should be when, say, the data is misspelled.
  • Data profiling capabilities.
    • As you can see from the above image, there will be data profiling which will enable you to have a look and see if the data is as expected.
    • An example is if you know that your Customer Number should only be 5 characters long, with the data profiling you will be able to see if it is meeting your criteria.

Power Query Community Website

As with Power BI, there will be a Power Query Website launching later this year.

Certified Custom Connectors

There will be certified custom connectors which can be plugged into Power Query. As part of this process the custom connectors will be certified. Currently there are no additional details on how this process will work, but no doubt it will be explained as time goes on.

Power BI Service

Below are all the updates to the Power BI Service.

Power BI Home

Power BI Home is a new place for users to start their Power BI journey, with the following sections.

  • The top section will contain the user's most important dashboards and reports.
  • The second section will contain their favourite items, as well as most frequently accessed reports and dashboards.
  • Whilst the bottom section will include learning resources.
  • And on the top right-hand side will be a global search where you will be able to search for any item that you have access to in the Power BI Service.

Paginated / SQL Server Reporting Services Reports

As you can see above, SSRS (paginated) reports will be coming to the Power BI Service. Not only that, but it will also print them out pixel-perfect.

Workspaces with Azure AD groups

As you can see above, app workspaces will use Azure AD groups and will no longer be dependent on Office 365 groups. You will still be able to add permissions from Office 365 groups, but there will no longer be a dependency on Office 365.

Data Flows – (Formerly called CDS-A, then Data Pools, now Data Flows)

It started out being called CDS-A, then Data Pools, and now the final name according to the great people in the Microsoft Power BI Team is Data Flows. As previously described this is where you will be able to use Power Query Online to ingest data from anywhere and store them in Entities.

All the data will be stored in an Azure Data Lake Gen2 (Pro licenses get 10 GB per user, and Premium gets 100 TB per P1 node), which gives you the additional capability of letting the data scientists in your organisation access the data directly from the Azure Data Lake. You will also be able to bring your own storage within your existing Azure investments.

I think that Data Flows will enable organisations to have a single source of truth for their data assets, which can then be leveraged by the entire organisation.

What I do know from attending Miguel Llopis' session is that the same runtime that runs in Power BI Desktop will be running Power Query Online in the Data Flows. That means you can use Power Query within Power BI Desktop to get the data into the shape that you want, then go into the Advanced Editor and copy and paste the query into Power Query Online.

Below is a rough overview of what it looks like

Data Flows Refresh

Not only will you be able to bring data in with Data Flows, you will also be able to refresh data with incremental refresh, which will be a Power BI Premium feature.

Admin APIs

Admin APIs are coming to the Power BI Service, which will allow a tenant admin to discover all the artefacts in their Power BI tenant.

This is great for large implementations, because up until now you had to have access to an app workspace in order to view its contents. Now, as an admin, you will see everything, which is standard admin capability.

Additional report URL parameter support

Additional report URL parameters will include filters for date columns, new operators (<, >, <=, >=), and multiple field values.
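For context, here is a hypothetical example of the existing URL filter syntax that these new operators would extend (the report ID, table, and field names below are made up):

```
https://app.powerbi.com/groups/me/reports/{reportId}/ReportSection?filter=Sales/Region eq 'West'
```

With the new operators, a date filter such as filter=Sales/OrderDate ge 2018-01-01 would also become possible.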

Commenting in Dashboards and Reports

As shown above there will be the ability to comment on Dashboards and reports.

Not only that, but you can @ mention people, which will then send them a notification.

Along with this, you can also add comments to a specific visual, which gives great context to the comments.

Dataset metadata translations

If you have defined translations in the dataset or Analysis Services model, users will see the translated metadata based on their locale.

Filters for Report Sharing

You will be able to share reports with the current filters and slicers in place, so users see the filtered state when they view the report.

Historical Capacity Metrics

If you have Power BI Premium capacity, there will be a historical view which will allow you to see what affects the performance of refreshes and queries, and which datasets consume the most memory, so you can make changes or plan accordingly.

Multi-Geo for Compliance

This is for when data must reside in a certain country while a company operates around the globe. It will ensure that the data can be located in any of the Azure data centres, even though the initial Power BI tenant might be located elsewhere.

Not only that but in a future release this can also be used for performance by having the Power BI Premium located closer to the users.

Azure Analysis Services / Analysis Services Tabular

Below are some of the new features coming to SSAS in Azure or On-Premise in Tabular

XMLA endpoint for third-party client connectivity and lifecycle management

By opening the XMLA endpoint for Power BI, any current tool that can connect to SSAS via XMLA will also work with Power BI. This means some of the following tools should work immediately once the XMLA endpoint is opened:

  • SQL Server Management Studio
  • BISM
  • Excel

Not only that, but you will also be able to use the Tabular Object Model (TOM) and Tabular Model Scripting Language (TMSL) to manage and modify configuration settings or items within your Power BI model.

Application Lifecycle Management

As you can see above, there will be the capability to have full Application Lifecycle Management (ALM) in Power BI Desktop and Azure Analysis Services. This is a great step forward because it gives you the following capabilities:

  • Source Control
  • Deployment of specific items
  • Deployments to Dev, Test and Prod
  • Potential to share parts of the data model.

Analysis Services vNext

As shown above this was Christian Wade from the Analysis Services team showing the following potential features coming to Power BI and Analysis Services.

  • Calculation Groups
  • Many-Many relationships
  • Resource Governance
    • IsAvailableInMDX property
    • Query memory limits
    • Rowset serialization limits
  • Data Connectivity
    • New Power Query Enterprise Data sources for Spark, Amazon RedShift, IBM DB2, Google BigQuery and Azure KustoDB

On-Premise Data Gateway Updates


There have been a lot of advancements in the On-Premise Data Gateway, with the latest release including support for custom data connectors in the enterprise version of the gateway.

Below is a list of features coming to the On-Premise Data Gateway

  • Gateway multi-geo support for Power BI Premium
    • With the release of multi Geo Support for Power BI, this will also be enabled in the On-Premise Data Gateway
  • Guarantee high availability of gateways via clustering
    • Better support and visibility for Gateways in a Cluster
  • Improved support for Single Sign On and SAML
  • Improved data source settings experience
    • Here there will be the ability to skip the testing of the connection.
    • Rename data sources.
    • Create multiple data sources with different credentials
  • Tenant level administration of on-premises data gateway
    • This will allow tenant administrators to manage all on-premise data gateways via the API or GUI.
  • Basic traffic load balancing in the on-premises data gateway
    • This will start off with a basic setting to split the traffic requests between Gateways.

Using the Power Function.InvokeAfter to determine how long to wait between API calls

I have been working on a dataset which I will hopefully reveal soon, but part of that was that I was getting rate limited when making an API call.

I found Chris Webb's insightful blog post (Using Function.InvokeAfter() In Power Query) where he details how to use Function.InvokeAfter. The one key piece that I found missing was how to use it with an existing function that I had created.

I then got another fantastic question from the Power BI Community where they were looking to do an IP Address Lookup. And there are a lot of sites who offer this, but they do limit the rate at which you can query the API (Which I think is perfectly understandable considering they are offering it for free!)

My blog post shows how I ensured that I did not exceed the rate limit for the API using Function.InvokeAfter.

NOTE: I am not going to cover how I converted the IP Address to a location, I have done this previously in (Power BI Query Editor – Getting IP Address Details from IP Address)
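For context, Function.InvokeAfter simply delays the invocation of a function by a given duration. A minimal sketch you can paste into a blank query to see it in action (the function here is a made-up stand-in for a real API call):

```powerquery
let
    // A trivial function standing in for a real API call
    SayHello = () => "Hello",
    // Invoke it, but only after waiting 2 seconds
    Result = Function.InvokeAfter(SayHello, #duration(0, 0, 0, 2))
in
    Result
```

The query returns "Hello" after a 2-second delay, which is exactly the behaviour we want between successive API calls.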

Using the Custom Function

I am starting off where I have already created the Custom Function in Power Query Editor. I also have got a table with IP Addresses.

  • I used a sample file in which I made up the IP addresses, as shown below.
  • I then went to Add Column in the ribbon and clicked on Invoke Custom Function.
  • This brings up the Invoke Custom Function window, and I put in the following information as shown below.
    • As you can see from above, I gave my new column the name Details.
    • I then clicked the drop-down and selected the function I created earlier, called fx_GetIPAddressDetails.
    • And then, finally, the crucial part: I selected my IP Address column.
    • I then clicked OK.
  • When you do this, it returns a table as shown below.
  • Click on the Expand Table button on the top right-hand side, which will then prompt you for which columns you want to select.
  • I then selected Country.
  • And below were my results.

Adding in the Function.InvokeAfter to limit the rate at which I query the API

I am now going to modify the step where I Invoke the Custom Function to limit how long it waits between API calls.

  • I created a new parameter called "Interval (Secs)", as shown below.
  • I then went to my table and clicked on the step called "Invoked Custom Function".
  • Then in the formula bar I had the following:

= Table.AddColumn(#"Changed Type", "fx_GetIPAddressDetails", each fx_GetIPAddressDetails([IP Address]))

  • Next, I made the following changes to the above code using Function.InvokeAfter:

    = Table.AddColumn(#"Changed Type", "fx_GetIPAddressDetails", each Function.InvokeAfter(()=>fx_GetIPAddressDetails([IP Address]), #duration(0,0,0,#"Interval (Secs)")))

  • I added Function.InvokeAfter(()=> before the call to my function fx_GetIPAddressDetails.
  • I then added #duration(0,0,0,#"Interval (Secs)") as the second argument.
    • Within the #duration I used my parameter, #"Interval (Secs)".
    • This gave me the flexibility to change the rate-limit timing without having to go into the code.
  • Now when I refresh the data, it waits two seconds (my parameter value) between each API call.

I hope this has been useful and shows an easy way to ensure that you limit how quickly you call an API.

As always please leave any comments in the area below.

Here is a copy of the PBIX file that I used in the blog post above: FourMoo – Loading IP Addresses with an Interval.pbix

Power Query Pattern – Adding Spaces in Text within your data with Camel Case

In this week's blog post, I share a Power Query pattern that I created to add spaces to CamelCase text within a column.

To get this to work for you, all you need to do is make one change to the code.

Below is what the data looked like

Then I created the following Power Query pattern.

#"TRANSFORM - Camel Case" =
    Table.TransformColumns(
        #"Removed Columns3",
        {{"Operation",
            each Text.Combine(Splitter.SplitTextByPositions(Text.PositionOfAny(_, {"A".."Z"}, Occurrence.All))(_), " "), type text}})


  • To use this pattern, below are the changes you will need to make for it to work in your Power Query Editor.
  • Line 1
    • This is my step name.
  • Line 2
    • This is where I am using Table.TransformColumns.
    • NOTE: Even though this appears to be only for transforming columns, it works on the data within a column.
  • Line 3
    • This refers to the previous step name, which returns the table contents.
  • Line 4
    • This is the column name where I want to add the spaces. As in my image above, the column was named "Operation".
    • NOTE: This is the only part of the pattern that needs to be changed.
  • Line 5
    • This is the transformation itself: Text.PositionOfAny finds the position of every capital letter, Splitter.SplitTextByPositions splits the text at those positions, and Text.Combine joins the pieces back together with a space between them.

To get this to work, I was in my Power Query Editor window.

I then clicked on Advanced Editor in the Home ribbon.

I then added this step into my code, as shown below.

I then clicked on Done.

I went back to my column, and I could now see the data with spaces between the words.
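Putting it all together, here is a minimal self-contained version of the pattern that you can paste into a blank query to try it out (the sample table and values are made up):

```powerquery
let
    // A made-up table with CamelCase values in the Operation column
    Source = #table({"Operation"}, {{"StockTransfer"}, {"GoodsReceiptNote"}}),
    #"TRANSFORM - Camel Case" = Table.TransformColumns(
        Source,
        {{"Operation",
            // Find the position of every capital letter, split the text at
            // those positions, then re-join the pieces with spaces
            each Text.Combine(
                Splitter.SplitTextByPositions(
                    Text.PositionOfAny(_, {"A".."Z"}, Occurrence.All))(_), " "),
            type text}})
in
    #"TRANSFORM - Camel Case"
```

"StockTransfer" becomes "Stock Transfer" and "GoodsReceiptNote" becomes "Goods Receipt Note".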

Conclusion

As you can see, I have demonstrated how Powerful Power Query (see the multiple use of Power!) is at getting data into the shape that you require.

If you have any suggestions or comments please let me know.

Power Query – Renaming Multiple Columns

I was working on a dataset where I wanted to change multiple column names in one step rather than changing them manually. Since there were over 30 columns, this would have been really time-consuming.

Below I detail how to do this in the Power Query Editor, replacing all the column names in one step.

This once again shows how powerful the Power Query Editor can be for ETL and automation tasks!

  • This is what the original column names looked like.
  • To get this working, I had to go into the Advanced Editor and manually add a step.
  • Here is the syntax that I added, with my step name:

    #"Rename Column Names" = Table.TransformColumnNames(#"Changed Type", (columnName as text) as text => Text.Replace(columnName, "Sales - ", ""))

  • As you can see above, my previous step name was #"Changed Type".
  • Then the only other change I had to make was to the Text.Replace arguments: what I was searching for and what I wanted to replace it with.
    • In this example I searched for "Sales - " and replaced it with "" (which is blank).
  • And this is what it looked like after manually adding the above step.
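As a self-contained sketch you can paste into a blank query to try the technique (the sample table and column names are made up):

```powerquery
let
    // A made-up table with prefixed column names
    Source = #table(
        {"Sales - Amount", "Sales - Quantity"},
        {{100, 2}, {250, 5}}),
    // Strip the "Sales - " prefix from every column name in one step
    #"Rename Column Names" = Table.TransformColumnNames(
        Source,
        (columnName as text) as text => Text.Replace(columnName, "Sales - ", ""))
in
    #"Rename Column Names"
```

The resulting columns are simply named Amount and Quantity.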

As I have shown above, this is a quick and easy way to rename multiple columns at once.

You can find the sample file here: Renaming Multiple Columns.pbix

As with every blog post if there are any comments or suggestions please leave them in the section below.

BI-RoundUp – Power BI (On-Premise Gateway Update – Developer Community Update)

Here is this week’s BI-RoundUp. I am hopeful that by this time next week the latest version of Power BI Desktop will be released!

Power BI – On-Premise Gateway Update

By far the biggest news is the capability to use custom data connectors in the personal version of the On-Premise Data Gateway. The Power BI team did indicate in the blog post that support in the enterprise version will be coming in a few months. It is fantastic to see this being made available.

There is now single sign-on support using Kerberos for SAP Business Warehouse Server.

And finally, the Mashup Engine has been updated to match the April version of Power BI Desktop.

Here is a link to the blog post: On-premises data gateway April update is now available

Power BI – Developer Community Update

For the Developer community update, there is now the capability to leverage Custom Report Tooltips and the Q&A Explorer.

There are also new Azure resource metrics, which give you a better understanding of how your Power BI Embedded application is performing:

  • It will show the memory usage being consumed
  • It will show when there is Memory Thrashing, which occurs when a report needs another dataset to be evicted from memory so that your dataset can be loaded. This only applies to imported datasets.
  • It will show QPU High Utilization, which reports, per minute, when the Query Processing Units on your resource are at about 80% or higher

There is now the capability to set up alerts on your Azure resource.

Finally, there is a new learning channel to create custom visuals with a hands-on lab.

All the details of the blog post are here: Power BI Developer community April update

BI RoundUp – Power BI (March Desktop Update, Share Content with anyone via email, Persistent Filters, Mixed Reality App, Service & Mobile Summary for Feb, Data & BI Summit)

After getting back from the MVP Summit in Bellevue, which was an incredible experience, I can see the Power BI team has been extremely busy getting some great new features out the door.

Which means there is quite a lot to cover in this week’s BI-RoundUp.

Power BI – March Desktop Update

Once again the Power BI team has released some amazing new features, which I think continue to make Power BI such a compelling product, not only to use but also to show to potential customers.

As I do with every month’s update, I will highlight the features which I feel are worth mentioning.

Reporting

Without a doubt the Report Tooltips feature is a game changer. Not only does it give an almost drill-through type of experience, but it is context aware: when you enable the tooltip, it takes into account all the filters being applied and reflects them in the tooltip. This allows users to gain more insights without having to navigate to another page.

Bookmarking is now generally available, which means that it is fully supported and you no longer need to enable it under Preview Features in Power BI Desktop.

Additional formatting improvements this month are for the table and matrix, where you can now modify the display units to show values in a more meaningful way, and set the precision in terms of decimal places. One thing to note is that these settings will override the ones defined in your model.

You can now turn off the visual headers for the entire report, which is particularly useful when you want to embed the reports. This removes the context menus, focus mode, drill down, and the option to pin visuals, which can effectively make the pages a read-only style report.

When you add new visuals to your Power BI report, it will now look at the existing visuals and place the new one where it thinks it should logically go. This can save you time when there are already a few visuals on the page.

Custom Visuals

There are a whole host of new custom visuals, which are detailed below:

  • Mapbox visuals, which have a lot of amazing options for viewing maps
  • User List by CloudScope
  • Timeline by CloudScope – similar to a Twitter-like feel
  • KPI Chart by Akvelon
  • R DataTable
  • Outliers Detection
  • Data Insights by MAQ Software
  • Dumbbell Chart by MAQ Software
  • Clustering using OPTICS by MAQ Software

Data Connectivity

There have been improvements to the SAP HANA connector.

Both the SAP BW Direct Query and Azure Analysis Services are now generally available.

Other

The error reporting has now been integrated with Windows Error Reporting, which means that error reports now contain more information. You can also view all your previous errors, and if you do not have time to send an error report right away, you can send it later.

You can find all the detailed information in the blog post: Power BI Desktop March Feature Summary

Power BI – Share Content with anyone via email

You can now share your dashboards, reports or apps with any email address, including Outlook.com (Live.com, Hotmail.com) and Gmail. This is really great when you have users or organizations that use publicly hosted email addresses.

One word of caution: ensure that only the people who are required to share content externally have this capability enabled, and that it is disabled for everyone else.

You can find all the details on how to share here: Share your Power BI content with anyone by email

Power BI – Persistent Filters

One of the time-consuming tasks in the past was that when you went back to a report with multiple slicers and filters, you had to reset them each and every time.

This is now a thing of the past with persistent filters.

Persistent filters will now remember your filters, slicers, drill downs and sort order, so when you go back to a report they are restored automatically.

This is a great feature and will make the reporting and analytics experience that much more enjoyable and quick to gain the insights you require.

All the details are here: Announcing Persistent Filters in the Power BI Service

Power BI – Mixed Reality App

With the use of the HoloLens and the Mixed Reality app, you can now pin Power BI reports to physical locations or have them tagged within the app.

What this means is that a user on the factory floor can have a Power BI report pinned to a machine, so they can gain insights as they move around without having to look at their phone or open a laptop.

I think that this opens a whole new avenue, and the possibilities are almost endless.

The blog post details are here: Power BI for Mixed Reality app now available in Preview

Power BI – Service & Mobile Summary for Feb

Whilst some of the new features have been covered before here is a quick overview.

There is now the facility to automatically install apps.

A new feature that I was not aware of is the additional Power BI Premium capacities, P4 and P5, currently only available in West US and Southeast Australia.

Organizational Custom Visuals can now be controlled by the Power BI Admin

The February update to the On-Premises Data Gateway includes the latest Mashup Engine and can now refresh data sources from both on-premises and the cloud.

And finally, there are updates to the mobile app: you can now share from the mobile app, drag your finger across the screen to see tooltips, and hyperlinks have been improved.

All the details are here: Power BI Service and Mobile February Feature Summary

Data & BI Summit in Ireland

The Data & BI Summit in Ireland will be happening from 24 – 26 April 2018.

They have recently announced the keynote speaker John Doyle, Senior Director of Product Marketing on the Cloud & Enterprise team at Microsoft.

This looks to be a great conference and would be well worth attending if you are able to.

All the Summit details are in this blog post: Data & BI Summit announces keynote: The Next Era of Analytics with Microsoft Power BI

Power BI – Finding the data type when using Variables in Measures

I have been working with an Organization where I have had to create some rather complex measures. And within these measures I am using a fair number of variables.

I found that at times it was a challenge to understand what data type was being returned by a variable, which would sometimes lead to a different result because the data type returned was not what was expected.

Example

For example, in my measure I wanted to get a date range, if the variable was not a date or date/time data type then my measure would fail.

So below I will show you how to easily find out what the Variable data type is.

Last Year Sales = 
VAR Date_MaxDate =
    MAX ( 'Orders'[Order Date] )
VAR Number_TotalDiscount =
    SUM ( 'Orders'[Discount] )
VAR Text_CustomerName =
    LASTNONBLANK ( 'Orders'[Customer Name], Number_TotalDiscount )
RETURN
    Date_MaxDate

 

  • Above is my measure and as it currently stands it will return the value for the variable Date_MaxDate which is created on lines 2 & 3.
  • Now I selected the measure name [Last Year Sales] under fields in Power BI Desktop.
  • I then clicked on Modeling in the ribbon and under Formatting it shows me that the data type is Date Time, as shown below.
  • The data type for my Variable called Date_MaxDate is Date Time.
  • If I now changed the result to be Number_TotalDiscount on line 9, it then shows me that the data type could be any numerical format
  • And finally, if I change the result to be Text_CustomerName on line 9, it then shows me that the data type is Text
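To make the bullet points above concrete, here are the two alternative RETURN lines described; everything else in the measure stays exactly the same, and only the variable after RETURN changes:

```
-- Returning the numeric variable instead (swap the RETURN value on line 9):
RETURN
    Number_TotalDiscount  -- Modeling ribbon now shows a numeric format

-- Returning the text variable instead:
RETURN
    Text_CustomerName     -- Modeling ribbon now shows the Text data type
```

Because the formatting options in the Modeling ribbon reflect the data type of whatever the measure returns, swapping the variable after RETURN is a quick way to inspect each variable in turn.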

Conclusion

As I have shown above, this is a quick and easy way to see what data type is being returned by a variable.

If you have any comments or questions please leave them in the section below.