DAX – Getting the Start Date of the following Week

I was recently doing some work for a customer where they wanted to compare their date to the start date of the following week. They also had a requirement to persist the data to the dataset.

I thought that this would be a good challenge to use DAX with a calculated column. I most certainly could do this using the Power Query Editor and M, but I always like to see if it is possible with a smaller and easier DAX calculation.

Below is the DAX to get the Start date for the following week

'Date'[Date] - WEEKDAY('Date'[Date],2)+8

As you can see below, as each week goes on, I have got the start date for the following week for each day of the current week.

If instead I wanted the start date of the current week, applied to all the dates in that week, I could use the following DAX calculation

'Date'[Date] - WEEKDAY('Date'[Date],2)+1
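
If you want these as complete calculated columns on the Date table, they could look something like the two lines below (the column names are just examples):

Start of Next Week = 'Date'[Date] - WEEKDAY ( 'Date'[Date], 2 ) + 8

Start of Current Week = 'Date'[Date] - WEEKDAY ( 'Date'[Date], 2 ) + 1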

As always if there are any comments or suggestions please leave them in the comments section below.

BI-RoundUp – Power BI (Desktop Release for August – New App Workspace Experience – Power BI World Tour) – Analysis Services (White Paper for AS Scalability)

Here is my weekly BI-RoundUp, and as I suspected when I started putting this together yesterday, there is a release of Power BI Desktop, and not only that, an updated App Workspace experience!

Enjoy the read!

Power BI – Desktop Release for August


Well, that was pretty quick to get another release of Power BI Desktop. I am sure the releases will go back to the normal cadence of the first week of the month. With that being said, let's get into this month's updates.

Reporting

You can now export your report pages to PDF, currently only from Power BI Desktop. The Export to PDF option will only export the visible pages, and wallpapers or background images are not yet supported. This is great functionality for people who want to take the contents of a report and email it, or possibly print it out for a meeting or for customers.

Bookmark groups are now available, which allow you to group specific bookmarks together into a logical area. They are easy to create and let you group bookmarks based on your requirements, such as organisational bookmarks or bookmarks relating to a certain subject. When you start from a bookmark group, only the bookmarks in that group are played; when you start from the default view, all the bookmarks are played through.

Theming is now generally available, so you no longer have to enable it in the Preview Settings.

For the list slicer, there is now a filter restatement option, which is similar to the restatement in the drop-down slicer.

Analytics

You can now use conditional formatting by values. What this means is that if a column in your data contains the name of a colour or a HEX code, the background colour or font colour can be formatted to use the values in that column. This means that you can use any colour you need based on your data, and you do not have to manually add each colour. The colours can be part of your dataset or come from a DAX measure.
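
As a rough illustration, a measure along the lines of the sketch below could return a HEX code that is then used for the background colour of another field (the 'Sales' table, [Amount] column and threshold are purely hypothetical):

Status Colour =
IF ( SUM ( 'Sales'[Amount] ) >= 100000, "#107C10", "#A80000" )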

Python scripting is now available in Power BI. You can run your Python script in the Power Query Editor and then use Python visuals in your reports. Whilst the visuals will render and change based on other visuals in your report, the Python visual itself is currently NOT interactive.

Q&A is now generally available, which means that you do not have to go and enable it in the Preview settings. There has also been an update to the matching experience in Power BI Desktop and Power BI Premium, which will allow for better results when using Q&A.

Modeling

You can now add data categories for measures in the form of Image or Web URLs. This will allow you to create a dynamic URL that can then be used to go through to another Power BI report with the filter applied within the URL.

The really exciting part of using a measure with a Web URL is that you will be able to render SVG images, as shown above, based on your data and show them as sparklines or bar charts.
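
As a rough sketch of the dynamic URL idea, a measure categorised as a Web URL could be built up along these lines (the report address is a placeholder, the 'Sales'[Region] column is hypothetical, and filter values containing spaces would still need URL encoding):

Report Link =
"https://app.powerbi.com/groups/me/reports/<ReportID>/<PageName>"
    & "?filter=Sales/Region eq '"
    & SELECTEDVALUE ( 'Sales'[Region], "Unknown" )
    & "'"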

As shown above there are a whole host of new DAX Statistical functions.

There are a whole lot of new and interesting Custom Visuals which are part of the blog post.

The Spark connector is now Generally available.

A sneak peek of what is coming next month is the expand and collapse buttons for tables and matrices, and updates to the scatter chart visual, which will have the ability to use a categorical axis.

All the details can be found here: Power BI Desktop August 2018 Feature Summary

Power BI – New App Workspace Experience

This blog post from the Power BI Team has got a lot of content, so I am going to highlight some of the new features in the new App Workspace Experience.

  • No longer will it rely on Office 365 Groups for the creation of an App Workspace.
  • You will be able to manage access from security groups, distribution lists and multiple existing Office 365 Groups.
  • New additional roles to manage access and permissions to the App Workspace.
    • Admin Role, Member Role, Contributor Role and Viewer Role
  • New Admin API which will give all the details of the new App Workspaces.
  • Associate a workspace to a OneDrive for Business

It is interesting to note that the new App Workspaces will not be bringing across Content Packs. With that being said, the Power BI Team is still working to bring Apps to full parity with Content Packs.

There will also be a migration path to move all existing App Workspaces to the new App Workspaces, which will be available at General Availability.

The detailed blog post can be found here: Enable your team with new workspace experiences (preview)

Power BI – Power BI World Tour

The Power BI World Tour will be coming to a city near you. Please see the link below with more details on which cities it will be happening in, as well as other event details.

I am very fortunate to be attending the Power BI World Tour in Sydney, where I will be presenting 2 sessions: Common Power BI Challenges and How to Overcome Them, and My Power BI Report Has Hit The Size Limit, Now What?

I attended last year, and it was a great event, so if you can make it I would highly recommend it.

Here is the link to more details: 2018 Power BI World Tour – Powered by Your Local User Group

Analysis Services – White Paper for AS Scalability

To be honest, I have already read this white paper, which is only 11 pages long, but it contains a lot of really useful and insightful information.

I always enjoy reading these white papers and I always learn something new. This time it was to sort the dimension tables by Surrogate Key (SK) to ensure that the engine will use Value Encoding.

I suggest reading through this because the same engine is used in Power BI Desktop, so you can apply the same techniques to your Power BI Model.

The White Paper Link is here: Whitepaper on modeling for AS tabular scalability

Convert from Seconds to Minutes with a DAX Measure

I was recently doing some consulting work for a customer, and they had a column which contained seconds. Whilst this is useful, it is better to show it as minutes and seconds, which is easier to relate to and compare.

Below I will show you how I came up with the DAX measure which converts it from seconds to minutes, and I will be using my Parkrun sample dataset. As shown below I have a table which shows the total seconds

Below is my DAX measure with an explanation on how I got to the measure

Secs to Mins = 
VAR TimeSecs =
    SUM ( 'ParkRun'[Total Seconds] )
VAR TimeMins =
    DIVIDE ( TimeSecs, 60 )
VAR JustSeconds =
    TimeMins - TRUNC ( TimeMins )
VAR JustMinutes =
    TimeMins - JustSeconds    
VAR JustSecondsInTime = JustSeconds * 60
VAR MySeconds2 =
    IF ( JustSecondsInTime < 10, FORMAT ( "0" & JustSecondsInTime, "00" ), JustSecondsInTime )
RETURN VALUE ( JustMinutes  & "." & LEFT ( MySeconds2, 2 ) )
  • Line 1
    • This is where I created the measure with a name
  • Lines 2 – 3
    • This is where I am summing up all the Total Seconds
  • Lines 4 – 5
    • I am creating a variable called TimeMins and then dividing the seconds by 60 to get the minutes.
  • Line 6 – 7
    • I am creating a variable called JustSeconds, and I am using the TRUNC DAX function so that this returns the fractional remainder of the variable TimeMins, which is currently stored as a decimal.
    • NOTE: The above means that the remainder is stored as a decimal fraction, so if the remainder is 0.50 that translates to 30 seconds, as shown below.
  • Line 8 – 9
    • I am creating a variable called JustMinutes, where all that I want is the minutes, so I do this by subtracting JustSeconds from TimeMins.
  • Line 10
    • I am creating a variable called JustSecondsInTime, and what I am doing is converting the seconds from a decimal value back to time.
    • What this means is that if the decimal value was 0.50, the variable JustSecondsInTime will be converted to 0.30 which represents 30 seconds.
  • Line 11 – 12
    • I am creating a variable called MySeconds2, and what I am doing here is ensuring that if the seconds value is less than 10, a zero is added in front of it, so that it will be displayed as expected.
    • As shown below you can see where the seconds are less than 10 it is displaying correctly.
    • When looking at the image above it does appear that the seconds is not being displayed correctly. This will be resolved in the next step below.
  • Line 13 – 14
    • Now I am closing off the variable section to return the measure.
    • I begin with the DAX function called VALUE, because I want the measure to return a numeric value and not text.
    • I then put in my variable called JustMinutes which is the minutes from the seconds that I defined in lines 8 & 9
    • Next, I put in a dot “.” This is to ensure that the VALUE will still work because it expects a value in the format of 99.22
      • If I changed it from a dot to a colon, the VALUE function would return an error.
    • I get the seconds to display correctly by using the DAX function LEFT ( MySeconds2, 2 )
      • What I am doing here is using the LEFT DAX Function to get the first 2 values starting from the left hand side. Which results in displaying the seconds

Now when I take my measure and put it into a table I get the Minutes and seconds displaying as expected.

I found that I really had to use the DAX variables to get this pattern to work as expected. With all the screenshots above, I changed the return function to the particular DAX variable that I was working on to ensure that it was returning the value that I was expecting.
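
As a side note, a more compact variant of the same conversion could use QUOTIENT and MOD instead. Below is a rough sketch based on the same 'ParkRun'[Total Seconds] column; it is an alternative, not the measure used above.

Secs to Mins (Alt) = 
VAR TotalSecs = SUM ( 'ParkRun'[Total Seconds] )
VAR Mins = QUOTIENT ( TotalSecs, 60 )
VAR Secs = MOD ( TotalSecs, 60 )
RETURN
    VALUE ( Mins & "." & FORMAT ( Secs, "00" ) )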

If you have any questions or comments please leave them in the section below. I do hope that you found this useful.

Review of new Features coming to Power BI by Oct 2018 – From Business Applications Summit

I was fortunate to attend my first ever Business Applications Summit in Seattle.

I had the pleasure of sharing an Airbnb with Matt Allington, Phil Seamark and Miguel Escobar, it was great to spend time with these Power BI Legends. I also did meet a lot of people and got to chat with people in the Microsoft Power BI team, which was something I will always remember.

I also was fortunate to present 2 sessions (Unfortunately Reza and Leila could not make it) at the Business Applications Summit, both sessions went well, and the feedback was positive.

Ok, so enough about me, let me get onto ALL the new features that are planned to be coming to Power BI until October 2018.

One thing I can say is that I am SUPER excited about the new proposed features that are coming; they are most definitely going to make Power BI the go-to BI tool going forward.

Along with this it is also growing up, and by that, I mean more enterprise features are coming, which means that it will soon be suitable for anything from a smaller organisation to a Fortune 500 company.

Personally, I cannot wait to learn all the new features and start to implement them at customers. One caveat is I am certain that some features might take a bit longer to get into the service or could possibly change.

NOTE: This might be a bit of a longer post, so buckle up, here is the link to where I got a lot of the information: Overview of Business Intelligence October ’18 release

Other Features not mentioned in any of the notes

Below are some of the other features that I did not find in the release notes, but for which there were some demos or pictures.

Print to PDF

As you can see above, coming to Power BI will be the ability to print to PDF which will look exactly like your Power BI report.

Display Folders – Multiple selection settings

As you can see above, there is the ability to put multiple measures or columns into a Display Folder.

Not only that, but you will also be able to apply the settings across multiple columns at once.

And something not in this picture is the capability to see multiple data source views if you have hundreds of tables to make the data modelling experience easier.

Python Support

As you can see above there is Python Support coming to Power BI!

Personal Bookmarks

As shown above, there will be personal bookmarks coming to Power BI

Power BI Desktop

Below are all the Power BI Desktop upcoming features

Ad-Hoc Data Exploration

What this means is that a user who does not have edit access to a report will be able to look at a chart on a different axis or change the chart type to something more meaningful to them.

It will be used with an option to choose a report and select “Explore from here”

Aggregations

This is a big change for Power BI for scenarios where the underlying data is a really large dataset stored in Spark or a SQL Server database. When connecting with DirectQuery, you will be able to define aggregations which will cache just the aggregated data in memory in your model.

This will allow a dual mode, so that if a query can be answered by the aggregated cache, the cache will be used, and if not, the underlying DirectQuery source will be queried.

Composite Models (Available Now)

Composite models allow you to have data in one Power BI Desktop file that combines DirectQuery and imported data sources.

This is an amazing feature and I know something that a lot of people have been asking for.

With this, relationships can now also be set to many-to-many cardinality.

As with the release details, it is always suggested to ensure that your DirectQuery source has been tuned and has the capacity to answer the queries from your users, so that they get a super-fast reporting experience.

Currently, composite models do not support LiveConnection sources, which means SQL Server Analysis Services (SSAS) Multidimensional or Tabular.

Copy data from table and matrix visuals

Coming to Power BI Desktop and the Power BI Service, once implemented, will be the ability to copy data from a matrix or table into another application.

Custom Fonts

You will soon be able to use any font that you want in your Power BI reports. All that is required is that the same font is installed on the viewer's computer in order for them to see it. If they do not have it installed, it will fall back to the default font.

Expand / Collapse in Matrix Visual

As shown below you can see the upcoming option to expand or collapse rows in a matrix visual. There also was the indication that they want to bring more pivot table features from Excel to Power BI.

Expression-based formatting

Expression-based formatting is where, by using DAX, you will be able to format almost anything in your Power BI report. The potential is to use expression-based formatting for the following, and possibly more that I cannot think of:

  • Titles of Visuals
  • Line widths of visuals
  • KPIs based colours

From what I did see, there will be an fx button next to almost everything in the visual properties and elsewhere.
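
As a simple hypothetical sketch, a measure like the one below could drive a visual title through such an fx button (the 'Date'[Year] column is an assumption):

Dynamic Title =
"Sales for " & SELECTEDVALUE ( 'Date'[Year], "all years" )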

Q&A Telemetry Pipeline

This will allow access to the Q&A telemetry to see what the users are using Q&A for, which will allow you to further customize your Q&A Linguistic settings. The data will first be scrubbed for PII data.

Dashboard and Report wallpapers

Coming to both Power BI dashboards and reports will be the ability to use wallpapers to cover the grey area behind your reports.

Show measures as clickable URLS

As you can see above, you will be able to create a link with a measure, so that it can dynamically link to another Power BI report or to any other application which you can access via a URL.

Theming over report level and visual container styles

There will be a theming update coming to both report-level and visual container styles in Power BI. From what I understood, it would be similar to the theming that you can currently do in PowerPoint.

New Power Query Capabilities

There are a whole host of Power Query Updates as detailed below.

Intellisense support for the M formula language

Intellisense will be coming to the Advanced Editor in Power Query. This is something that I know a lot of people have been asking for. Not only that, but Power Query will also be coming to Microsoft Flow.

Smart Data Prep

There is smart data preparation coming to Power Query, with the following initial features.

  • Data extraction from semi-structured sources like PDF files.
    • This is something a lot of people have been asking for. I have seen it in action and it is awesome: it can take data out of tables in a PDF and extract it into Power Query.
  • HTML pages.
    • A smarter experience to understand what details you want from an HTML Page.
  • Fuzzy-matching algorithms to rationalize and normalize data based on similarity patterns.
    • This is where it will try to match data based on your columns, to guess what the value should be when, say, the data is misspelled.
  • Data profiling capabilities.
    • As you can see from the above image, there will be data profiling which will enable you to have a look and see if the data is as expected.
    • An example is if you know that your Customer Number should only be 5 characters long, with the data profiling you will be able to see if it is meeting your criteria.

Power Query Community Website

As with Power BI, there will be a Power Query Website launching later this year.

Certified Custom Connectors

There will be certified custom connectors which can be plugged into Power Query. Currently there are no additional details on how the certification process will be completed, but no doubt it will be explained as time goes on.

Power BI Service

Below are all the updates to the Power BI Service.

Power BI Home

Power BI Home is a new place for users to start their Power BI Journey with the following sections.

  • The top section will contain the user's most important dashboards and reports.
  • The second section will contain their favourite items, as well as most frequently accessed reports and dashboards.
  • Whilst the bottom section will include learning resources.
  • And on the top right-hand side will be a global search where you will be able to search for any item that you have access to in the Power BI Service.

Paginated / SQL Server Reporting Services Reports

As you can see above SSRS or Paginated reports will be coming to the Power BI Service. Not only that but it will also print them out pixel perfect.

Workspaces with Azure AD groups

As you can see above, App Workspaces will be using Azure AD groups and will not be dependent on Office 365 groups. You will still be able to add permissions from Office 365 groups, but workspaces will no longer be dependent on Office 365.

Data Flows – (Formerly called CDS-A, then Data Pools, now Data Flows)

It started out being called CDS-A, then Data Pools, and now the final name according to the great people in the Microsoft Power BI Team is Data Flows. As previously described this is where you will be able to use Power Query Online to ingest data from anywhere and store them in Entities.

All the data will be stored in Azure Data Lake Storage Gen2 (with a Pro license getting 10 GB per user and Premium getting 100 TB per P1 node), which gives you the additional capability of letting the data scientists in your organisation access the data directly from the Azure Data Lake. You will also be able to bring your own storage within your existing Azure investments.

I think that having the Data Flow will enable organisations to be able to have a single source of truth for their data assets, that can then be leveraged by the entire organisation.

What I do know from attending Miguel Llopis's session is that the same runtime that runs in Power BI Desktop will be running in Power Query Online in the Data Flows. That means you can use Power Query within Power BI Desktop to get the data into the shape that you want, then go into the Advanced Editor and copy and paste the M code into Power Query Online.

Below is a rough overview of what it looks like

Data Flows Refresh

Not only will you be able to bring data in with data flows, you will also be able to refresh data with incremental refresh, which will be a Power BI Premium feature.

Admin APIs

Admin APIs have come to the Power BI Service, which will allow a tenant admin to discover all the artefacts in the Power BI tenant.

This is great for large implementations, because up until now you had to have access to an App Workspace in order to view its contents. Now, as an admin, you will see everything, which is in line with standard admin capabilities.

Additional report URL parameter support

Additional report URL parameter support will include filters on date columns, new operators (<, >, <=, >=) and multiple field values.
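
For illustration, the query string portion of a report URL could then look something like the examples below (in the URL these operators are written in OData style, for example ge for >=; the table and field names are hypothetical):

?filter=Sales/Amount ge 500
?filter=Store/Territory in ('NC', 'TN')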

Commenting in Dashboards and Reports

As shown above there will be the ability to comment on Dashboards and reports.

Not only that, but you can mention people using @, which will then send them a notification.

Along with this, you can also add comments on a specific visual, which gives great context to the comments.

Dataset metadata translations

If you have defined translations in the dataset or Analysis Services model, users will see them in their locale.

Filters for Report Sharing

You will be able to share reports with users with the current filters and slicers in place for when they view the report.

Historical Capacity Metrics

If you have got Power BI Premium capacity, there will be a historical view which will allow you to see what affects performance, such as refreshes and queries, and to see which datasets consume the most memory, so you can make changes or plan accordingly.

Multi-Geo for Compliance

This is for companies that operate around the globe, where data must reside in a certain country. It will ensure that the data can be located in any of the Azure data centres, even though the initial Power BI tenant might be located elsewhere.

Not only that but in a future release this can also be used for performance by having the Power BI Premium located closer to the users.

Azure Analysis Services / Analysis Services Tabular

Below are some of the new features coming to SSAS in Azure or On-Premise in Tabular

XMLA endpoint for third-party client connectivity and lifecycle management

By opening the XMLA endpoint for Power BI, any current tool that can connect to SSAS via XMLA will also work with Power BI datasets, which means that some of the following tools would work immediately once the XMLA endpoint is opened.

  • SQL Server Management Studio
  • BISM
  • Excel

Not only that but you will also be able to use the TOM and TMSL in order to manage and modify configuration settings or items within your Power BI Model.

Application Lifecycle Management

As you can see above, there will be the capability to have full Application Lifecycle Management (ALM) with Power BI Desktop and Azure Analysis Services. This is a great step forward because it gives you the following capabilities:

  • Source Control
  • Deployment of specific items
  • Deployments to Dev, Test and Prod
  • Potential to share parts of the data model.

Analysis Services vNext

As shown above this was Christian Wade from the Analysis Services team showing the following potential features coming to Power BI and Analysis Services.

  • Calculation Groups
  • Many-Many relationships
  • Resource Governance
    • IsAvailableInMDX
    • Query Memory Limitations
    • Rowset Serialization Limits
  • Data Connectivity
    • New Power Query Enterprise Data sources for Spark, Amazon RedShift, IBM DB2, Google BigQuery and Azure KustoDB

On-Premise Data Gateway Updates


There have been a lot of advancements in the On-Premise Data Gateway, with the latest release including Support for Custom Data Connectors in the Enterprise version of the On-Premise Data Gateway.

Below is a list of features coming to the On-Premise Data Gateway

  • Gateway multi-geo support for Power BI Premium
    • With the release of multi Geo Support for Power BI, this will also be enabled in the On-Premise Data Gateway
  • Guarantee high availability of gateways via clustering
    • Better support and visibility for Gateways in a Cluster
  • Improved support for Single Sign On and SAML
  • Improved data source settings experience
    • Here there will be the ability to skip the testing of the connection.
    • Rename data sources.
    • Create multiple data sources with different credentials
  • Tenant level administration of on-premises data gateway
    • This will allow tenant administrators to manage all On-Premise Data Gateways via the API or GUI.
  • Basic traffic load balancing in the on-premises data gateway
    • This will start off with a basic setting to split the traffic requests between Gateways.

BI-RoundUp – Power BI (Service & Mobile Update – Create Power BI reports from Excel & CSV Files in SharePoint Online)

Here is the weekly BI-RoundUp, so let’s get into it

Power BI – Service & Mobile Update

As you can see above, there is now an option to share a report with the current filters and slicers that are applied. This is great because it means that you can have a subscription report that is sent out every time with the applied filters and slicers.

The Gateway connections experience has also been updated to be a lot easier, with greater visibility to show you what you need to do to get it working.

The On-Premise Data Gateway has been updated to the June Edition of the Mashup Engine.

And finally, for the mobile report canvas, tiles can now be made larger and the canvas longer.

You can find more details here: Power BI Service and Mobile June Feature Summary

Power BI – Create Power BI Reports from Excel or CSV Files in SharePoint Online

I was not aware that there have been updates in SharePoint Online, where if you have got an Excel file with a formatted table, or a CSV file, there is now an option to Open in Power BI.

This will then open it in Power BI, automatically connected to your SharePoint Online files, which in turn means that when you update the files on SharePoint Online, they will automatically be updated in the Power BI Service without having to use the On-Premise Data Gateway.

You can find the full blog details here: New feature: Create Power BI reports from tables in Excel files and CSV files

Creating a Slicer that also contains a DAX Measure

I got an interesting question from Vijay asking, “How do we do a slicer with data count?”

This is what Vijay was looking to achieve as shown below and I always enjoy a good challenge.

As I always like to do, I will show how I got to the required result.

  • I currently have got data from my website, which I look at daily to see how my blog posts are tracking.
  • In my dataset I like to see where in the world people are coming from to visit my blog or website.
  • Based on the requirement from Vijay I created a slicer which takes the total sessions per country, which I can then use as a Slicer on my data.

The first thing that I did was to create a calculated table which will have the information I require for the Slicer.

In Power BI Desktop I went to the Modeling Ribbon and clicked on New Table

This will be the Country and Sessions. I created it with the following DAX Syntax below.

Slicer Table =
    ADDCOLUMNS (
        SUMMARIZECOLUMNS ( 'All Web Site Data'[Country] ),
        "Sessionszz", [Sessions]
    )
  • Line 1
    • What I have done above is created a table called Slicer Table
  • Line 2-3
    • This is where I added the ADDCOLUMNS Syntax for my table.
    • NOTE: Thanks to Maxim for commenting and letting me know that I did not need the original CALCULATETABLE
  • Line 4
    • I then used the SUMMARIZECOLUMNS DAX function and put in the 'All Web Site Data'[Country] column, as per my requirement to get a count of sessions per country.
  • Line 5
    • This is where I created my Column Name called “Sessionszz” and my measure called [Sessions]
    • NOTE: When I use measures I only use the Square brackets.
  • Line 6-7
    • I then close off my previous DAX functions.

The result is I now have the following calculated table.

Now that I had the data, I created a new calculated column which has the Country with the Total Sessions. I did it with the following DAX syntax below.

Country with Sessions =
'Slicer Table'[Country] & " ("
    & 'Slicer Table'[Sessionszz]
    & ")"

 

And the result was I now had my new column in my table showing both the Country and the Total Sessions

Next, I created a relationship from my Fact table ‘All Web Site Data’ to my new calculated table called ‘Slicer Table’ on the Country Column.

I went to the Modeling tab again, and this time clicked on Manage Relationships

I then clicked on New and selected the following as shown below.

I then click Ok and Ok again to come back to Power BI Desktop

Now I was able to create the Slicer, which shows the Country and the Total Sessions.

I have selected the Country “Afghanistan” which has got a total of 4 sessions, and in my table, I can see the sessions and the date when they occurred.

Each time the dataset is refreshed the calculated table will also be refreshed which will ensure that the slicer values are up to date.

This was a great exercise and I personally think something quite handy to have when slicing data.

As always if you have any questions or comments please leave them in the area below.

Creating a DAX Calculated Column for Today, Yesterday and Next Working Day with Variables

Below was a question from the Power BI Community, where the required output was a calculated column in the Date table which would show Today for today's date, Yesterday for yesterday's date, and Next Working Day for the following Monday.

The challenge was that for the “Next Working Day” it should only be for the next Monday and not all Mondays.

If you did not know you can use DAX Variables not only in DAX measures, but also in DAX Calculated columns.

I would like to mention that I always like to step out my calculations when working on long or complex DAX measures or calculated columns. This video from Guy in a Cube with Patrick and Marco gives a great introduction on how to debug DAX, in which they demonstrate how to use variables in DAX measures to debug the measures.

So below is my DAX calculated column with the explanation afterwards.

T/Y/Tom =
VAR TodaysDate =
    TODAY ()
VAR YesterdayDate =
    TODAY () - 1
VAR NextMondaysDate =
    SWITCH (
        TRUE (),
        'Date'[Date] - WEEKDAY ( 'Date'[Date], 2 )
            + 1
            >= TodaysDate, 'Date'[Date] - WEEKDAY ( 'Date'[Date], 2 )
            + 1,
        TODAY () - 99999
    )
VAR NextWorkingDay =
    SWITCH (
        TRUE (),
        'Date'[Date] = TodaysDate, "Today",
        'Date'[Date] = YesterdayDate, "Yesterday",
        'Date'[Date] = NextMondaysDate, "Next Working Day",
        "Not Applicable"
    )
RETURN
    NextWorkingDay

Code above formatted with the DAX Formatter

  • Line 1 – This is where I am giving my Calculated Column a name called T/Y/Tom meaning Today/Yesterday/Tomorrow
  • Lines 2 & 3 – This is where I am creating a variable called TodaysDate, which gets today's date using the DAX function TODAY()
    • With today’s date being 02 July 2018
  • Lines 4 & 5 – This is where I am creating a variable called YesterdayDate, getting yesterday's date by going back one day from today's date.
  • Lines 6-8 is where I am creating a variable called NextMondaysDate.
    • I am then using the DAX SWITCH function which I find personally is a lot easier to use than an IF function.
    • My experience has also taught me that often an initial requirement is for two possible conditions, but once it is looked at more closely there are more than two. Using the combination of the SWITCH and TRUE functions makes it a lot easier to add in as many conditions as are required.
  • Lines 9 & 10 – This line is quite interesting, because what I wanted to do was to find what Next Mondays date would be.
    • And if I had to put this single calculated column into my dataset I would get the following as shown below.
    • I actually did this as part of my debugging process to ensure that I was getting the right date.
    • As you can see above what this does is it gives me the date for each Monday in each Week.
      • For this current week the date for Monday is 07/02/2018
      • Whilst for next week the date for Monday is 07/09/2018
  • Lines 11 & 12 are where I am comparing to see if it is greater than or equal to today's date.
    • If it is then use the Mondays Date.
    • In this example the output would appear to be the same as the DAX in lines 9 & 10.
    • The difference here is that if you used it outside of the SWITCH context, it would result in every Monday having "Next Working Day", which is not the required result.
  • Lines 13 & 14 – This is where I am putting in the ELSE condition which is set to take Today’s date and go back 99 999 days. Which should not be seen in the dataset.
  • Lines 15 – 22 – This is where once again I am using the SWITCH and TRUE DAX functions, because I have got multiple conditions to evaluate.
    • The first condition is where I am looking at my date table and where my calculated DAX column matches my variable for today.
      • With the variable value substituted in, the condition would look like the following:
        • ‘Date'[Date] = “07/02/2018”, “Today”,
    • The second condition is where I am looking at my date table and where my calculated DAX column matches my variable for yesterday.
      • With the variable value substituted in, the condition would look like the following:
        • ‘Date'[Date] = “07/01/2018”, “Yesterday”,
    • The third condition is where I am looking at my date table and where my calculated DAX column matches my variable for next Monday.
      • With the variable value substituted in, the condition would look like the following:
        • ‘Date'[Date] = “07/09/2018”, “Next Working Day”,
  • Lines 23 & 24 – This is where I close off my variables and use my final variable called NextWorkingDay

So now when I look at my date table, I see the following below, which is what I wanted to get.

As you can see above there is a previous Monday’s date, and it says Not Applicable, so I am getting the results I expect.

I hope you found this blog post useful on how to use DAX variables in DAX calculated columns, as well as how to use them for debugging your DAX code.

As always if you have got any questions or suggestions please leave them in the comments section below.

DAX – Concatenating Values only when values are selected

What I am doing with the DAX below is only showing the concatenation of values when a value is filtered. If nothing is filtered (selected), text is displayed to show that nothing has been selected; otherwise, the selected values are displayed as a concatenation.

NOTE: If you individually select each item, the column is still being filtered, so it will show all the selected values concatenated.

Selected User =
VAR ConCat =
    CALCULATE (
        CONCATENATEX (
            VALUES ( 'Power BI Audit Log'[User] ),
            'Power BI Audit Log'[User],
            ","
        )
    )
VAR IsItFiltered =
    IF ( ISFILTERED ( 'Power BI Audit Log'[User] ), ConCat, "No Users Selected" )
RETURN
    IsItFiltered

 

  • What is happening above with the variable called "ConCat" is that I am concatenating the items from the 'Power BI Audit Log' table, using the [User] column.
  • Then in the next Variable called “IsItFiltered” I am using the IF and ISFILTERED DAX function to evaluate if the [User] column is being filtered.
    • If it is being filtered then return the previous Variable called “ConCat” otherwise return “No Users Selected”
  • And the final piece is where I am returning the output from the IsItFiltered Variable
  • This is what it looks like when No users are selected
  • And then when two users are selected

As always if there are any questions please leave them in the comments below.

Reactivated Players DAX Pattern

I had a requirement where my customer was looking to gain some insights into their data relating to reactivated players.

The definition used for a reactivated player is a player that has previously played but has not played for a certain number of days. The player then comes back and plays again. So, by this definition the player is reactivated.

What I had to do to get this working so that it performed well was to put it into a calculated table. This ensured that I only had the data I required. Due to the sheer volume of data that I had to go through, trying to do this with a DAX measure proved to be very slow in my example, and I would rather create a calculated table which performs much faster and gives a better reporting experience.

I explain it below. It is quite complex, so please read through each section carefully.

Overview: What this calculated table is doing is counting how many days there were between the player's last order and their second to last order, based on the order date.

NOTE: I am going to explain this from the inside out.

Using either Power BI Desktop, Azure Analysis Services or SSAS Tabular 2017, I created a calculated table with the following syntax below.

  1. Reactivated Players Table =
  2. FILTER (
  3. CALCULATETABLE (
  4.    ADDCOLUMNS (
  5.     VALUES ( 'Adventure Works'[Customer Name] ),
  6.      "Days Between Orders"
  7.       , CALCULATE (
  8.         DATEDIFF (
  9.          CALCULATE (
  10.           MAX ( 'Adventure Works'[Order Date] ),
  11.           FILTER ( 'Adventure Works', 'Adventure Works'[Order Date] <> MAX ( 'Adventure Works'[Order Date] ) )
  12.          ),
  13.          MAX ( 'Adventure Works'[Order Date] ),
  14.         DAY
  15.        )
  16.     ),
  17.   "Last Order Date", CALCULATE ( MAX ( 'Adventure Works'[Order Date] ), ALLEXCEPT ( 'Adventure Works', 'Adventure Works'[Customer Name] ) ),
  18.   "GeographyKey", CALCULATE (
  19.         MAX ( 'Adventure Works'[GeographyKey] ),
  20.         ALLEXCEPT ( 'Adventure Works', 'Adventure Works'[Customer Name] ) )
  21.     )
  22.  ),
  23.  [Days Between Orders] <> BLANK ()
  24. )

  • Lines 8 – 11
    • This DAX calculation is getting the second to last date the player made an order.
    • This could be called "Second to Last Order Date".
  • Line 12
    • This DAX calculation is getting the last date the player made an order.
    • This could be called "Last Order Date".
  • Lines 7-8 & 13-15
    • This is the outer DAX CALCULATE which is using DATEDIFF to count how many days there were between the two dates above.
    • EG: DATEDIFF ( "Second to Last Order Date", "Last Order Date", DAY )
  • Line 6
    • This is the name of the column that I am creating in my calculated table, called "Days Between Orders".
  • Line 5
    • Here I am getting the VALUES for the column, which is the Customer Name.
    • NOTE: It is important here to always use the Customer Name or CustomerAccountKey in order to get back the values you are expecting.
  • Line 17
    • I am creating another column called "Last Order Date" with another DAX measure.
    • This column is getting the last time the player made an order. To do that I had to use ALLEXCEPT so that it would break the filter context of the entire table and do it for each Customer Name.
    • The reason for this column is so that I can join it back to the Date table in my dimensional model.
    • NOTE: This is so that I can then use the Date table to filter multiple tables when accessing the reports.
  • Lines 18 – 21
    • Once again I am creating another column, called "GeographyKey", with a DAX measure.
    • I created this column so that I can get the GeographyKey for each Customer Name. To do that I had to use ALLEXCEPT so that it would break the filter context of the entire table and do it for each Customer Name.
    • The reason for this column is so that I can join it back to the Geography table in my dimensional model.
    • NOTE: This is so that I can then use the Geography table to filter multiple tables when accessing the reports.
    • NOTE: If I wanted to create any more columns to link back to the dimension tables, they would be done in this section.
  • Lines 3-4 & 22-23
    • This is where I am specifying the ADDCOLUMNS within my CALCULATETABLE syntax, which creates the in-memory table.
  • Lines 2, 24-25
    • The final piece is what makes this table so fast: I am filtering the entire table so that I do not bring back records that I am not interested in.
    • I do this specifically in line 24, where I am using the column I created in lines 6 – 16 and filtering to only keep rows where the Days Between Orders is NOT BLANK.

Apologies for the formatting above, I just could not get it the way it was in Word!

Once the above is done, I then get back a subset of all the records, which then makes this table very fast to query.

Creating the Dynamic Days Between Reactivation table

To give the required flexibility to the client, I created a table using the What-If Parameters.

This then allowed the client to dynamically decide how many days there should be for a player to be deemed as Reactivating.

I then created a slicer for this table
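
For reference, the What-If parameter feature generates a calculated table and a supporting measure roughly along these lines (the range of 1 to 120 and the step of 1 below are assumptions; adjust them to suit your data):

Days between Reactivation = GENERATESERIES ( 1, 120, 1 )

Days between Reactivation Value = SELECTEDVALUE ( 'Days between Reactivation'[Days between Reactivation], 30 )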

Reactivated Players

This measure is created off the calculated table above, which makes it return the correct results as well as being super-fast when queried.

Reactivated Players = 
VAR SelectedValue =
    SELECTEDVALUE ( 'Days between Reactivation'[Days between Reactivation], 30 )
RETURN
    CALCULATE (
        COUNTROWS ( 'Reactivated Players Table' ),
        'Reactivated Players Table'[Days Between Orders] >= SelectedValue
    ) 
  • In the measure above, I am once again using the Days between Reactivation slicer so that the end user can determine after how many days a player is deemed to be reactivated.
  • Then I am counting the rows where the Days Between Orders is greater than or equal to the selected number of days.

Report in Action

Below is what the report looks like in action, where the user is using the Slider in order to determine how many players are reactivating.

NOTE: The reason that I put in the column called "Reactivated Players" is to show that the measure is working as expected. If the slicer is set to 30, any player with a "Days Between Orders" of less than 30 should not have a 1 next to them. And as the slider goes up, fewer players should show a "1" for their corresponding "Days Between Orders".

Conclusion

I have shown how to create the reactivated players DAX pattern, as well as how to allow the user to dynamically select the period for reactivation.

As always, I welcome any comments or suggestions.

Here is a link to the PBIX file: FourMoo – Reactived Players.pbix

Power BI – One Measure dynamically made up from multiple measures (To show Month Estimate for current Month)

Where I am consulting, there was a requirement to display the Current Month Estimate for the current month, and to show the historical sales for all the previous months.

This makes sound business sense, since the current month would typically start off with a lower sales amount, which increases as each day goes by. Whilst the calculation for the Month Estimate is not very complex, it does show if the sales are in line with where they should be.

In my example below, I am going to demonstrate how I did this, so that to the end user it appears to be one measure, even though I am using a few techniques for it to be displayed correctly.

Working Example

In this example I have got some sales data, and I am going to demonstrate how to create the Month Estimate.

I will then create the [Sales Amount] Measure, and the [Current Month Estimate] Measure.

The [Current Month Estimate] is made up by this simple calculation:

  • Sales for the current month I am in.
  • Get the Days Gone in the current month I am in.
  • Get the Total Days for the current month that I am in.
  • For example, if my current date is 09 December 2017, then I would have the following details below.
    • Sales for Current Month = $36,819.60
    • Days Gone in Current Month = 9
    • Total Days in Current Month = 31
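
Putting those example numbers into the estimate calculation, it works out roughly as follows:

Current Month Estimate = ( Sales for Current Month / Days Gone in Current Month ) * Total Days in Current Month
                       = ( $36,819.60 / 9 ) * 31
                       ≈ $126,823.07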

And then finally, I will show how to create the magic measure, which will show the sales up until the previous month (Nov 2017), and then for the current month (Dec 2017) it will display the measure [Sales Inc Est], which will be renamed to Sales Amount.

Creating the standard measures

Below I created the following measures:

Sales Amount = SUM('Orders'[Sales])
Days Gone in Curr Mth = 
VAR LastDataDate =
    MAX ( 'Orders'[Order Date] )
RETURN
    DAY (
        DATE ( YEAR ( LastDataDate ), MONTH ( LastDataDate ), DAY ( LastDataDate ) )
    )
Days in Current Month = 
VAR LastDataDate =
    MAX ( Orders[Order Date] )
VAR EndOfCurrMonth =
    EOMONTH ( LastDataDate, 0 )
RETURN
    DAY ( EndOfCurrMonth )
Current Month Estimate = 
DIVIDE ( [Sales Amount], [Days Gone in Curr Mth] ) * [Days in Current Month]
Mth Est (Inc Prev Sales) = 
VAR LastDataDate =
    MAX ( 'Orders'[Order Date] )
VAR Start12MonthsAgo =
    DATE ( YEAR ( LastDataDate ) - 1, MONTH ( LastDataDate ) - 1, 1 )
VAR LastDatePreviousMonth =
    EOMONTH ( LastDataDate, -1 )
VAR SelectedSalesPrevMonths =
    CALCULATE (
        [Sales Amount],
        FILTER (
            'Date',
            'Date'[Calendar Date] >= Start12MonthsAgo
                && 'Date'[Calendar Date] <= LastDatePreviousMonth
        )
    )
VAR SelectedMthEst = [Current Month Estimate]
RETURN
    VALUE (
        IF (
            ISBLANK ( SelectedSalesPrevMonths ),
            SelectedMthEst,
            SelectedSalesPrevMonths
        )
    )

The final measure is the one that I feel needs the most explaining, which I will attempt to do below, and I hope it makes sense to you the readers, as it does make sense to me!

  • Lines 2 & 3 are where I am creating a variable called LastDataDate, which is the last or max Order Date.
    • The reason for this is that data is not always as fresh as it could be, meaning it is often a day behind, or if some other process upstream has failed it could be a few days behind. This ensures that I am getting the max Order Date from my dataset.
  • Lines 4 & 5 is where I am creating a variable Start12MonthsAgo that will be the First date of the month from 12 months ago, based on my Variable LastDataDate
    • Here I used the DAX DATE function to get the date, and based on my data this would return the following date based on the 09 Dec 2017
  • Lines 6 & 7 is where I am creating the variable LastDatePreviousMonth which will store the last date of the previous month, based on my Variable LastDataDate
    • Here I used the DAX EOMONTH function to return the date based on 09 Dec 2017
  • Lines 8 to 16 is where I am creating the variable SelectedSalesPrevMonths for the Sales for the previous months prior to the current month I am in.
    • As with this example I want to get all the sales using the two variables I created Start12MonthsAgo (11/01/2016) and LastDatePreviousMonth (11/30/2017)
    • As you can see below this is the virtual values which will be stored as part of this variable
  • Line 17 is where I am mapping the previous measure [Current Month Estimate] to the variable SelectedMthEst, to make it easier to read.
  • And finally, lines 18 to 25 are where I am using all of the above logic to dynamically decide which measure will be used for the previous 12 months, excluding the current month, and where the SelectedMthEst will be used.
    • On line 19, I wrap the result in VALUE, which ensures that the measure returns a number value and not a text value.
    • Line 20, is where I use an IF Statement to determine what variable to display based on the conditions I specified in my variables.
    • Line 21, is where I am evaluating the ISBLANK based on my variable SelectedSalesPrevMonths
      • This is where it might become a bit confusing to some people: conceptually, when I create the table there will be no value for the current month for SelectedSalesPrevMonths, because it stops at the end of Nov 2017.
      • In the table context where I have got dates going past Nov 2017, the measure SelectedSalesPrevMonths will have BLANK values.
    • Line 22 is where, if SelectedSalesPrevMonths is indeed BLANK, the SelectedMthEst variable is used, which once again is for the current month.
    • And finally, line 23 is where, if the IF condition evaluates to FALSE, SelectedSalesPrevMonths is used.
    • After which I close all the required DAX functions.
  • Hopefully the image below explains it better, where I have visually shown where SelectedSalesPrevMonths ends and where SelectedMthEst starts.

Measure in Action

Now that I have gone through and explained how it all works, it is best to see it in action.

  • Let's suppose that the current date was 11 Dec 2017; this would then show all the sales as shown below, including the [Current Month Estimate] measure, which is currently estimating a good month for Dec 2017.
  • Whilst if it were later in the month, say just before the break on 22 December 2017, it would then reflect the following below, which now shows that sales are potentially down.
  • And as you can see above it shows quite a different picture to the previous one.

Conclusion

What I have demonstrated here is how to use variables and multiple measures, so that to the end user it appears as one measure. Not only that, but it meets their exact requirement and can give them a quick indication of how their sales are going as the month progresses.

As always, any comments or suggestions are most welcome in the comments below.

Reference creating the date table: Power BI – How to Easily Create Dynamic Date Table/Dimension with Fiscal Attributes using Power Query

Here is a link to the Power BI file: Substituting Measures.pbix