BI-NSIGHT – Power BI GA (24 July 2015) / Updates – SQL Server 2016 CTP2.2 (SSRS Treemap & Sunburst Charts)

This week has been another amazing and interesting week with regards to Power BI.

So here is the latest information that I could get my hands on!

Power BI – GA (General Availability)

This week Power BI finally goes into General Availability, which has been a long time coming and something I have been looking forward to ever since it was announced.

I can remember reading about all the changes that were going to be coming out this summer (Northern Hemisphere), and eagerly waiting to see what they would be. Would some of the items that I had voted for be in the upcoming updates? And indeed some of them are, which is great and shows how Microsoft has started to listen to what customers and users are looking for.

In my opinion, in a year's time Power BI will be an amazing product with a whole host of new features and usability that we had previously never thought of!

It is great to see that they are going to be releasing updates in the same way as they have done with Power Query. Because Power BI sits outside of Office, and particularly Excel, they can roll out changes, updates and new functionality on a regular basis. And what that means for both customers and developers is that we get an ever-evolving product.

One of the things that I wanted to highlight is that I did test out the new rendering engine. Not only does it look amazing, with so many customizations, it is also lightning fast. I mean really fast when you click on a visualization. They do speak about it in Power BI – GA and beyond…, but WOW, to actually use it and try it out, it is quick.

You can either vote for existing ideas or create your own idea here: Microsoft Power BI Support

And if you are looking for help, or want to help out, they have also created a fantastic community which can be found here: Microsoft Power BI Community

And finally you can read all about the Power BI GA here: Power BI – GA and beyond…

Power BI Updates

Along with the Power BI GA (General Availability) coming at the end of the week, the extremely busy people at Microsoft have started rolling out some of the changes into the online Power BI Service.

There are quite a few changes, which I will list in point form below; you can read about them in full via the link at the bottom of this post.

  • Team Collaboration
  • Organizational Content Packs
  • Free Trial for Power BI Pro
  • Bring Whole Excel Files into Power BI
  • Upload CSV Content
  • Replace Excel, Power BI Desktop, and CSV Files in Power BI
  • Row-Level Security for On-Premises AS Tabular Models

There is quite a bit of information above; some of the updates are aimed at organizations and some at end users. But they are all amazing features and something that I can see being used on a daily basis.

I have to highlight importing Excel files into Power BI from OneDrive. This is another great feature, because not everyone uses Power Pivot in Excel. A lot of people are used to working in the sheets within Excel, and this is where their strengths are. So providing the ability for people to upload their Excel files and then consume and visualize them within Power BI is another avenue to show the value of the work and the data that they created!

You can read about it here: Power BI Weekly Service Update

Here is another screenshot of the Text Box from Jen Underwood (@idigdata)

Power BI – Limit of 10 Reports per Dataset

I came across the blog post by Adam W. Saxton which describes an error some users were getting when trying to save a report (the error shown above).

It appears that currently there is a limit of 10 reports per dataset. I personally think this should be sufficient, because you can have multiple pages within each report. And if there is a requirement for more than 10 reports, there must be a lot of data to visualize, and the impact of the visualizations could be lost across so many reports.

The one thing to note from the details below is that the limit is per dataset ONLY. If you create a new report on a different dataset, you can then create another 10 reports!

Note that this applies to both the Free and Pro versions.

You can read about this limit here: Limit of 10 Reports per Dataset/Data Model
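To see how you might keep an eye on this limit, here is a minimal sketch that counts reports per dataset from report metadata. It assumes a list of report records shaped like what the Power BI REST API returns (each carrying a `datasetId` field); the sample dataset names are purely hypothetical.

```python
from collections import Counter

REPORT_LIMIT = 10  # the current per-dataset report limit described above

def reports_at_limit(reports, limit=REPORT_LIMIT):
    """Given report metadata dicts (each with a 'datasetId' key),
    return the datasets that have reached the report limit."""
    counts = Counter(r["datasetId"] for r in reports)
    return {ds: n for ds, n in counts.items() if n >= limit}

# Hypothetical sample: dataset "sales" already has 10 reports, "hr" has 2.
sample = [{"datasetId": "sales"}] * 10 + [{"datasetId": "hr"}] * 2
print(reports_at_limit(sample))  # → {'sales': 10}
```

In practice you would feed this the report list returned by the service rather than a hard-coded sample.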

SQL Server 2016 CTP 2.2

Browsing through Twitter this morning I came across an update to SQL Server 2016 with regards to CTP 2.2.

I quickly found the details on the SQL Server Blog, which I will post the link to afterwards. I had a quick read through to see if there was anything new relating specifically to BI.

I was happy to see that they included two new chart types, namely Sunburst and Treemap. This led me to quickly create a VM and see what they actually look like!

Unfortunately, due to time constraints and not really knowing where to apply the new chart types, I could not actually see what they look like in Reporting Services.

There have also been some updates to MDS, which allows Compound Keys, SCD Type 2 and syncing an entity between models.

You can read all about it here: SQL Server 2016 Upgrade Advisor Preview and CTP 2.2 now available


BI-NSIGHT – Multiple Power BI Announcements – Cortana Analytics – Particle and Microsoft – Azure Data Catalog – Power Query Update

So the past week has been filled with a whole host of announcements, particularly in the Power BI space, but also around Microsoft's cloud based solutions.

A lot of the information I found came out of the current Worldwide Partner Conference!

So as we do every week, let’s get into the amazing details.

There is quite a bit of content this week, so please scroll down and then down again.

Multiple Power BI Announcements

So there was a whole stack of Power BI announcements, which has been fantastic news for the BI community.

Power BI General Availability – 24 July 2015

There has been a whole host of activity with regards to this fantastic news, as well as the blog post from Microsoft detailed below. So I will quickly give my personal view of this news.

I do think it is the perfect time for Microsoft to go from a Preview product to General Availability (GA). It ties in with their release of Windows 10, and being summer in the United States, I am sure it is an amazing time there. Also, by coming out of Preview and becoming an actual product, Power BI gains some additional credibility and traction within a very competitive market.

In terms of Power BI as a whole, it has been a whole new way for Microsoft to release a product. Along with this, they have actively engaged with customers and end users, actually listened, and often taken the advice of the community and put it into the product. I find this amazing and something that has not been done in the past. And this is where I feel Microsoft will leapfrog the other vendors going forward.

I also think it is a VERY smart move to open up the visualizations and give the power to the developers. This will result in a massive increase in new charts that were previously never used or never thought of. And it all lends itself to people who want to implement their own charts starting to look at Power BI. It also continues the Microsoft community spirit, as I am pretty sure that people will share their visualizations with each other. This means not only does Microsoft benefit, but also the customer or end user.

I might have totally misunderstood all the hype, but I am sure that somewhere there was a hint that there might be a content pack for Twitter data. Now that would be amazing!

If you want to read about all the details and updates, please read their blog post: Announcing Power BI general availability coming July 24th

Power BI New Capabilities

Microsoft have been super busy and have released a great blog post about the new capabilities that come with the General Availability on 24 July 2015.

Below are the sections, and you can read the blog post afterwards for more details.

I would just like to highlight the Q&A visual formatting. I think this is another great feature, because it allows you to format the answer to a Q&A question. The reason this works so well is that some questions will be common amongst the business people, and in my mind you would want those to look really great, and potentially really POP when the question is asked. This now gives the report developer the ability to do that.

  • Office 365 App Launcher
  • Custom URL Support
  • Storage Management
  • Q&A visual formatting

Read here for more details of the above: New capabilities added to Power BI

Power BI – Spark

Once again Microsoft have released another content pack, which I personally think keeps elevating Microsoft in terms of competing with the likes of QlikView, Tableau, etc.

By integrating directly with Spark on HDInsight, it allows customers and end users to easily view the data that they have been processing within Spark. Spark on HDInsight has already lowered the barrier to entry, making an appliance and software like Spark much easier and quicker to deploy. Coupled with Power BI, it becomes a great and powerful interface for visualizing machine learning algorithms and graph computations.

You can read more details here: Visualize big data with Power BI and Spark on Azure HDInsight

Power BI – Acumatica Content Pack

Here is another great content pack from Microsoft and Power BI.

Acumatica Cloud ERP is a cloud based platform that provides business management applications such as Financials, Distribution, CRM and Project Accounting.

You can read more about it here: Analyze and Explore your Acumatica Cloud ERP data with Power BI

Power BI – Development of content pack for tyGraph

I read about tyGraph getting a content pack for Power BI. This is really good news, as it appears that tyGraph is used by a lot of customers, and it will enable existing tyGraph customers to leverage Power BI for their reporting requirements.

You can read more about it here: tyGraph Analytics Is Announcing the Development of a Content Pack for Microsoft Power BI

Cortana Analytics

The Cortana Analytics Suite is another example of why I really enjoy working within the Microsoft space. It appears to me that they are not only looking to create great cloud based products, but also to innovate and create something that as a whole really makes sense for people to use, realizing the potential of the data that they have, or the data that they can create.

I really like the way they have integrated a whole host of applications to make it easier and simpler for people to use as a product. And as with anything, once you use a new product and get a better understanding of it, you want to be able to do more. With the Cortana Analytics Suite they provide out-of-the-box functionality, but going forward it also gives the customer or end user the ability to dig in a whole lot deeper and create their own custom solutions.

I watched the video below, and it is amazing to see how technology can enable people to make smart decisions, which in turn makes us happier and healthier. In my view it also shows where this is going in the future, as well as the boundless possibilities of how we can integrate things (IoT) and make quicker decisions than ever before.

Dartmouth-Hitchcock ushers in a new age of proactive, personalized healthcare using Cortana Analytics Suite – watch the video, it is inspiring stuff.

You can read more about Cortana Analytics here: Cortana Analytics

Particle and Microsoft

I think that this is another great partnership, especially as we are starting to see IoT gain some real traction. I also find that Particle is a great company, enabling their customers to do things that were previously either too expensive to implement or never thought about! And I am pretty sure that further down the line this will integrate with Power BI, or Particle will create an interface using the Power BI REST API.
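As a rough illustration of what such an integration could look like, here is a minimal sketch that shapes device readings into the JSON body the Power BI push-rows endpoint expects and POSTs them. The dataset id, table name, column names and token below are all hypothetical placeholders, and this is a sketch rather than production code.

```python
import json
import urllib.request

# All identifiers below are hypothetical -- the dataset id, table name
# and access token would come from your own Power BI tenant.
API = "https://api.powerbi.com/v1.0/myorg"

def build_rows_payload(readings):
    """Shape IoT readings (e.g. from a Particle device) into the
    {"rows": [...]} body the push-rows endpoint expects."""
    return {"rows": [{"device": d, "temperature": t} for d, t in readings]}

def push_rows(dataset_id, table, readings, token):
    """POST the rows to the dataset's table (sketch only)."""
    url = f"{API}/datasets/{dataset_id}/tables/{table}/rows"
    body = json.dumps(build_rows_payload(readings)).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="POST", headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    })
    return urllib.request.urlopen(req)  # returns the HTTP response

payload = build_rows_payload([("photon-01", 21.5)])
print(payload)  # → {'rows': [{'device': 'photon-01', 'temperature': 21.5}]}
```

The payload-building step is kept separate from the network call so it can be tested without hitting the service.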

You can read more about it here: Microsoft + Particle

Azure Data Catalog

This is another great cloud based product from Microsoft. The reason that I thought to put it in my blog, is that there are a lot of businesses that are looking to see how to manage their metadata catalog, as well as where the business information is stored. And this is where the Azure Data Catalog comes in.

This enables almost any business user to put up some information about their data and where they got it from, as well as enriching the data source information with what they are doing with the actual data. This enables other people within the business to quickly identify where the actual source of their data is. I think this means that the more people know where to get the data, the more the data will be used, which in turn will lead to better insight, and finally (or potentially) to business value!

You can read about it here: Announcing the Public Preview of Azure Data Catalog

Power Query Update

I just saw that there was another Power Query update for Excel.

There are a total of 6 updates listed below:

  • Improvements to ODBC Connector
  • Navigator dialog improvements
  • Option to enable Fast Data Load versus Background Data Load
  • Support for Salesforce Custom environments in Recent Sources list
  • Easier parsing of Date or Time values out of a Text column with Date/Time information
  • Unpivot Other Columns entry in column context menu

You can read about it here: 6 updates in Power Query for Excel

BI-NSIGHT – Power BI for Android Devices – Power BI Designer vs Excel – Power BI Refresh for On-Premise – Power BI Refresh from OneDrive – Power BI Interface Updates

Whilst I thought that this week might be a bit on the quiet side, I think that my previous suspicion is correct, and the wonderful people from Microsoft like to send out updates on a Wednesday.

Power BI for Android Devices

I was just browsing through Twitter when I came across a tweet and clicked on the link, not entirely sure what it was about.

But it led me to the Google Play Store, and I could now see the following below.

And here is the URL: Google Play Store – Microsoft Power BI

Power BI Designer vs Excel

I was reading another fantastic post by Rob Collie, where he gives some great insight into Power BI Designer vs Excel. Rob mentions how Power BI can help out the Excel Pros, specifically in point number 7. My interpretation is that in the future, Power BI Designer will be able to import a Power Pivot model.

I think that this would be fantastic, as often I have created something in Excel using Power Query and Power Pivot, and then a light goes off in my head and I think it would be great to visualize this in Power BI (both for being in the cloud and for the mobile functionality). But when I have tried to import the Excel file, currently it only shows you any Excel tables.

I do have a workaround for getting the Power Pivot model in your Excel file exposed within Power BI, which is something I have mentioned before. It is quite simple, so here is the quick way to get this done:

  1. Create one Power View sheet within your Excel file.
  2. Save your Excel file, and then upload it into Power BI directly or via OneDrive.
  3. Once it opens and you go into creating your report, you will see the entire Power Pivot model exposed!

Give it a go, it is really quick and easy to set up. You can read Rob Collie's post here: Power BI Designer vs. Excel: What’s Microsoft Up To?

Power BI Refresh for On-Premise

This is something that I have been waiting for, and voted for, some time ago. And it fills me with great pride to see it being implemented; it shows that the people at Microsoft are hearing our voices loud and clear. I have to say that this is going to be a real game changer, in my opinion, for the following reasons:

  • Firstly, it is so simple to install and configure the Personal Gateway on your PC. Because the installation is so simple, and does not require Administrator privileges, a lot more people will be able to install it and use the functionality.
  • Next is they have updated how often you can refresh your data.
    • They have chosen the magic number of 8.
    • Which to me represents your 8-hour working day.
    • Perfect to get hourly updates.
  • Integration with your domain account.
    • As we know a lot of sources integrate with your domain account, for single sign on, as well as easier controls on security.
  • Refresh history, and sending emails on failure.
    • This is another great feature that we are used to having, in terms of getting emails when things break.
    • I do like that it will only send you an email when it fails, so no need to clutter up your Inbox.

You can read in more detail here about all the details: Refresh for on-premises sources is here!

Power BI Refresh from OneDrive

This is something that I did figure out a while ago and blogged about previously. It entails using Power Update to refresh your Excel workbook on-premises, then having Power Update upload the updated workbook to OneDrive, after which Power BI will automatically update. You can read about it here: Automating refreshing of Power BI Reports and Dashboards with Excel Workbooks and Power Update

It is great to officially have some more information. A lot of companies are starting to use OneDrive as their document storage solution, and we now know that Power BI will refresh your connected OneDrive files once every hour, which in most instances is pretty good.

Another thing to note, from my workings with SharePoint Online, is that it actually uses OneDrive to store all the documents. So if you have SharePoint Online, you effectively have OneDrive, which you can then connect to Power BI. (Disclaimer: I am not 100% certain which SharePoint Online subscriptions this applies to.)

Power BI Interface Updates

From what I could see there are two additional updates within Power BI.

The first is the new Minty Fresh theme, which is good to see. I would not be surprised to see an option later to select your own themes, as you can do with Power View in Excel.

The second update is within the Dashboards in Power BI. When you click on the Tile Details button you now get an additional option, as shown below. You can now set a Custom Link, which I think is really great, because this could link to anything. It gives the report developer more options, as well as allowing the report consumer to interact with additional content outside of the report.

BI-NSIGHT – Power BI Designer Update – Power BI Twilio Connector – Excel 2016 Visualisations – Azure Data Lake

Fortunately the Microsoft machine was a little quieter this week, but there are still, as expected, new updates in the Power BI space.

Power BI Designer Update


I have read quite a few blog posts, downloaded and viewed the Power BI Designer update. And I have to say they are some welcome changes.

In my mind I see that over time they are porting some of the Power Pivot functionality into Power BI Designer, which is very welcome. Whilst they are doing this, they are also making additions such as the DAX IntelliSense, which is amazing, as well as giving the user the ability to create a column within the Report screen. It makes the report authoring experience that much quicker and better.

I do think that the product is moving along in leaps and bounds. Especially for customers and users who do not have access to Excel 2013 and potentially Excel 2016!

You can read all about the updates here: Power BI Designer May Update

Power BI – Twilio Connector


Just a quick note that there is yet another great connector for Power BI. I personally have not come across this company before, but in my mind it is starting to show how Power BI is gaining momentum with other companies, which can improve their reporting and product exposure by utilizing Power BI. It is also a great benefit for existing customers to have an easy-to-use and informative reporting platform.

I am certain that the Twilio connector, will show some companies a great insight into their data and real-time communications.

You can read more about it here: Visualize and analyze your Twilio data in Power BI

Excel 2016 Visualisations (Charts)


Example Treemap Visualisation below

There are some great new visualisations or new charts coming in Excel 2016. I did watch the Channel 9 video from Ignite which had more details and it did showcase the new charts and how they can enable you to view your data in a totally different way. I think that this is really welcome.

Here is a link to the video: What’s Next for Visualizations in Microsoft Office

As well as a great blog post by Clint Huijbers, in which he does a great job to show examples of the new visualisations (charts): New visualizations in Excel 2016

Azure Data Lake


I once again have to say that this is another smart move by Microsoft. In my mind this is a great product, because you now have the potential to run your entire organisation's data within the cloud.

You could start by storing all your data, within the Azure Data Lake, thereafter if you require a data warehouse you could then leverage the Azure SQL Data Warehouse, from which you could then visualise your data using Power BI. This is a great story to be able to tell and to explain to potential customers as to why they should potentially choose Microsoft Azure.

In terms of the Azure Data Lake, from what I have read it is not to be confused with just storing your data in the cloud. It gives you the ability to quickly and efficiently get your required data out of your data lake. The great benefit is that you can store as much data as is required. You do not have to worry about the structure of your data, you can just store it in the Azure Data Lake, and then when there is a requirement to gain insights into your data, it is immediately available.

I also see that if you put as much data as you want, or can get your hands on, into your Azure Data Lake, then when you start looking at gaining insights you will have a whole stack of information at your fingertips. This means that you could potentially cross-relate data where previously you did not have it, or where it was stored in another system that was hard to access or extract data from. If you then couple this with Azure SQL Data Warehouse as the extraction layer, and run that through Machine Learning, you could potentially be onto something whose benefits competitors have not even begun to realize!

You can read about it here: Introducing Azure Data Lake



BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen in pictures on Twitter, or read about. And since the preview has not even been released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing. Even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What this means, in my mind, is that you can now get the data scientists to interact and test their R scripts against the data as it sits in the transactional environment. And from there, this could be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic and will mean now that virtually any data source can be consumed into SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has required an overhaul for some time, and it seems that we are finally going to get one. It would appear that the parameters panel is going to be updated; I do hope that it will be more interactive, reacting in a similar way to the slicers in Excel. We are also getting new chart types, and here I am going to assume that they will be similar to what we have in Power BI!

And finally there was also mention that the reports will be rendered quicker, as well as having a better design. I also am hoping that they will deploy this using HTML 5, so that it can then be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. Putting this into the SQL Server engine makes it a whole lot simpler to query unstructured data. Being able to use T-SQL for this means that the many people who have invested time and effort into SQL Server can now leverage that investment using PolyBase. Along with this, you do not have to extract the data from Hadoop or another format into a table to query it; you can query it directly and then insert the rows into a table, which makes development time that much quicker.

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to leverage their existing work on In-Memory OLTP and the column store index. They have mentioned that in their testing this amounts to a 30x improvement with In-Memory OLTP, as well as up to 100x for the In-Memory Column store. This is really amazing and makes everything run that much quicker.

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. What it mentions is that 85% of current SAP customers will not likely deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that to make the change would require companies to almost start again for this implementation. In any business where time is money, this is a significant investment.

Compare this to what Microsoft has done with In-Memory tables and column store indexes, which can be used interchangeably with only some additional work required; on the whole it is quick and easy to make the changes. Then you couple this with what Microsoft has been doing with Microsoft Azure, and it makes it so easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting upgrades in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS there are also some new features; namely, they are also going to integrate Power Query into SSIS. This is once again wonderful news, as it means that SSIS will be able to get data from virtually any source!

Power BI

Once again the guys within the Power BI team have been really busy, and below is what I have seen and read about in terms of what has been happening within Power BI.

Office 365 Content Pack

I would say that there are a lot of businesses using Office 365 in some form or other, so it makes perfect sense for Microsoft to release a content pack for Office 365 Administration.

As you can see from the screenshot below, the dashboard gives a quick overview of what activity is happening, as well as details of other services. I am sure that this will make it really easy to get a quick overview of your Office 365 systems, and any potential issues can also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online. I personally do not currently use this, but it does look great for people to once again have an overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, what you can view once you have got it setup within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I do not think there are many other software companies doing this currently, or able to roll updates out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great as it will allow the report authors to have more control in terms of how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know personally from working with Power BI reports that sometimes you just want to see what a different report looks like, or to add another element. With the Undo / Redo buttons it saves that little bit of time, and makes the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter; someone must have taken it at the Microsoft Ignite conference. As you can see it is not very clear, but it does show the next update to the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that the Power BI service will support SSRS files, namely .rdl files! Here is another picture taken at Microsoft Ignite.


Then there are many-to-many relationships and bi-directional cross filtering, which will be supported in SQL Server 2016 tabular models. I am sure these will also be included in the new Power BI backend, as this is where all the data is stored.


For Hybrid BI, there will be support for live querying of SSAS; currently this is already in place for SSAS Tabular.


It also looks like there will be a scheduled refresh for SQL Server databases as a source, which is great for people who either do not have SSAS cubes or simply want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related to BI, in that with Azure Stack you can get the Azure functionality on your own hardware which is fantastic. And I am sure for a lot of businesses this will be welcomed.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter


BI-NSIGHT – Azure Analytics Services – Power BI Dynamics AX Connector – Auto refreshing of Excel Files in Power BI – Excel 2016 Updates

Once again this past week Microsoft has released even more products. So let’s get started.

Azure Analytics Services

On 29 April at the annual Microsoft Build conference they announced new services on the Azure platform. I really think that Microsoft is doing a great job of having an entire cloud platform available for customers.

Due to the digital age, a lot of data is coming from mobile devices, which in turn are using digital services. Looking forward, I can see that, as Microsoft says in their video, there will be a major requirement for businesses to be able to use digital platforms in order to gain insights into their data.

The Azure Analytics Services enable businesses to easily use these products without having to build the infrastructure or platform knowledge themselves.

I am going to highlight the two new services which relate to BI: the Azure SQL Data Warehouse and the Azure Data Lake.

Azure SQL Data Warehouse

I really think that this is a smart move from Microsoft. It is something that we have been looking for and asking Microsoft for. From the video (Build 2015: Azure SQL Data Warehouse) it appears to me that not only will it be a data warehouse, but it will also enable you to move data in and out of the data warehouse using SSIS.

I do hope that my assumption is correct, as this would be a great feature to have SSIS on a cloud platform. And this has the potential to create a whole host of new possibilities.

Along with this, it also appears from what I have read that they are using the APS (Microsoft Analytics Platform System) as the basis for the Azure SQL Data Warehouse. If this is indeed true, I think that this could be a game changer, due to the fact that we already know how easy it is to scale out with the APS, which gives amazing performance.

Another great advantage is that it looks like you can use the Azure SQL Data Warehouse as a source in Power BI, which then enables all the features of Power BI. I think this combination would make a great competitor to other BI cloud products on the market.

Also, as per the video, they have made it really easy to change the scale of computing that you require; it can be changed in seconds!

Credit: Microsoft

Azure Data Lake

This is another clever move, allowing businesses to store all their data in a Data Lake. Which then gives the business the flexibility to then use this with the various Azure Services.

Here is a link to the video (Build 2015: Data Lake)

What I think will be great is that you can take data from your Data Lake and either put it into the Azure SQL Data Warehouse, or run a Machine Learning API over it. Once you have the required results, you could then put them into your Azure SQL Data Warehouse, and put Power BI on top of this to visualize the results and see, based on your specific Machine Learning API, what effect it could potentially have!

You also have the potential to use existing Machine Learning APIs developed by other people, which means that you do not have to spend countless hours trying to get the correct Machine Learning algorithm!

Here is an overview of the Azure Analytics Services from Microsoft

Credit: Microsoft

Here is the blog post from Microsoft: Microsoft Announces Azure SQL Database elastic database, Azure SQL Data Warehouse, Azure Data Lake

Power BI Dynamics AX Connector

Just a quick note that I received an email that the Power BI guys have started planning for the integration of Dynamics AX as a connector in Power BI.

I am sure that this is welcome news for Microsoft Dynamics AX customers.

Power BI Dynamics AX – Planned

Automatic Refreshing of Excel Workbooks in Power BI

Just a quick review of my blog post this week, which was really well received.

It explains how, using Power Query and Power Pivot to get data into an Excel Workbook, you can then have a mechanism to get your Power BI report updated automatically.

This means that you can have your data on premise which you can then ensure gets into Power BI and then can have the potential for real-time updates.


Excel 2016 Updates

Just another quick note if you want to read up about the updates for Excel 2016, head over to Chris Webb’s blog post which has all the relevant information.

Chris Webb – What’s New In The Excel 2016 Preview For BI?

That is it for this week, another busy week and a lot going on!


Automating refreshing of Power BI Reports and Dashboards with Excel Workbooks and Power Update

I recently tested to see if it was possible to upload an Excel Workbook to One Drive for Business, and then use this within Power BI. This was successful.

Then I noticed that if you used an Excel Workbook from OneDrive for Business, it would automatically refresh the file. I thought this might mean that if you updated your Excel Workbook in your OneDrive for Business folder, it would then update your dashboards and reports in Power BI.

The outcome was that it does indeed do this. This means that we now have a means to update Power BI dashboards and reports from our Excel Workbooks, which can be stored on premise.

This also gives us the ability to take any source information that we can get into the Power Pivot Model and get this to update in Power BI. So this also means that anything that we do with Power Query, which then goes into our Power Pivot Model (Data Model) can then be updated in Power BI!

And below I will explain how to do this using Power Update

NOTE: You could manually update your Excel Workbook and then save it to your OneDrive for Business. But ideally, if we can automate the process, it means that we can have our Power BI dashboards and reports updating automatically.

NOTE II: If you would like to know more about Power Update, please read the following Blog Post Power Pivot Pro (Rob Collie): Introducing Power Update

  • They do have a free version available!

Example Data:

  • We are going to use an Excel workbook which I used previously, which has some Adventure Works data. The details of where to find the Excel Workbook are listed below.

Uploading Excel Workbook with Power Pivot and Power View to One Drive for Business

Below are the steps where I upload my Excel Workbook, which has a Power Pivot Data Model as well as a Power View Report to One Drive for Business.

  1. The first thing is that I created my Excel Workbook, with Power Pivot and Power View.
  2. NOTE: The reason that I created this is so that when it is uploaded to Power BI, the entire Power Pivot Model will be exposed in Power BI.
  3. In order to do this, please refer to my previous blog post to create it with your own data
    1. Power Pivot and DAX in Power BI
  4. Next I uploaded my Excel Workbook to One Drive for Business

Importing your Excel Workbook from One Drive for Business into Power BI and create a simple report and Dashboard

Next we will import our Excel Workbook into Power BI, using the file which we uploaded to OneDrive for Business

  1. Log into Power BI
  2. Then click on Get Data
  3. Once this opens, make sure you select Excel Workbook, then click on Connect
  4. Now on the next screen you will see the options for Computer, OneDrive – Personal, OneDrive – Business
    1. Click on OneDrive – Business
    2. NOTE: The first time it might prompt you to log in and authenticate you to your OneDrive for Business Folder
    3. Then go to the location where you saved your Excel Workbook in the previous steps.
    4. Then click
    5. NOTE: As you will see above currently my Modified Date is from 5 days ago.
  5. Now under Datasets click on your Excel Workbook you just uploaded.
    1. Click on Explore
    2. NOTE: You will also see that the Last refresh succeeded is Tue Apr 28 12:11:34 GMT +1000
  6. We are going to create a very simple report and then a dashboard
    1. We then saved this report as AW Auto Update
  7. We then created a new Dashboard with the same name as above.
    1. NOTE: If you create a dashboard with the same name as your report, it will automatically put the pinned items into that dashboard.
  8. We then went back to our report from step 6 and pinned both items to our dashboard, as shown below:
  9. Now we are at the point to use Power Update and to test the auto updating of Excel Workbooks from One Drive for Business

Using Power Update to update our data as well as Upload to One Drive for Business

In this next section we are going to configure Power Update to refresh the data in our Power Pivot Model, and then automatically upload it into our OneDrive for Business folder.

NOTE: You can download and find out how to install Power Update from this blog post by Power Pivot Pro (Rob Collie): Introducing Power Update

  • They do have a free version available!
  1. Due to the way I brought the data into my Power Pivot Model in my Excel Workbook, I went and updated my source query against the AdventureWorksDW2014 database.
    1. Normally you would have your query be dynamic in terms of where it gets its source data from.
  2. For our example to show the automatic refreshing of data from One Drive for Business into Power BI, I am going to ensure that we have Sales for 2010 – 2013
  3. In the next steps I am going to show you how to configure Power Update to refresh your Excel Workbook.
    1. Open Power Update
    2. Click on New
    3. Give your Task a Name
      1. In our example it will be called AW Auto Update.
      2. Click Next
    4. For our example we set it to Daily (or Multiple times a day)
      1. Click Next
    5. On the Setup a daily schedule we just accepted the defaults
      1. Click Next
    6. On the Set up multiple times per day schedule, we set ours to run every 1 hour
      1. Click Next
    7. Now on the step to select the Destination Type, ensure you select SharePoint
      1. NOTE: Even though we are using OneDrive for Business, the underlying data is stored within SharePoint, so it will work.
      2. Click Next
    8. For our example we only have a single workbook, so on the Select source workbooks, we selected Update a single workbook
      1. Click Next
    9. On the Select workbook to update, select the location of your Excel Workbook on your file system or shared folder.
      1. Click Next
    10. Now on the SharePoint Settings, we have to configure the following.
      1. First click on New under Credentials
        1. Make sure you put a tick next to Office 365 / Power
        2. Click
      2. Now on the SharePoint Site URL you will need to put in the following:
        1. Go to your One Drive for Business Folder in your browser
        2. NOTE: Copy and paste the entire URL, Power Update is smart and it knows which part of the URL it requires.
        3. Then click on Select
          1. Now browse to the folder location where you previously uploaded your Excel Workbook in the previous section: Importing your Excel Workbook from One Drive for Business into Power BI and create a simple report and Dashboard
        4. So that once completed it will look similar to the following below:
      3. Click Next
    11. Now the Wizard is completed and you should see the following in Power Update
  4. Next we are going to manually run this now, so that we can show how it will auto refresh the data in Power BI once the process is completed.

Automatic Refresh of Power BI Report confirmation

Below we will see if Power BI refreshed the Excel Workbook from our One Drive for Business

  1. The first thing is to check whether our Excel Workbook has been updated via Power Update
    1. As we can see above it has been updated.
  2. Now we will have to wait for Power BI to go and check the Excel Workbook and update the reports and dashboard
    1. NOTE: From what I have read, the Dashboard tiles auto update every 10 minutes; I am not sure what the schedule is for Excel Workbooks in OneDrive for Business
    2. But there must be some mechanism in place that looks at and updates Excel Workbooks connected to OneDrive for Business
  3. I then went back to my Datasets and clicked on the Ellipses button and after about 4 – 5 minutes I saw the following:
    1. As you can see above the Last refresh succeeded time has changed from our previous value to Tue Apr 28 12:58:13 GMT +1000
  4. Now I then went back to our Dashboard and saw the following below
    1. As you can see from above, we now have the year 2010 (albeit small) as well as the increase in Sales


So in conclusion, you now have a process to fully automate getting data from your on-premise source Excel Workbook, and getting your reports to automatically update in Power BI.


Power Pivot and DAX in Power BI

Introduction: Something got me thinking the other day when I used the Google Analytics Connector and saw what looked to me like a Power Pivot Model, as well as an image. I know that in the current Power BI Designer, as well as when you import an Excel file into Power BI on the web front end, you do not get to see these options. So it got me thinking: how is Microsoft doing this?

After some playing around I found a simple and easy solution!

NOTE: This does not currently work with Excel 2016

What this enables you to do, is to have your entire Power Pivot Model, along with your DAX calculations, as well as the synonyms within your Power Pivot Model available in Power BI!

Below is a screenshot of my completed dashboard, after which I will explain how I did it, with an example of the Power Pivot Model, a DAX calculation, an image, as well as synonyms!

I have used data from the AdventureWorksDW2014 database to get data into my Power Pivot Model.


How to get Power Pivot, with DAX calculations and Synonyms into Power BI

You guys are going to love how simple this really is to get completed.

  1. Create your Power Pivot Model, DAX Calculations and if required your synonyms.
    1. NOTE: You can load your data from Power Query into your Data Model (which is Power Pivot)
  2. Then what I did was to first create a really simple Pivot Table with a chart.
  3. Next I created a Power View report, by going into the Insert Ribbon and then clicking on Power View, after which I created my Power View Report as shown below
  4. Then I saved my Excel Workbook.
  5. Next I uploaded the Excel Workbook to Power BI.
    1. NOTE: I uploaded it directly from my PC, as well as from One Drive for Business and they both achieved the same results.
    2. I had the following while it was being imported
  6. Now before I go on, I wanted to show you what my Power Pivot Model looked like in Excel.
    1. From the above screenshot, you will see that I have created two DAX Calculations
      1. Sales Amount
        1. This is just a sum of the SalesAmount
      2. PM Sales
        1. This is a DAX Calculation that I found, which I then used and applied to my data.
      3. NOTE: The two DAX Calculations above are there to show that they can then be used within Power BI
  7. Now once I had uploaded my Excel Workbook, I went into the report in Power BI and saw the following:
    1. As you can see above this is identical to my Power View report in Excel.
  8. Then if I click on Edit Report, I see the following in Power BI. As you will notice below, I added an additional Combo Chart, using the Sales Amount and PM Sales measures.
    1. As you can see above, this is my original Power View report with an additional chart added, using the DAX Calculations from my Power Pivot Model.
    2. NOTE: You can see that it has my DAX Calculations of Sales Amount and PM Sales available and ready to be used.
  9. As you can see from above, you can now have a fully functional and working Power Pivot Model, with most of the Power Pivot features such as DAX Calculations and Synonyms available to be used with Power BI!
  10. So in a nutshell all that you need to do is to add a Power View report within your Excel workbook. Which will then enable the Power Pivot model to be available in Power BI

For the Synonyms to work, I simply added them within my Power Pivot Model, under the Advanced tab.
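To give an idea of the kind of calculations involved, below is a minimal DAX sketch of what the two measures described above might look like. The table and column names here are my own assumptions (a Sales table with a SalesAmount column, and a related Date table), and PM Sales is assumed to be a previous-month calculation:

```dax
-- Hypothetical sketch: simple sum of the SalesAmount column
Sales Amount := SUM ( Sales[SalesAmount] )

-- Hypothetical sketch: previous month's Sales Amount, assuming a marked Date table
PM Sales := CALCULATE ( [Sales Amount], PREVIOUSMONTH ( 'Date'[Date] ) )
```

Because the whole Power Pivot Model comes across, measures like these are then available to use in Power BI, exactly as shown in the steps above.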

Adding an Image for Power BI from Excel

  1. To get the image embedded, I simply added it to my Power View Report in Excel.
  2. Then once I uploaded my Excel Workbook, I could then Pin the image to my dashboard.

In future blog posts, I will explain how I got Dashboards working when they are not created automatically in Power BI.

Link to Excel File

Below is a link if you want to use my Excel File and then upload it to your Power BI site

Please note, the only reason for the image for my blog, is I had to put in some type of image and I could not think of anything else to put in there!

Adventure Works DW – Power BI Power Pivot Example.xlsx

SharePoint 2013 – Creating a data source to an SQL Server Analysis Services (SSAS) OLAP Cube

Below are the details for when you want to use SharePoint 2013 to connect to an SQL Server Analysis Services (SSAS) OLAP cube, instead of only being able to use SSAS Tabular.



  • In our example we are going to connect to our Adventure Works DW 2012 SSAS cube.


1. Go into your SharePoint 2013 website and then click on Files, then New Document, and then select Report Data Source.

2. This will then open the Data Source Properties window, where you will configure it as explained below.

   a. Where it says Name, put in a name so that you know it is your connection file.

   b. Next, where it says Data Source Type, click on the drop down and select Microsoft BI Semantic Model for Power View.

   c. Now where it says Connection string, below is the syntax that you have to use, with an explanation below:

Data Source=Server\InstanceName;initial catalog=AdventureWorksDW2012;cube='Adventure Works'

      i. From the syntax above, the Data Source part specifies the SSAS OLAP server name and, if required, the instance name.

         1. As with our example it was called: Server\InstanceName

      ii. The next section, initial catalog, is your SSAS OLAP cube database name.

         1. As with our example it is: initial catalog=AdventureWorksDW2012

      iii. The final part which you have to specify is the actual cube name.

         1. As with our example it is called: ;cube='Adventure Works'

   d. The next section is the Credentials; this is what will be used when running the data source.

      i. What we have set up in our environment is a static domain account where the password does not change. This is so that we can use this account and grant it permissions to the required cubes as well as roles.

      ii. NOTE: We also did this because it is a lot simpler to configure the security to our SSAS OLAP cube via the standard roles.

      iii. So with our example we selected Stored Credentials and put in our User Name and Password:

         1. NOTE: We selected Use as Windows credentials so that it knows it is a Windows Domain account.

      iv. Then click on Test Connection to make sure the connection to the cube, as well as the credentials, are correct.

3. Then click OK to create the Report Data Source.

4. NOTE: If you are concerned about security for people using this data source to see specific data, you can do one of two things:

   a. ONE: Change the above credentials to Windows Authentication.

   b. TWO: You can also limit who has access to the actual Report Data Source, the Power View Report, as well as the folder that stores all the information, and set unique permissions.

5. Now, as you can see below, we created a quick and easy Power View Report on our SSAS OLAP instance.

Power View – Using MDX, SSAS Tabular and Power View to display potential issues based on past trends in data

What I am going to explain below is how to use MDX, SSAS Tabular and Power View to display potential issues based on past trends within our data.


For our example we are going to use SCOM data. Within our example we are looking for systems where the available memory has dropped by more than 50% of what it was previously sitting at.

  • So for example, if a server went from 1024 MB of Available Memory and dropped to 400 MB, we would want to see this.

  • And if our SCOM Alerts only looked for servers with less than 200 MB of Available Memory, it would not have alerted yet.


This is because it could highlight something going on a server that is not currently an issue but COULD become an issue.


Whilst the final Power View Report might look very easy to understand and simple, it is really valuable and enables the end users to quickly identify issues.


NOTE: With the above information, the server has not breached the SCOM threshold, so no alerts have been created. But by the time the SCOM Alert comes through, it might be too little too late.


NOTE: You can download, install and configure the SCOM data warehouse and cube from the following location, which I have previously blogged about:


MDX Query – SCOM Cube

The first part is to create and write the MDX Query which we will later use in our Tabular Model in order to get the data we require.


1. As with our example, we created our MDX query to get the output we required.

2. The first query below is where we are looking for servers with a memory issue, as explained above:

With member [Measures].[PreviousWeekAverage] as
       Avg((strtomember("[Date].[full Date].&["+format(now()-7,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]")),[Measures].[SampleValue])

       member [Measures].[CurrentDayAverage] as
       (strtomember("[Date].[full Date].&["+format(now(),"yyyy-MM-dd")+"]"),[Measures].[SampleValue])

       member [Measures].[Diff] as
       ([Measures].[CurrentDayAverage] - [Measures].[PreviousWeekAverage]) / [Measures].[PreviousWeekAverage] * 100

Select non empty {[Measures].[CurrentDayAverage],[Measures].[PreviousWeekAverage],[Measures].[Diff]} on Columns,
non empty {
       Filter(
              Filter(
                     ([Counter].[Counter Hierarchy].[Counter Name].&[Memory]&[Available MBytes]
                     ,[Managed Entity].[Display Name].children
                     ,strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]"))
              ,[Measures].[Diff] > -100000)
       ,[Measures].[Diff] < -50)
       } on Rows
From [SCOM Daily]

a. As you can see from the above query, we have created the following:

   i. Our measures to compare the current day's average to the past week's average.

   ii. We have then put in filters so that it will only bring back servers where the Available Memory has dropped by more than 50%, as well as ensure that we do not get any random rows where the SCOM data is corrupt or out of whack.

b. NOTE: This query will return just the specific Display Names or Server Names.

3. In our next query we are basing it on the query above, but this time we are bringing back all the detailed data, so that it can be put into Power View in order to see the history.

With member [Measures].[PreviousWeekAverage] as
       Avg((strtomember("[Date].[full Date].&["+format(now()-7,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]")),[Measures].[SampleValue])

       member [Measures].[CurrentDayAverage] as
       (strtomember("[Date].[full Date].&["+format(now(),"yyyy-MM-dd")+"]"),[Measures].[SampleValue])

       member [Measures].[Diff] as
       ([Measures].[CurrentDayAverage] - [Measures].[PreviousWeekAverage]) / [Measures].[PreviousWeekAverage] * 100

Select non empty { [Measures].[SampleValue] } on Columns,
non empty {
       Filter(
              Filter(
                     ([Counter].[Counter Hierarchy].[Counter Name].&[Memory]&[Available MBytes]
                     ,[Managed Entity].[Display Name].children
                     ,strtomember("[Date].[full Date].&["+format(now()-15,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]"))
              ,[Measures].[Diff] > -100000)
       ,[Measures].[Diff] < -50)
       } on Rows
From [SCOM Daily]

a. Even though the queries look very similar, there are differences.

   i. In the actual Select statement we are now only putting in the actual measure of the Sample Value.

   ii. We have also put in our date range, so that we can get the history for the past 15 days.

4. We will then put these queries into SSAS Tabular in the next steps.


SSAS Tabular

Next we are going to create our Tabular model in SSAS so that we can then use this in order to create our Power View Report.


1. Open SQL Server Data Tools (SSDT) and create a new Analysis Services Tabular Project.

   a. We gave it the name: SCOM Potential Server Issues

2. Create a new Import Data Source and select the relevant source type.

   a. Then configure this to connect to your SSAS Instance.

      i. With our example we connected to our SSAS Instance where our SCOM Daily cube is.

   b. Next we configured which account to use for the Impersonation Information.

   c. Next is where we will put our first MDX Query from Step 2 above.

      i. As you can see below with our SCOM example:


3. We then clicked Finish to import the data.

4. Now in the model designer we renamed our columns so that they would be more descriptive, as shown with our example below:

   a. We also created a measure called: Percentage Available Mbytes

      i. NOTE: The reason that we created this is so that we can use it in our Power View Report later.

   b. And we also formatted the Full Date column with the Short Date format.

5. Next we are going to use our second query from step 3 above for the detailed information.

   a. As with our example this is the actual information from SCOM.

   b. We once again renamed the columns and formatted them as per our requirements.

   c. We also created a measure called: Available Mbytes

      i. NOTE: The reason that we created this is so that we can use it in our Power View Report later.

   d. And we also formatted the Full Date column with the Short Date format.

6. Now the crucial step is where we created the relationship between the two queries.

   a. NOTE: The reason that we do this is so that later, when creating the Power View Report, you can select the server name and it will show you the history on the same chart.

   b. In SSDT, click on the Diagram View.

   c. As with our example, we had to create a relationship within our SCOM Data.

   d. Then create the relationship between the Display Names in the two queries as shown below.

      i. NOTE: This will work because in our sheet called “SCOM_Available_Memory_Overview” there are only unique Display Names.

7. Now you would deploy your SSAS Tabular model to your server.
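As a rough illustration, the two measures mentioned in the steps above could look something like the following DAX. This is only a hypothetical sketch: the table and column names are assumptions based on the renamed columns described above, and Percentage Available Mbytes is assumed to be the current day average expressed as a ratio of the previous week average (formatted as a percentage in the model):

```dax
-- Hypothetical sketch: current day average relative to the previous week average
Percentage Available Mbytes :=
DIVIDE (
    SUM ( SCOM_Available_Memory_Overview[Current Day Average] ),
    SUM ( SCOM_Available_Memory_Overview[Previous Week Average] )
)

-- Hypothetical sketch: total of the Sample Value column from the detailed query
Available Mbytes := SUM ( SCOM_Available_Memory_Detailed[Sample Value] )
```

The actual formulas would depend on how you renamed the columns in your own model; the point is simply that these measures can then be dropped straight into the Power View Report.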


Power View Report

Now what we are going to do is to create our Power View Report based on our SSAS Tabular Project.


1. All the steps below are based on our SCOM Example, which you could incorporate into your solution.

2. Log into your SharePoint Server and create your connection file to your SSAS Tabular database.

3. Then click on your connection file to create your Power View Report.

4. First we configured it from the SCOM_Available_Memory_Overview query.

   a. Next we configured the overview for the Available Mbytes, which we created as a bar chart.

   b. Once the bar chart had been configured, we then changed the sorting to the following below:

      i. NOTE: The reason for this is because you want to see the systems that have had the largest drop at the top.

   c. Once completed it looked like the following:


5. Next we created the detailed view below using the SCOM_Available_Memory_Detailed query.

   a. Next we configured this to use the Line Chart.

   b. This is what it will look like once completed.


6. Finally, the trick that makes this work so well is that if you click on the bar chart at the top, it will then only show the relevant line chart with the details below.

   a. So for example, if you clicked on the first bar, you can see that it has the largest Memory drop, as shown below.

   b. The thing to note is that SQLServer.Domain.COM on the first chart looks like there was not a large drop in memory. But if you click on the bar chart for SQLServer.Domain.COM, you will see that the detailed view now shows a much more meaningful picture.

      i. NOTE: Look at the orange line in step 5 above.


7. Now save your Power View Report.

8. And finally, if required, create an SSIS Package to refresh your Tabular data as required.