BI-NSIGHT – Power BI (Desktop Update, Service Updates, API Updates, Mobile App Update, Visual Contest Results, Content Pack – Stripe) – Office 2016 (Excel Updates, Power Query Update, Excel Predictions) – SQL Server Analysis Services 2012 Tabular Update – SQL Server Analysis Services 2016 Extended Events

So this week there was once again a whole host of updates with Power BI, as well as finally the official release of Office 2016.

It sure is a busy time to be in the Business Intelligence space, especially in the Microsoft ecosystem.

So let’s get into it…

Power BI – Desktop Update

So I woke up this morning to see that there has been a massive release in the Power BI Desktop application. I immediately downloaded and installed the update. I have already used some of the features today.

There is a whole host of updates, too many to go through here, so I will just highlight the ones that I think are really great additions.

I am really enjoying the Report Authoring features. I have mentioned it before, but the drill up and drill down capability is great, and it allows a report to hold more detail than you initially see at face value.

Then, under the data modelling section, I have to say that I am no DAX guru, but I do appreciate how powerful it is, and how you can really extend your data with so many DAX functions. In particular there is the Calculated Table, which Chris Webb has already blogged about with some great information here: Calculated Tables In Power BI
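To give a feel for what a calculated table looks like, here is a small hedged sketch (the Sales table and Category column are names I have made up for illustration, not from Chris Webb's post): you assign a DAX table expression to a new table, which then behaves like any other table in the model.

```dax
-- Hypothetical example: a calculated table holding only the bike rows
-- of an assumed Sales table, so it can be related and sliced on its own.
Bike Sales =
FILTER ( Sales, Sales[Category] = "Bikes" )
```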

There are also some great new features with regards to Data Connectivity, as well as Data Transformations & Query Editor improvements, all of which form part of Power Query. Once again this enables report authors to enrich the data, which in turn makes for great visualizations.

You can find out all about all 44 updates here: 44 New Features in the Power BI Desktop September Update

Power BI – Service Updates

So yet another great update on the Power BI platform.

I think that finally being able to customize the size of the tiles is really good, so that you can fit more meaningful information on your dashboard.

Another great service update is being able to share read-only dashboards with other users. This is great because you often create dashboards and reports that you want to share with users so they can interact with your data, but not make any changes.

There are also more sample content packs, which are a great way to showcase how powerful Power BI is.

You can read about it and all other updates here: Power BI Weekly Service Update

Power BI – API Updates

There is no pretty picture for the API updates, but there are some great new features.

The one that I think is really good is the ability to embed a Power BI tile into an application. It gives you the great features of Power BI inside your own application, without having to go directly to Power BI.

I also see that there is the ability to pass filters or parameters into the embedded tile via the URL, which is really good and can extend the functionality.
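To make that concrete, here is a rough, hypothetical sketch of what embedding a filtered tile can look like: the embed URL comes back from the Dashboards/Tiles APIs, and a filter clause is appended to it. The IDs and the table/field names below are all made up for illustration; check the API documentation for the exact format.

```html
<!-- Hypothetical sketch: embed a Power BI tile in an iframe.
     The dashboard/tile IDs and the Sales/Region filter are invented. -->
<iframe width="600" height="400"
        src="https://app.powerbi.com/embed?dashboardId=abc123&tileId=def456&$filter=Sales/Region eq 'West'">
</iframe>
```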

These are the details of the API updates:

  • Imports API
  • Dashboards API
  • Tiles and Tile API
  • Groups API
  • Integrating Tiles into Applications
  • Filtering Tiles integrated into your Application

You can find out all about it here: Power BI API updates roundup

Power BI – Another Mobile Update

The Microsoft team must be working 24 hours a day, given the number of updates and additions coming out of Microsoft.

They have released some additional updates to the mobile application, which is great; as we are all well aware, a mobile application really helps showcase your solution, as well as ensuring that it gets to the right users and that they can see the related information.

You can find out about the updates for iOS, Windows and Android here: Power BI mobile Mid-September updates are here

Power BI – Visual Contest Results – People’s Choice Awards

As you can see above, this is the first People's Choice Award in the Visual Contest, and it is a visual I can see myself using with my existing data.

You can find out about it and the other entries here: Power BI Best Visual Contest – 1st People’s Choice Award!

Power BI – Content Pack Stripe

Once again this was another great content pack update this week.

There are a whole host of people using the Stripe payment platform for their online businesses, from the really small players to the big ones. This content pack gives everyone a really great and easy way to understand and visualize their data, as well as what payments they are receiving.

You can find out more about it here: Monitor and Explore your Stripe data in Power BI

Office 2016

I am very happy to see that Office 2016 has been released before the actual year of 2016.

I have mainly been focused on the features in Excel, since I work in BI. But there are a whole host of updates, fixes and new additions across Office 2016.

You can read all about it and all the details here: The new Office is here

Excel 2016 – Features

With Office 2016 being released this blog post from the Office team shows a lot of the great features that are available in Excel 2016.

It does showcase a lot of great features focused on Business Analysts, as well as how people can leverage all the new features in Excel 2016.

You can read up all about it here: New ways to get the Excel business analytics features you need

Power Query Update

With so many things going on within Power BI, there has been another great release with Power Query.

As you can see from the picture above, it is great to finally have the ability to write your own custom MDX or DAX query to get the data that you require from your SSAS source.

Another great feature is the ability to extract a query from one of the steps within your current Power Query query, and then use it in another Power Query window. As they say, it lets you reuse the same code really easily, without having to write it all over again.
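As a rough sketch of that reuse idea (assuming an existing query called BaseSales, which is a name I have made up), one query can simply reference another and carry on transforming from there, so the shared steps live in one place:

```m
let
    // Start from the shared query instead of repeating its steps
    Source = BaseSales,
    // Apply further transformations specific to this query
    FilteredRows = Table.SelectRows(Source, each [Amount] > 1000)
in
    FilteredRows
```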

You can read all about it here: Power Query for Excel September 2015 update

Excel Future Prediction

I recently came across a really interesting article in which some of the industry experts in the BI space predict how Excel will be used, and where they see it fitting into the BI space in the upcoming years.

There were some really insightful and interesting details, which made me think about how Excel has evolved over the years, and with the current additional investments going into Excel, how this is going to be leveraged and improved in the years to come.

You can read all about their thoughts here: 27 MICROSOFT EXCEL EXPERTS PREDICT THE FUTURE OF EXCEL IN BUSINESS INTELLIGENCE

SQL Server Analysis Services 2012 – Tabular Update

There has been a cumulative update for SQL Server 2012, and one of the great fixes relates to SSAS Tabular performance for columns that have a high cardinality, which was an issue before this release.

It is great to see that this has been addressed, especially because in SSAS Tabular there will be cases where columns have a high cardinality. And even though it is often super quick, we would like everything to be as fast as possible.

You can read all about SQL Server 2012 Cumulative Update 8 here: Cumulative update package 8 for SQL Server 2012

SQL Server Analysis Services 2016 – Extended Events

I think that it is great to see that we are finally getting some additional features and updates to Analysis Services.

When I read up about Extended Events, it was something really great to see. I have actually been in an exercise to monitor what has been going on with our SSAS instance, both in terms of performance and in terms of which users are accessing the cubes and what they are doing. Currently there is not a super elegant solution to achieve this.

Extended Events make this a lot easier and give you the ability to quickly get the information that you require.

I also love that you are able to watch a live stream of the query events, which you can use to see what you are running, as well as to ensure that you are capturing the right events.

This is definitely something that I will be looking to use when we finally can install and use SQL 2016.

You can find out about all the details here: Using Extended Events with SQL Server Analysis Services 2016 CTP 2.3

BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen in pictures on Twitter, or have read about. And since the preview has not even been released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing; even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What it means, in my mind, is that you can now get the data scientists to interact and test their R scripts against the data as it sits in the transactional environment. And from there, this could be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about, as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic, and it means that virtually any data source can now be consumed in SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has needed an overhaul for some time, and it seems that we are finally going to get it. It would appear that the parameter panel is going to be updated; I do hope that it will be more interactive and react similarly to the way slicers do in Excel. We are also getting new chart types, and here again I am going to assume that they will be similar to what we have in Power BI!

And finally, there was also mention that reports will render quicker, as well as have a better design. I am also hoping that they will deliver this using HTML5, so that it can be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. Putting this into the SQL Server engine makes it a whole lot simpler to query unstructured data. Using T-SQL to query this data means that the many people who have invested time and effort in SQL Server can now leverage that investment through PolyBase. Along with this, you do not have to extract the data from Hadoop or another format into a table before querying it: you can query it directly and then insert the rows into a table, which makes development that much quicker.
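The pattern described above can be sketched in T-SQL roughly as follows. This is a hedged sketch only: the table, data source and file format names are assumptions of mine, not from the announcement, and the exact options belong to the external data source and file format you define.

```sql
-- Hedged sketch of the PolyBase pattern: expose Hadoop data as an
-- external table, then query it with plain T-SQL.
CREATE EXTERNAL TABLE dbo.WebLogs
(
    LogDate   DATETIME,
    Url       NVARCHAR(400),
    Visitors  INT
)
WITH
(
    LOCATION    = '/logs/',           -- folder in Hadoop/HDFS
    DATA_SOURCE = MyHadoopSource,     -- assumed external data source
    FILE_FORMAT = MyDelimitedFormat   -- assumed external file format
);

-- Land only the rows you need in a regular table, no extract step:
SELECT LogDate, Url, Visitors
INTO   dbo.WebLogs_Staging
FROM   dbo.WebLogs
WHERE  Visitors > 100;
```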

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to build on their existing work on In-Memory OLTP and the columnstore index. They have mentioned that in their testing this amounts to up to a 30x improvement with In-Memory OLTP, and up to 100x for the in-memory columnstore. This is really amazing and makes everything run that much quicker.

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. It mentions that 85% of current SAP customers are not likely to deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that to make the change would require companies to almost start again for this implementation. In any business where time is money, this is a significant investment.

Compare this to what Microsoft has done with In-Memory tables and columnstore indexes, which can be used interchangeably: although some additional work is required, on the whole it is quick and easy to make the changes. Couple this with what Microsoft has been doing with Microsoft Azure, and it makes it so easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting upgrades in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS there are also some new features; namely, Power Query is also going to be integrated into SSIS. This is once again wonderful news, as it means that SSIS will also be able to get data from virtually any source!

Power BI

Once again the guys on the Power BI team have been really busy, and below is what I have seen and read about what has been happening within Power BI.

Office 365 Content Pack

I would say that there are a lot of businesses using Office 365 in some form or other, so it makes perfect sense for Microsoft to release a content pack for Office 365 administration.

As you can see from the screenshot below, the dashboard gives a quick overview of what activity is happening, as well as details of other services. I am sure this will make it really easy to see the state of your Office 365 systems at a glance, and any potential issues can also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online. I personally do not use it currently, but it does look great for people to once again have an overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, what you can view once you have got it setup within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I do not think there are many other software companies doing this at the moment, let alone rolling changes out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great, as it will allow report authors more control over how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know from personally working with Power BI reports that sometimes you just want to see what a different layout looks like, or try adding another element. The Undo / Redo buttons save that little bit of time, and make the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter, which someone must have taken at the Microsoft Ignite Conference. As you can see it is not very clear, but it does show the next update in the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that in the Power BI service, there will be support for SSRS files, namely the .rdl files! Here is another picture taken from Microsoft Ignite.

 

Then there are many-to-many relationships and bi-directional cross filtering, which will be supported in SQL Server 2016 tabular models. I am sure this will also be included in the new Power BI backend, as this is where all the data is stored.

 

For hybrid BI, there will be support for live querying of SSAS; currently this is already in place for SSAS Tabular.

 

It also looks like there will be scheduled refresh with a SQL Server database as a source, which is great for people who either do not have SSAS cubes, or just want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related: with Azure Stack you can get the Azure functionality on your own hardware, which is fantastic. I am sure a lot of businesses will welcome this.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter

 

SSIS – Using SQL Server Analysis Services (SSAS) MDX query into Variable in Execute SQL Task

What I wanted to do was to take the output from an MDX query and put it into an SSIS variable, which could then be used later in the SSIS package.

 

Example:

·         We are going to get the Internet Sales Amounts for 2 days.

o    01 June 2008 – $44,650.70

o    02 June 2008 – $35,496.03

·         Then we are going to put the two values into a variable.

·         Then we are going to use a precedence constraint to compare the values, and if the more recent Internet Sales amount is lower, continue on to the next step.

o    So if the value on 02 June 2008 is lower than the value on 01 June 2008 continue to the next step.

o    So as from the above values it should continue onto the next step.

o    If the amount is higher then do not do anything.

·         NOTE: The reason that we are using an MDX query is so that we can get our data back as quickly as possible.

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

1.       The first thing that we are going to do below is get our MDX query which connects to the AdventureWorksDW2012 SSAS Cube.

a.        This query is for Internet Sales for 01 June 2008

SELECT NON EMPTY {[Measures].[Internet Sales Amount]} ON 0,
NON EMPTY {([Date].[Date].&[20080601])} ON 1
FROM [Adventure Works]

b.       This query is for Internet Sales for 02 June 2008

SELECT NON EMPTY {[Measures].[Internet Sales Amount]} ON 0,
NON EMPTY {([Date].[Date].&[20080602])} ON 1
FROM [Adventure Works]

2.       Now create your new SSIS Package.

3.       Next we will create our 2 variables that will be used to populate using the MDX queries above.

4.       Below is how we have created them:

a.        clip_image002[1]

b.       NOTE: We have created them as Int64 because we are getting back a numeric value.

                                                               i.      And this can also be compared in our Precedence Constraint

5.       Next we need to create an OLE DB Connection to our SSAS Cube by doing the following below:

a.        Right click in your Connection Managers section and select New OLE DB Connection

b.       When the window opens click on New

                                                               i.      Now from the drop down select the following:

1.       clip_image004[1]

2.       NOTE: You have to select the above so that we can connect to our SSAS Instance.

                                                              ii.      Now once configured you will see the following below:

1.       clip_image006[1]

                                                            iii.      If you want you can click on Test Connection to ensure that it can connect.

                                                            iv.      Then click Ok and then Ok again.

c.        You should now see the following in your Connection Managers Window:

                                                               i.      clip_image008[1]

                                                              ii.      NOTE: If required you can make this a Project Connection if your SSIS Project is in Project Deployment mode.
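For reference, the connection that the dialog builds boils down to an OLE DB connection string along these lines. This is a minimal sketch: the server and database names are placeholders for your environment, and the exact provider version (MSOLAP.4, MSOLAP.5, …) depends on what is installed.

```text
Provider=MSOLAP;Data Source=ServerName\OLAP;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI
```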

6.       Now what we are going to do is to assign our MDX Query to our variables using the Execute SQL Task.

7.       Drag in an Execute SQL Task and complete the following steps for One Day back

a.        As per our example I renamed the Execute SQL Task to the following:

Get Internet Sales Amount for One Day Back into Variable

b.       I went into the Execute SQL Task Properties and configured it with the following:

                                                               i.      clip_image010[1]

c.        We then selected our OLE DB connection that we created in step 5 above:

                                                               i.      clip_image012[1]

d.       Next under SQLStatement we are then using our MDX query from Step 1a above:

                                                               i.      clip_image014[1]

                                                              ii.      Then click Ok.

e.       Next click on the Result Set and click on Add

                                                               i.      Now where it says Result Name change this to the following:

[Measures].[Internet Sales Amount]

                                                              ii.      And then ensure that the Variable Name is:

User::InternetSales_OneDayBack

                                                            iii.      clip_image016[1]

f.         Then click Ok.

g.        Now right click and select Execute Task to test that the variable will be populated.

h.       clip_image018[1]

8.       Drag in an Execute SQL Task and complete the following steps for Two Days back

a.        As per our example I renamed the Execute SQL Task to the following:

Get Internet Sales Amount for Two Days Back into Variable

b.       I went into the Execute SQL Task Properties and configured it with the following:

                                                               i.      clip_image010[2]

c.        We then selected our OLE DB connection that we created in step 5 above:

                                                               i.      clip_image019[1]

d.       Next under SQLStatement we are then using our MDX query from Step 1b above:

                                                               i.      clip_image021[1]

                                                              ii.      Then click Ok.

e.       Next click on the Result Set and click on Add

                                                               i.      Now where it says Result Name change this to the following:

[Measures].[Internet Sales Amount]

                                                              ii.      And then ensure that the Variable Name is:

User::InternetSales_TwoDaysBack

                                                            iii.      clip_image023[1]

f.         Then click Ok.

g.        Now right click and select Execute Task to test that the variable will be populated.

h.       clip_image025[1]

9.       Now the final step is to add a task that runs only if the value is lower.

10.    For our example we are going to use a Send Mail Task as the next step.

a.        So we dragged in the Send Mail Task and configured it so that it can send an email if run.

b.       Next we dragged the success precedence constraints as shown below:

c.        clip_image027[1]

d.       NOTE: The reason it was done in this order is so that we can get both variables populated before we change the constraint options.

e.       Now double click on the success precedence constraint that goes from Get Internet Sales Amount for Two Days Back into Variable to the Send Mail Task

                                                               i.      clip_image029[1]

f.         Now in the Precedence Constraint Editor Window complete the following:

                                                               i.      Change the Evaluation operation from Constraint to Expression

                                                              ii.      clip_image031[1]

g.        Then where it says Expression click on the Ellipses button.

h.       Now, to check that the 02 June value (InternetSales_TwoDaysBack) is lower than the 01 June value (InternetSales_OneDayBack) before continuing, we put the following:

@[User::InternetSales_TwoDaysBack] < @[User::InternetSales_OneDayBack]

                                                               i.      Click on Evaluate Expression to ensure that it is valid.

                                                              ii.      clip_image033[1]

i.         Then Click Ok and Ok again to go back to the Control Flow

11.    Now finally run your SSIS Package and it should complete with the following as shown below:

12.    clip_image035[1]

13.    NOTE: If you changed your precedence constraint to the following below and re-ran the package, in order to test the case where the value was higher, you would get the following result:

a.        clip_image037[1]

 

Book Review – Expert Cube Development with SSAS Multidimensional Models

http://www.packtpub.com/expert-cube-development-with-ssas-multidimensional-models/book

This book is a must-read if you are looking for useful and practical information on your SSAS Multidimensional models. It must be noted, though, that this is not an introduction to SSAS Multidimensional modelling; rather, it is practical real-world examples.

I found that the book is a wealth of information that can be applied to your working environment. There are a lot of real-world examples, which prompted me to check and update my own SSAS Multidimensional models so that they can be that much faster.

It is also a great reference book: when you run into a specific issue, you can find the relevant requirement and how to solve it.

I also really enjoyed the book because it gets straight into a requirement and then explains how to solve or improve it. Along with this, I found it great how the authors applied their working experience, describing what they encountered and how they overcame each issue.

It is written by people I consider to be some of the leaders in SSAS Multidimensional modelling. You can see from the depth of detail in the book, as well as from their examples, that these scenarios come up often in real working environments, and the authors show how to get the best performance from your SSAS Multidimensional models.

A great read and I would recommend this book to anyone who is looking for a wealth of information as well as some great insight into the inner workings of SSAS Multidimensional models.

Excel – User Input for Custom MDX query using VBA and Pivots

I had a requirement where I wanted to do the following for a particular power user:

·         Give them the ability to put in a Date as a parameter in an Excel Workbook.

·         Once I have this date, a custom MDX query will then go and get the required data for that date.

o    NOTE: The reason for the custom MDX query was the volume of data returned; I wanted to limit the data queried, thereby making the results return as quickly as possible.

·         This must then be queried and returned to Excel, where the results could then be pivoted.

·         After which the user could then create his Pivot and chart based on his date.

·         And essentially for the user to just input his date and click a button.

 

Working Example

·         I wanted the user to put in a date (01 June 2008), which will then return all the Products and Product Lines for the date entered.

·         The outcome will be in the same sheet: once the user has clicked on the Click to Refresh button, their Pivot and Chart will be updated.

 

NOTE:

·         VBA is new to me, so I am aware that there is no error checking in place.

·         I also do not validate that the date entered is in the correct format. I am going to assume that I have explained to my user what date format to use.

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Creating the VBA Function within Excel

The first part is to hard code the VBA function to ensure that we can get our data returned correctly to our Sheet in Excel.

 

NOTE: You might have to enable the Developer Ribbon, which you can reference here: http://office.microsoft.com/en-za/excel-help/show-the-developer-tab-HA101819080.aspx

 

1.       Click on the Developer Ribbon and then click on Macros

a.        clip_image004[4]

2.       This will then open the Macro Window

3.       Where it says Macro Name, put in a name for the Macro you are going to create.

a.        As with our Example I put the following Macro Name:

b.       clip_image006[4]

c.        Then click on the Create Button

4.       This then opens the Microsoft Visual Basic for Applications Window

5.       You will now see on the left hand side under Modules, it has Module 1

a.        clip_image008[4]

6.       On the right-hand side is where we will be putting our VBA code for AW_ProductsByDate

a.        clip_image010[4]

7.       The next thing that you HAVE to do is add the Microsoft ActiveX Data Objects 2.8 Library to your VBA project, otherwise some of the code later will not be recognized.

a.        To do this click on Tools and then Reference

                                                               i.      clip_image012[4]

b.       Once this opens the References – VBAProject, you are going to have to scroll down quite a bit until you find the following:

                                                               i.      clip_image014[4]

                                                              ii.      As shown above put a tick next to Microsoft ActiveX Data Objects 2.8 Library

                                                            iii.      Then click Ok.

c.        Now if you want you can save your workbook.

d.       NOTE: when saving your workbook you have to save it as an Excel Macro-Enabled Workbook as shown below. If this is not done then your Macro will not run.

                                                               i.      clip_image016[4]

8.       In the sections below I will explain each piece of the code, and at the end I will have the entire VBA function so that you can copy and paste it.

9.       In the first section we are just declaring our variables to use later.

'Declare variables

Set objMyConn = New ADODB.Connection

Set objMyCmd = New ADODB.Command

Set objMyRecordset = New ADODB.Recordset

Dim Input1 As String

'Input1 = ActiveWorkbook.Sheets("Sheet1").Range("A2")

a.        What we are doing above is declaring our objects and setting up our connections.

b.       Then where we have Dim Input1, we are creating our input variable

                                                               i.      NOTE: This has been commented out, and the later steps will explain how to get it working.

                                                              ii.      Dim Input1 As String

                                                            iii.      Then we are setting the location of our Input1 variable

ActiveWorkbook.Sheets("Sheet1").Range("A2")

1.       NOTE: The above means: on our active workbook, on the sheet named "Sheet1", look in cell A2

                                                            iv.      We will only be using the input later once we have got everything configured.

10.    The next section is where we create and open our connection to our SQL Server Analysis Services database

'Open Connection

objMyConn.ConnectionString = "Provider=MSOLAP.5;Integrated Security=SSPI;Persist Security Info=True;Initial Catalog=AdventureWorksDW2012;Data Source=ServerName\OLAP;MDX Compatibility=1;Safety Options=2;Protocol Format=XML;MDX Missing Member Mode=Error;Packet Size=32767"

objMyConn.Open

a.        NOTE: In the above connection string you will have to modify the following for your environment.

b.       For Initial Catalog highlighted in RED you will have to change this to the name of your SSAS Database.

                                                               i.      In our example it was:

1.       Initial Catalog=AdventureWorksDW2012

2.       NOTE: Ensure that you use the database ID, and not the name, so that you can connect to the database correctly.

c.        For the Data Source, this is your Server name, and in our example it is highlighted in BLUE

                                                               i.      Data Source=ServerName\OLAP

d.       The other settings are optional settings for if data is going across a slower network and to potentially speed up queries.

11.    The next section is where we actually put our MDX query

'Set and Execute SQL Command

Set objMyCmd.ActiveConnection = objMyConn

objMyCmd.CommandText = "Select {[Measures].[Internet Sales Amount]} on 0 " & _

                        ", {( " & _

                                    "[Date].[Date].&[20080601], " & _

                                    "[Product].[Product Line].Children, " & _

                                    "[Product].[Product].Children " & _

                                    ")} on 1 " & _

                        "from [Adventure Works]"

objMyCmd.CommandType = adCmdText

objMyCmd.Execute

a.        As you can see above, the section that you need to change is after objMyCmd.CommandText

b.       A thing to note is that you have to wrap each section of your query in double quotes

c.        And if you are going to continue the query on another line, rather than having it all on one line, then at the end of each line you need to put the following:

                                                               i.      " & _

d.       With our query above you can see for now that we have hardcoded our query.

e.       NOTE: We will come back later and update it to get the details from the Input1 variable we created in step 9 above.

f.         The rest of the syntax just executes the MDX query.
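To make the concatenation easier to follow, here is a small sketch (Python used purely for illustration) that builds the same string the VBA line continuations produce; note the trailing spaces inside each fragment, which keep the MDX tokens from running together:

```python
# Each quoted fragment below corresponds to one "..." & _ line in the VBA;
# concatenating them yields the final MDX statement sent to SSAS.
mdx = ("Select {[Measures].[Internet Sales Amount]} on 0 "
       ", {( "
       "[Date].[Date].&[20080601], "
       "[Product].[Product Line].Children, "
       "[Product].[Product].Children "
       ")} on 1 "
       "from [Adventure Works]")
print(mdx)
```

If a fragment is missing its trailing space (for example ")} on 1" followed directly by "from"), the tokens fuse together and the query will not parse.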

12.    The next section just opens the recordset

'Open Recordset

Set objMyRecordset.ActiveConnection = objMyConn

objMyRecordset.Open objMyCmd

13.    Now in this next section we are going to copy the data to Excel

'Copy Data to Excel

ActiveWorkbook.Sheets("Dataset").Range("A2").CopyFromRecordset (objMyRecordset)

'ActiveWorkbook.Sheets("Dataset").Range("ProductsByDate").CopyFromRecordset (objMyRecordset)

a.        As you can see above, what we are doing is copying from the recordset to a sheet called "Dataset", inserting the data starting at cell "A2"

b.       NOTE: You will see that there is a section commented out below. We will come back and change this in further steps.

c.        What we now need to do is to create the sheet that is referenced above, called "Dataset"

d.       If you click on your Excel Workbook, and the plus sign next to Sheet1, it will create Sheet2

                                                               i.      clip_image018[4]

e.       Right click on Sheet2 and select Rename

                                                               i.      Rename it to Dataset.

f.         Now we have created our Sheet which is referenced above.

14.    The next section is where we will refresh our Power Pivot model, which we will create in later steps.

'This is to refresh the Power Pivot Model

ActiveWorkbook.RefreshAll

15.    The final part is to close the connection and end our code.

'Close Connection

objMyConn.Close

End Sub

16.    Here is the entire code all in one block.

a.        NOTE: We will explain the Delete section in step 21 below

Sub AW_ProductsByDate()

'Delete all data out of Table before inserting new data

With Sheet2.ListObjects("ProductsByDate")

        If Not .DataBodyRange Is Nothing Then

            .DataBodyRange.ClearContents

            .DataBodyRange.Delete

        End If

    End With

 

'Declare variables

Dim objMyConn As ADODB.Connection

Dim objMyCmd As ADODB.Command

Dim objMyRecordset As ADODB.Recordset

Set objMyConn = New ADODB.Connection

Set objMyCmd = New ADODB.Command

Set objMyRecordset = New ADODB.Recordset

'Dim Input1 As String

'Input1 = ActiveWorkbook.Sheets("Sheet1").Range("A2")

 

'Open Connection

objMyConn.ConnectionString = "Provider=MSOLAP.5;Integrated Security=SSPI;Persist Security Info=True;Initial Catalog=AdventureWorksDW2012;Data Source=ServerName\OLAP;MDX Compatibility=1;Safety Options=2;Protocol Format=XML;MDX Missing Member Mode=Error;Packet Size=32767"

objMyConn.Open

 

'Set and Execute SQL Command

Set objMyCmd.ActiveConnection = objMyConn

objMyCmd.CommandText = "Select {[Measures].[Internet Sales Amount]} on 0 " & _

                        ", {( " & _

                                    "[Date].[Date].&[20080601], " & _

                                    "[Product].[Product Line].Children, " & _

                                    "[Product].[Product].Children " & _

                                    ")} on 1 " & _

                        "from [Adventure Works]"

objMyCmd.CommandType = adCmdText

objMyCmd.Execute

 

'Open Recordset

Set objMyRecordset.ActiveConnection = objMyConn

objMyRecordset.Open objMyCmd

 

'Copy Data to Excel

ActiveWorkbook.Sheets("Dataset").Range("A2").CopyFromRecordset (objMyRecordset)

'ActiveWorkbook.Sheets("Dataset").Range("ProductsByDate").CopyFromRecordset (objMyRecordset)

 

'This is to refresh the Power Pivot Model

ActiveWorkbook.RefreshAll

 

'Close Connection

objMyConn.Close

End Sub

17.    Now if you click on Run Sub/UserForm as shown below, your query should run.

a.        clip_image020[6]

b.       Then on your Dataset Sheet you should see the output of your query.

c.        NOTE: If the query only returns one row, ensure that your query is not set to only show non empty results.

d.       clip_image022[4]

18.    In the next steps we are going to give our results column names and convert it to a table.

19.    So on the sheet "Dataset", at the top we put in the following column names:

a.        clip_image024[4]

b.       Then what we did was to select the entire range, click on Format as Table, and format it as a table.

                                                               i.      clip_image026[4]

                                                              ii.      NOTE: Ensure to select My Table has headers

1.       clip_image028[4]

                                                            iii.      Click Ok

c.        You should now see this below:

d.       clip_image030[4]

e.       The final thing is on the top left hand side, where it says Table Name, to change this from Table1 to ProductsByDate

                                                               i.      clip_image032[4]

20.    Now the final step is to go back into your VBA Macro and change the following code:

'Copy Data to Excel

'ActiveWorkbook.Sheets("Dataset").Range("A2").CopyFromRecordset (objMyRecordset)

ActiveWorkbook.Sheets("Dataset").Range("ProductsByDate").CopyFromRecordset (objMyRecordset)

a.        You can see from above that we have now changed the destination of the returned dataset to the table name from step 19 above.

                                                               i.      NOTE: This is so that when the number of rows changes we will always get back all the rows.

21.    What we are doing in the last part is to delete all the data in our table each time it runs.

a.        This is to ensure that there is no left over or incorrect data

b.       This will go at the top of your code so that it will run first

'Delete all data out of Table before inserting new data

With Sheet2.ListObjects("ProductsByDate")

        If Not .DataBodyRange Is Nothing Then

            .DataBodyRange.ClearContents

            .DataBodyRange.Delete

        End If

    End With

 

Taking the output from our VBA function and creating our Pivot with a Chart

In the next section we are going to take the output from our VBA function which was inserted into a table and then create our Pivot and Pivot Chart on our First sheet.

 

1.       Click on your table that you created in your sheet called “Dataset”

2.       Then click on the Insert ribbon, and then click on Pivot Table

a.        clip_image034[4]

3.       This will then open the Create Pivot Table Window, which you will configure with the following:

a.        Where it says Choose the data that you want to analyze leave the default which is shown below:

                                                               i.      clip_image036[4]

b.       Then where it says Choose where you want the PivotTable report to be placed, click on Existing Worksheet

                                                               i.      Then where it says Location, click on the box to select your location.

                                                              ii.      Now click on your Sheet1, and click on cell A6

                                                            iii.      So it will look like the following:

1.       clip_image038[4]

c.        Finally where it says choose whether you want to analyze multiple tables, click the tick box which is shown below:

                                                               i.      clip_image040[4]

                                                              ii.      NOTE: The reason that we add it to the Data Model is so that it actually puts it into Power Pivot.

1.       You can validate this: if you click on your Power Pivot ribbon and then click on Manage, you will see your data.

d.       So this is what the entire Window looks like:

                                                               i.      clip_image042[4]

e.       Then click Ok.

4.       Now we can create our Pivot and our Pivot chart.

5.       I created the following below:

a.        clip_image044[4]

b.       NOTE: While this does look quite simple, it is sufficient for our example.

c.        And the actual Pivot looks like this:

d.       clip_image046[4]

 

Creating the variable and button for the user's input

Now in the final part what we are going to do is the following:

·         Define where we will get the user's data from.

o    Convert this to the correct format which we need for our MDX query

·         Update our VBA code so that we can use this within our code dynamically.

·         Put in a button which will refresh everything based on the user input.

 

1.       Here we will first create our column and format it as required for the user's input

a.        So what we did was to put in some text so that the user knows what to do

b.       clip_image048[4]

c.        Next we put in a date in A2

                                                               i.      EG: 2008-06-04

                                                              ii.      NOTE: What we need to do though is to ensure that we format cell A2 as Text.

1.       The reason for this is we are going to substitute the date to the correct format which we require for our MDX query

                                                            iii.      clip_image050[4]

d.       Next in cell B2 I put the following:

=SUBSTITUTE(A2;"-";"")

                                                               i.      And you will then see that the column now has the following

                                                              ii.      clip_image052[4]

                                                            iii.      Now what I did is I moved the Chart from above to the side.

                                                            iv.      I then cut and pasted the above Excel function and put it in cell D11

                                                              v.      clip_image054[4]

                                                            vi.      NOTE: The reason that I did this is so that the cell value would be hidden from the user. But we did NOT have to hide a cell.

e.       I then moved the chart and slicer back to where it belongs.
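The SUBSTITUTE step can be mirrored in Python (purely for illustration) to show how the typed date becomes the key used by the MDX member reference:

```python
# What the user types into A2, and what =SUBSTITUTE(A2;"-";"") produces:
user_input = "2008-06-04"
member_key = user_input.replace("-", "")   # strips the dashes

# This is the member reference the VBA code will later build from the variable.
print("[Date].[Date].&[" + member_key + "]")  # [Date].[Date].&[20080604]
```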

2.       Next we are going to go back into our VBA Code and update the Variable so that it will get the data from cell D11

a.        So you should still have your VBA window open.

b.       Under the Declare Variables section uncomment the two lines so that they look like the following below:

Dim Input1 As String

Input1 = ActiveWorkbook.Sheets("Sheet2").Range("A2")

c.        Now we are going to change the location of where the cell is from the above details to the following:

Input1 = ActiveWorkbook.Sheets("Sheet1").Range("D11")

d.       The next thing that we need to do is to change our MDX query from being hard coded to use the variable Input1

e.       So under the section 'Set and Execute SQL Command, go to the following line:

"[Date].[Date].&[20080601], " & _

f.         Now we are going to change it to the following below:

"[Date].[Date].&[" & Input1 & "], " & _

                                                               i.      What we have done above is in order to use the variable we have to encapsulate it with the following:

" & VariableName & "

g.        Now what you can do to test this is to click on Run Sub/UserForm as shown below, and your query should run.

                                                               i.      clip_image020[7]

h.       Now it should run through correctly and you should see your data updated in Sheet1.

3.       Now the final step is to add the button, which the user can press after they have put in their date.

a.        Go back into your Excel spreadsheet and onto Sheet1

b.       Click on the Developer Tab

c.        Then under the Controls section, click on Insert and then select Button

                                                               i.      clip_image056[4]

d.       Next you will need to drag your button where you want it to be.

                                                               i.      You could do this anywhere initially.

e.       After you have dragged in your button it will then open up the Assign Macro Window.

                                                               i.      From this Window select the macro that you created.

                                                              ii.      clip_image058[4]

f.         Then click Ok.

g.        You will now see the following:

h.       clip_image060[4]

i.         Now in order to change the text on the button do the following:

                                                               i.      Right click on the Button 1 and select Edit Text

                                                              ii.      clip_image062[4]

j.         I changed it to the following:

Click to refresh data

                                                               i.      NOTE: I had to drag the button to make the text fit.

1.       Also whilst you are editing the text you could also change the font and size.

k.        I then moved it so it looked like the following:

l.         clip_image064[4]

4.       Now to finally test that it works I changed the date to:

a.        2008-06-05 and then clicked the button “Click to refresh data”

b.       Now after the data refreshed I saw my pivot change to the following:

c.        clip_image066[4]

 

Now you can save your Excel workbook and send it to the required people.

NOTE: Ensure that the users have the required access to the cube where the MDX will be running.

 

You can get a copy of the workbook here: http://1drv.ms/1dfKry4 

SSAS – Using the SUM Function within a Measure Group to display a Distinct Count with SSAS (SQL Server Analysis Services)

Whilst completing my review on the Expert Cube Development with SSAS Multidimensional Models, I came across the Distinct Count Measure within SSAS and how this can affect query performance as well as processing performance.

 

You can find the reference on page 114

And here is a link to the actual book that I am reviewing:

http://www.packtpub.com/expert-cube-development-with-ssas-multidimensional-models/book

What I wanted to do, instead of using the Distinct Count function available within SQL Server Analysis Services (SSAS) (which, as we know, has some performance issues and creates its own measure group), was to find a way to use the SUM function in our measure group but still return the distinct count.

 

And then be able to slice this distinct count by a dimension.

 

In our example below we want to extract from the AdventureWorksDW2012 Analysis Services Database

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Example reference:

·         We are going to get the distinct number of Customers (CustomerKey) from the [dbo].[FactInternetSales]

o    So this will enable us to get a distinct count based on any of the attributes from our Customer Dimension

·         NOTE: The thing to note is that you will always want to create your distinct calculation at your lowest level of granularity.

·         As with our example in our [dbo].[FactInternetSales] the lowest level was the Date or OrderDateKey.

·         In order to follow how this works, we will be using the Order date of 03 March 2008.

o    The distinct number of Customers on 03 March 2008 is 51 Customers.

 

Creating the Distinct Number of Customers Calculation

1.       The first thing we will do is create our distinct calculation and then insert this into a staging table.

a.        NOTE: What you could do is to use your staging table later as part of the loading into your Fact Table.

                                                               i.      But with our example we are going to update our Fact Table with our calculations.

                                                              ii.      We also insert the data into a Staging table so that we can use it as part of our Update Statement.

2.       Next below is the TSQL syntax that we used to create our distinct number of Customers Calculation, with an explanation afterwards

Select

                      convert(float, Count(Distinct(CustomerKey))) / Count(1) as DistinctCustomerKey

                     ,OrderDateKey

  FROM [AdventureWorksDW2012].[dbo].[FactInternetSales] with (nolock)

         Group by OrderDateKey

         Order by OrderDateKey

a.        The only part that we are going to concentrate on is the actual calculation.

b.       We will start the explanation from the inside out, due to this being the logical way that I built this up.

                                                               i.      The part highlighted in RED is where we first are selecting the Distinct CustomerKey

1.       Distinct(CustomerKey)

                                                              ii.      The next section highlighted in PURPLE is where we are doing a count of the Distinct CustomerKey

1.       Count(Distinct(CustomerKey))

                                                            iii.      The next section highlighted in GREEN is where we are converting our values to a Float

1.       convert(float,Count(Distinct(CustomerKey)))

                                                            iv.      The final part of the calculation is highlighted in BLUE, where we are dividing our distinct count by the number of rows.

1.       / Count(1)

2.       This is so that we can then get the correct calculation.

                                                              v.      We also have the OrderDateKey, because this is our lowest level of granularity, we need to group by this so that the number of rows is grouped per day to work out the calculation correctly.

                                                            vi.      NOTE: The reason that we are converting this to a float is so that later when we sum the row details it will sum back up to the correct number.
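The arithmetic behind this can be checked with a short Python sketch using synthetic rows (not the real AdventureWorks data): every row of a day carries distinct/rowcount for that day, so summing the column for the day recovers the distinct count.

```python
from collections import defaultdict

# Synthetic fact rows: (OrderDateKey, CustomerKey).
# Three rows on the day, but only two distinct customers.
fact = [(20080303, 11000), (20080303, 11000), (20080303, 11001)]

by_day = defaultdict(list)
for day, cust in fact:
    by_day[day].append(cust)

# Per-day ratio: distinct customers / row count, stored on every row.
ratio = {day: len(set(c)) / len(c) for day, c in by_day.items()}

# Summing the per-row value for the day gives the distinct count back
# (up to floating point error, which is why the cube rounds it later).
total = sum(ratio[day] for day, _ in fact)
print(round(total))  # 2
```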

3.       For simplicity, below is the complete TSQL syntax, where we truncate and insert our data into our staging table in our AdventureWorksDW2012 database

Truncate Table dbo.Staging_tb_FactInterNetSales_DistinctCustomerKey

Insert into dbo.Staging_tb_FactInterNetSales_DistinctCustomerKey

Select

                      convert(float, Count(Distinct(CustomerKey))) / Count(1) as DistinctCustomerKey

                     ,OrderDateKey

  FROM [AdventureWorksDW2012].[dbo].[FactInternetSales] with (nolock)

         Group by OrderDateKey

         Order by OrderDateKey

a.        As you can see above we are also grouping the calculation by OrderDateKey so that we can use this to insert the calculation for multiple days.

4.       With our example explained above, if we look at our calculation for our distinct customers on 03 March 2008, it would be the following:

a.        clip_image002[4]

 

Updating our Fact Table with our Distinct Customer calculation

1.       The first thing is that you need to ensure that you have your column created in your Fact Table.

a.        I manually created the following column in the dbo.FactInternetSales Table:

Alter table [dbo].[FactInternetSales]

  Add DistinctCustomers Float NULL

2.       Next is our Update TSQL statement, where we update the column we created in Step 1 above, with an explanation below:

Update F

Set F.DistinctCustomers = S.DistinctCustomerKey

From [dbo].[FactInternetSales] as F with (nolock)

       Inner join dbo.Staging_tb_FactInterNetSales_DistinctCustomerKey as S with (nolock)

              on F.OrderDateKey = S.OrderDateKey

a.        From above you can see that we are joining from our Fact table to our Staging table and using the OrderDateKey in our join.

                                                               i.      NOTE: Once again this is our lowest level of granularity

3.       As with our example if we now have a look at our Fact Table for 03 March 2008 we will see the following for our DistinctCustomers column.

a.        clip_image004[4]
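The staging and update flow can be sketched end to end in Python, with sqlite3 standing in for SQL Server (table and column names follow the post; the rows are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE FactInternetSales (OrderDateKey INT, CustomerKey INT,
                                DistinctCustomers FLOAT);
INSERT INTO FactInternetSales (OrderDateKey, CustomerKey) VALUES
  (20080303, 11000), (20080303, 11000), (20080303, 11001);
CREATE TABLE Staging_tb_FactInterNetSales_DistinctCustomerKey
  (DistinctCustomerKey FLOAT, OrderDateKey INT);
""")

# Staging: distinct customers / row count at the lowest granularity (per day).
con.execute("""
INSERT INTO Staging_tb_FactInterNetSales_DistinctCustomerKey
SELECT CAST(COUNT(DISTINCT CustomerKey) AS FLOAT) / COUNT(1), OrderDateKey
FROM FactInternetSales GROUP BY OrderDateKey
""")

# Update the fact table from staging, joining on OrderDateKey.
con.execute("""
UPDATE FactInternetSales
SET DistinctCustomers = (SELECT S.DistinctCustomerKey
                         FROM Staging_tb_FactInterNetSales_DistinctCustomerKey S
                         WHERE S.OrderDateKey = FactInternetSales.OrderDateKey)
""")

# SUMming the new column now returns the distinct count, once rounded.
total = con.execute(
    "SELECT SUM(DistinctCustomers) FROM FactInternetSales").fetchone()[0]
print(round(total))  # 2
```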

 

Adding the new Measure to your SSAS Cube

1.       Open up your SSAS Project in SQL Server Data Tools (SSDT)

2.       Next open the Data Source View and refresh it.

a.        As our example we opened up the Adventure Works DW.dsv

b.       Once open we clicked Refresh, which you will then see the following:

                                                               i.      clip_image006[4]

c.        Click Ok.

3.       Next open your cube where you want to add your new Distinct Measure to.

a.        As with our example we opened the Adventure Works cube

                                                               i.      clip_image008[4]

4.       Under Measures, click on the plus sign next to Internet Sales, then right click and select New Measure

a.        clip_image010[3]

5.       This will open the New Measure Window

a.        Now as shown below under Usage: Sum

b.       Source Table: Internet Sales Facts

c.        Source Column: DistinctCustomers

d.       clip_image012[3]

e.       Then click Ok.

6.       You will now see your measure under the Internet Sales Measure Group

a.        clip_image014[3]

b.       Now we are going to rename the Distinct Customers measure so that we can use this name for our calculated member, which will be explained in the next steps.

                                                               i.      NOTE: The reason we are creating a calculation is because we need to round our values from the database in order to make the distinct count a whole number.

c.        Right click on the Distinct Customers Measure and select rename

                                                               i.      clip_image016[3]

d.       We are renaming it to Distinct Customers - Calc

                                                               i.      NOTE: The reason that we give it this name is so that we know it is used as part of a calculation.

e.       The final thing to do is to right click on our Distinct Customers - Calc and select Properties.

                                                               i.      Then change the Visible Property to False

                                                              ii.      clip_image018[3]

f.         Then save your cube.

 

Creating a calculation in SSAS and rounding the value so that it will be shown as a whole number

1.       The final step is to create our calculated member so that the numbers look correct when the values are displayed in the client tool.

2.       Click on the Calculations Tab.

3.       Then click on New Calculated Member

a.        clip_image020[3]

4.       We then configured it with the following as shown below:

a.        Name: [Distinct Customers]

b.       Parent Hierarchy: Measures

c.  Expression: Round([Measures].[Distinct Customers - Calc],0)

                                                               i.      NOTE: Here we are using the Round function and setting it to not keep any decimal points by specifying the zero (0).

1.       This also means that if the fractional value is 0.5 or higher, it will round up to the next whole number.

2.       This is so that when the Calculated Measure is displayed it will always be a whole number.

d.       Format String: Standard

e.       Visible: True

f.         Non-empty behavior: Distinct Customers - Calc

g.        Associated Measure Group: Internet Sales

h.       clip_image022[3]

5.       Now finally process your cube.

a.        NOTE: If you are using the Adventure Works Multidimensional project in SSDT, the partitions are by default set to Query binding, and due to this you will have to add the new column to the following measure group partitions:

                                                               i.      Part to Add:

,[dbo].[FactInternetSales].[DistinctCustomers]

                                                              ii.      Measure Group partitions:

1.       clip_image024[3]

                                                            iii.      If you do not add the column name in there the processing will fail.

 

Viewing Distinct Customer Counts

1.       Finally we now can view our new Distinct Count Measure that we created.

2.       As with our example I am expecting to see for 03 March 2008 the Distinct Customers to be 51

a.        clip_image026[3]

3.       And this was the goal of this exercise: to get a distinct count that can be used with our dimensions, using the SUM function

 

Getting a distinct count for another dimension

If you wanted to get a distinct count for another dimension, you would need to create another column and repeat all of the steps above.

 

1.       For our above example if we wanted to get the distinct count of Currency (CurrencyKey) we would need to modify our query to include the following:

Truncate Table dbo.Staging_tb_FactInterNetSales_DistinctCustomerKey

Insert into dbo.Staging_tb_FactInterNetSales_DistinctCustomerKey

Select

                      convert(float, Count(Distinct(CurrencyKey))) / Count(1) as DistinctCurrencyKey

                      ,CurrencyKey

                     ,OrderDateKey

  FROM [AdventureWorksDW2012].[dbo].[FactInternetSales] with (nolock)

         Group by OrderDateKey, CurrencyKey

         Order by OrderDateKey

2.       And then we would create all the column names and details in SSAS as detailed above.
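Assuming the count is taken over CurrencyKey, the same ratio trick works at the (day, currency) grain; here is a quick Python check with made-up rows:

```python
from collections import defaultdict

# Synthetic rows: (OrderDateKey, CurrencyKey) - one day, three rows,
# two distinct currencies.
fact = [(20080303, 100), (20080303, 100), (20080303, 98)]

groups = defaultdict(list)
for day, ccy in fact:
    groups[(day, ccy)].append(ccy)

# Per (day, currency) group the distinct count is always 1, so each row
# effectively stores 1 / rows-in-group.
ratio = {g: len(set(v)) / len(v) for g, v in groups.items()}

# Summing over the day's rows returns the distinct number of currencies.
total = sum(ratio[(day, ccy)] for day, ccy in fact)
print(round(total))  # 2
```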

Win Free Ecopy of new book on SSAS

Readers will be pleased to know that I have teamed up with Packt Publishing to organize a giveaway of Expert Cube Development with SSAS Multidimensional Models

 

And two lucky winners stand a chance to win an e-copy of the new book. Keep reading to find out how you can be one of the lucky winners.

Overview:

                              clip_image001 

 

·         Build a data mart suitable for use with Analysis Services

·         Work with a thread pool effectively

·         Create and configure an Analysis Services project in SQL Server Data Tools

·         Use the Dimension Wizard and the Dimension Editor to build dimensions

·         Create measure groups and associate them with dimensions

·         Design cubes and dimensions and also implement common calculations in MDX

·         Explore the security model, including dimension security and cell security, and implement dynamic security

·         Tune queries to get the best possible performance

·         Automate processing and partition creation

·         Monitor your cube to see who’s actually using it

 

How to Enter?

All you need to do is head on over to the book page (Expert Cube Development with SSAS Multidimensional Models) and look through the product description of the book and drop a line via the comments below this post to let us know what interests you the most about this book. It’s that simple.

Deadline

 

The contest will close on 19 March 2014. Winners will be contacted by email, so be sure to use your real email address when you comment!

 

SSAS – Configuring the Slice Partition in your cube.

After recently going through an online video, I realized that I had not set the Slice option on the partitions of a larger cube that I have. Putting in the slice details enables faster queries when people access the data.

 

Below are the steps that I did in order to configure the Slice details for a partition on your cube.

 

Example:

·         Configuring the slice for the Partition in our Adventure Works DW 2012, for each year that we have created a partition.

 

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

1.       The first thing I had to ensure was that, for each partition in my cube, the corresponding table only had the correct date values.

a.        NOTE: In my actual data, for some reason, this was not 100% correct, so I ensured that each underlying SQL partition only had the valid dates configured.

2.       Next open up your project in SQL Server Data Tools (SSDT), or connect in SQL Server Management Studio (SSMS).

3.       Then go to your Partition Window and click on a specific Partition

a.        In our example we went into our Partition called:

                                                               i.      clip_image002

b.       Then you can either right click to go into the Properties or double click.

4.       Next you are looking for the section that says Slice

a.        clip_image004

b.       Click on the Ellipses to go into the Partition Slice – Total_Sales_2005

5.       Next in order to configure our slice, we need to see how the data has been partitioned.

a.        As with our example it is partitioned by Year.

b.       So from our Metadata we can go down to Date, click on the Date, then Calendar, then the Calendar Hierarchy, then Calendar Year.

                                                               i.      Now select CY 2005 and drag this into the Expression.

c.        It will now look like the following:

d.       clip_image006

e.       Click on Check to ensure that your syntax is correct.

6.       Click Ok twice to go back to your SSDT or SSMS

7.       Now Process the Partition

8.       It should now come back with the following:

a.        clip_image008

9.       Now when you run your MDX query you should see the following when querying data from 2005

a.        This is the query I ran

Select nonempty {[Measures].[Sales Amount]} on 0
,nonempty {
       [Date].[Date].&[20050701]:[Date].[Date].&[20050731]
} on 1
from [Adventure Works]


10.    And this was the result from Profiler

a.        clip_image010

From this we can see that the query only used the Total_Sales_2005 Partition.
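The benefit of setting the slice can be sketched conceptually: the engine compares each partition's slice against the query's date range and only scans the partitions that overlap. A minimal Python sketch of that idea (purely illustrative, with hypothetical partition names; not how SSAS is implemented internally):

```python
# Illustrative sketch of partition elimination: each partition carries a
# "slice" (here, a year range), and a query only scans partitions whose
# slice overlaps the requested date range.
from datetime import date

partitions = {
    "Total_Sales_2005": (date(2005, 1, 1), date(2005, 12, 31)),
    "Total_Sales_2006": (date(2006, 1, 1), date(2006, 12, 31)),
    "Total_Sales_2007": (date(2007, 1, 1), date(2007, 12, 31)),
}

def partitions_to_scan(query_start, query_end):
    """Return only the partitions whose slice overlaps the query range."""
    return [name for name, (lo, hi) in partitions.items()
            if query_start <= hi and query_end >= lo]

# A query for July 2005 touches only the 2005 partition,
# which is exactly what Profiler showed above.
print(partitions_to_scan(date(2005, 7, 1), date(2005, 7, 31)))
```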

DAX – Comparing a specific Product Average against the average of ALL products

I had a situation where one of my colleagues wanted to create a report comparing one product's sales against the average of ALL the product sales.

 

In our example below we want to extract from the AdventureWorksDW2012 Analysis Services Database

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

       ·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Getting data from the OLAP Cube into the Tabular model

The first thing that I did was to get my data from the OLAP cube into the tabular model.

 

1.       Below is the query that I used to get the data.

Select nonempty {[Measures].[Sales Amount]} on 0
,nonempty {(
                     [Product].[Product].children
                     ,[Date].[Date].Children
           )} on 1
from [Adventure Works]

a.        As you can see from above this is a nice and simple MDX query.

2.       Next, to get the data into the Tabular model, you can refer to the following blog post that I wrote and follow steps 1 – 3 under SSAS Tabular:

a.        http://gqbi.wordpress.com/2013/12/11/power-view-using-mdx-ssas-tabular-and-power-view-to-display-potential-issues-based-on-past-trends-in-data/

3.       Now you will have your data in your Tabular Model.

4.       As shown below we just changed the column names:

a.        clip_image002[4]

b.       We then also changed the data type for the Date column to date

c.        And then changed the Data Format to Short Date

                                                                                       i.      clip_image004[4]

 

Creating the measures so that we can get the Product Average and Overall Product average

As per our example we wanted to compare a specific product's average against the average of all the products combined. Below we will show you how we created this.

 

NOTE: The thing to note here is that with the measures below you can ONLY slice on the Products column in your report.

·         This is because the Overall Product Average measure below removes the filter ONLY on the Products column, so if you slice on another column the overall average will be incorrect.

 

1.       First we created the Product average measure which is as follows:

Product Average:=AVERAGE('Sales Amounts'[MeasuresSales Amount])

2.       Next this is where we created our Overall Product Average measure:

Overall Product Average:=CALCULATE(AVERAGE('Sales Amounts'[MeasuresSales Amount]);ALL('Sales Amounts'[Product]))

a.        The above measure can be explained with the following:

                                                                                       i.      CALCULATE

1.       This function evaluates the expression in a filter context that is modified by the filters supplied to it.

                                                                                      ii.      AVERAGE('Sales Amounts'[MeasuresSales Amount])

1.       As in our first measure, this calculates the average.

                                                                                     iii.      ;ALL('Sales Amounts'[Product]))

1.       The ALL DAX function returns all the data in your table or column.

2.       So in our example we are getting ALL the data in our Products column.

b.       NOTE: When using the ALL DAX function, if you do not put in the fully qualified column name when creating this measure, you will get the error shown below:

                                                                                       i.      clip_image006[4]

                                                                                      ii.      So make sure you always use the table name as well as the column name in all your measures. This can save you a lot of time troubleshooting an error that was caused simply by a missing table qualifier.

3.       Now you will see the 2 measures below:

a.        clip_image008[4]

b.       NOTE: The two measures look identical above, but this is only because no product filter is currently applied.

4.       Now when we put the above into Excel so that we can compare specific products you can see how the products have performed when comparing them to the overall product average:

clip_image010
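The effect of CALCULATE with ALL can be illustrated with a small Python sketch. The product names and amounts below are hypothetical sample data; the two functions mirror what the two measures above compute:

```python
# Sketch of the two measures: a per-product average that respects the
# current product filter, and an overall average where
# ALL('Sales Amounts'[Product]) removes that filter.
sales = [
    ("Mountain-200", 2300.0),
    ("Mountain-200", 2200.0),
    ("Road-150", 3500.0),
    ("Road-150", 3600.0),
]

def product_average(product):
    # Like "Product Average": only the rows for the filtered product.
    vals = [amt for p, amt in sales if p == product]
    return sum(vals) / len(vals)

def overall_product_average():
    # Like ALL(...): ignore the product filter and average every row.
    vals = [amt for _, amt in sales]
    return sum(vals) / len(vals)

print(product_average("Mountain-200"))   # 2250.0
print(overall_product_average())         # 2900.0
```

With no product filter applied, both averages coincide over the full table, which is exactly why the two measures in the screenshot look identical before any product is selected.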

 

Power View – Using MDX, SSAS Tabular and Power View to display potential issues based on past trends in data

What I am going to explain below is using the above MDX, SSAS Tabular and Power View to display potential issues based on past trends within our data.

 

For our example we are going to use SCOM data. Within our example we are looking for systems where the available memory has dropped by more than 50% of what it was previously sitting at.

·         So for example, if a server went from 1024 MB of Available Memory down to 400 MB, we would want to see this.

·         And if our SCOM Alerts only looked for servers with less than 200 MB of Available Memory, no alert would have fired yet.

 

This is because it could highlight something going on a server that is not currently an issue but COULD become an issue.

 

Whilst the final Power View Report might look very easy to understand and simple, it is really valuable and enables the end users to quickly identify issues.

 

NOTE: With the information above, the SCOM threshold has not been breached, so no alerts have been created. But by the time the SCOM Alert comes through it might be too little too late.

 

NOTE: You can download, install and configure the SCOM data warehouse and cube from the following location, which I have previously blogged about:

http://gqbi.wordpress.com/2013/11/14/scom-systems-center-operations-manager-cube-and-data-warehouse/

 

MDX Query – SCOM Cube

The first part is to create and write the MDX Query which we will later use in our Tabular Model in order to get the data we require.

 

1.       As with our example we created our MDX query to get the output we required.

2.       The first query that we have below is where we are looking for Servers with a memory issue as explained above

With member [Measures].[PreviousWeekAverage] as
       Avg((strtomember("[Date].[full Date].&["+format(now()-7,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]")),[Measures].[SampleValue])

       member [Measures].[CurrentDayAverage] as
       (strtomember("[Date].[full Date].&["+format(now(),"yyyy-MM-dd")+"]"),[Measures].[SampleValue])

       member [Measures].[Diff] as
       ((([Measures].[CurrentDayAverage]-[Measures].[PreviousWeekAverage])/[Measures].[PreviousWeekAverage])*100)

Select nonempty {[Measures].[CurrentDayAverage],[Measures].[PreviousWeekAverage],[Measures].[Diff]} on Columns,
nonempty {
                     Filter(
                      Filter(
                       Filter(
                           ([Counter].[Counter Hierarchy].[Counter Name].&[Memory]&[Available MBytes]
                           ,[Managed Entity].[Display Name].children,strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]"))
                       ,[Measures].[CurrentDayAverage]>0)
                      ,[Measures].[Diff] > -100000)
                     ,[Measures].[Diff] < -50)
                     } on Rows
From [SCOM Daily]

a.        As you can see from the above query we have created the following:

                                                                                       i.      Our measures to compare the current day's average to the past week's average.

                                                                                      ii.      We have then put in 3 filters so that it will only bring back servers where the Available Memory has dropped more than 50%, and so that we do not get any random rows where the SCOM data is corrupt or out of whack.

b.       NOTE: This query will return just the specific Display Names or Server Names.

3.       In our next query we are basing it on the query above, but this time we are bringing down all the detailed data, so that this can be put into Power View in order to see the history.

With member [Measures].[PreviousWeekAverage] as
       Avg((strtomember("[Date].[full Date].&["+format(now()-7,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]")),[Measures].[SampleValue])

       member [Measures].[CurrentDayAverage] as
       (strtomember("[Date].[full Date].&["+format(now(),"yyyy-MM-dd")+"]"),[Measures].[SampleValue])

       member [Measures].[Diff] as
       ((([Measures].[CurrentDayAverage]-[Measures].[PreviousWeekAverage])/[Measures].[PreviousWeekAverage])*100)

Select nonempty {[Measures].[SampleValue]} on Columns,
nonempty {
                     Filter(
                      Filter(
                       Filter(
                           ([Counter].[Counter Hierarchy].[Counter Name].&[Memory]&[Available MBytes]
                           ,[Managed Entity].[Display Name].children
                           ,strtomember("[Date].[full Date].&["+format(now()-15,"yyyy-MM-dd")+"]"):strtomember("[Date].[full Date].&["+format(now()-1,"yyyy-MM-dd")+"]"))
                       ,[Measures].[CurrentDayAverage]>0)
                      ,[Measures].[Diff] > -100000)
                     ,[Measures].[Diff] < -50)
                     } on Rows
From [SCOM Daily]

a.        Even though the queries look very similar, there are differences.

                                                                                       i.      In the Select statement we now only bring back the Sample Value measure.

                                                                                      ii.      We have also put in a date range, so that we can get the history for the past 15 days.

4.       We will then put these queries into SSAS Tabular in the next steps.
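The two building blocks of the MDX above can be sketched in Python with hypothetical values: the dynamic [Date].[full Date] member strings built with strtomember and format, and the [Measures].[Diff] percent-change that drives the filters:

```python
# Sketch (with hypothetical values) of the dynamic date members and the
# Diff percent-change used in the MDX queries above.
from datetime import date, timedelta

def full_date_member(days_back, today=None):
    # Mirrors: strtomember("[Date].[full Date].&[" + format(now()-N, "yyyy-MM-dd") + "]")
    d = (today or date.today()) - timedelta(days=days_back)
    return "[Date].[full Date].&[" + d.strftime("%Y-%m-%d") + "]"

def diff_percent(current_day_avg, previous_week_avg):
    # Mirrors: ((CurrentDayAverage - PreviousWeekAverage) / PreviousWeekAverage) * 100
    return ((current_day_avg - previous_week_avg) / previous_week_avg) * 100

# A server that went from ~1024 MB available down to 400 MB:
d = diff_percent(400.0, 1024.0)
print(round(d, 1))  # -60.9

# The three filters keep rows with a positive current average, a sane Diff
# (guarding against corrupt SCOM rows) and a drop of more than 50%:
print(400.0 > 0 and -100000 < d < -50)  # True

# With a fixed "today" for illustration:
print(full_date_member(7, today=date(2013, 12, 11)))
# [Date].[full Date].&[2013-12-04]
```

A drop of more than 50% yields a Diff below -50, which is exactly the row the outermost MDX Filter keeps.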

 

 SSAS Tabular

Next we are going to create our Tabular model in SSAS so that we can then use this in order to create our Power View Report.

 

1.       Open SQL Server Data Tools (SSDT) and create a new Analysis Services Tabular Project

a.        clip_image002

b.       We gave it the name:

                                                                                       i.      SCOM Potential Server Issues

2.       Create a new Import Data Source and select the following:

a.        clip_image004

b.       Then configure this to connect to your SSAS Instance.

                                                                                       i.      With our example we connected to our SSAS Instance where our SCOM Daily cube is.

c.        Next we configured which account for the Impersonation Information.

d.       Next is where we will put our first MDX Query from Step 2 above.

                                                                                       i.      As you can see below with our SCOM example:

                                                                       clip_image006

3.       We then clicked Finish to import the data.

4.       Now in the model designer we renamed our columns so that they would be more descriptive, as shown with our example below:

a.        clip_image008

b.       We also created a measure called:

                                                                                       i.      Percentage Available Mbytes

                                                                                      ii.      NOTE: The reason that we use this is so that we can use it in our Power View Report later

c.        And we also formatted the Full Date column with the Short Date.

5.       Next we are going to use our second query from step 3 above for the detailed information.

a.        As with our example this is the actual information from SCOM.

b.       We once again renamed the columns and formatted them as per our requirements.

c.        clip_image010

d.       We also created a measure called:

                                                                                       i.      Available Mbytes

                                                                                      ii.      NOTE: The reason that we use this is so that we can use it in our Power View Report later

e.       And we also formatted the Full Date column with the Short Date

6.       Now the crucial step here is where we created the relationship between the two queries.

a.        NOTE: The reason that we do this is so that later when creating the Power View Report you can then select the server name and it will show you the history on the same chart.

b.       In SSDT click on the Diagram View

                                                                                       i.      clip_image012

c.        As with our example we had to create a relationship within our SCOM Data.

d.       Then create the relationship between the Display Names in the two queries as shown below

                                                                                       i.      clip_image014

e.       NOTE: This works because in our sheet called “SCOM_Available_Memory_Overview” the Display Names are unique.

7.       Now you would deploy your SSAS Tabular model to your Server.
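The relationship created in step 6 behaves like a one-to-many join keyed on Display Name: one row per server in the overview, many history rows in the detail. A small Python sketch with hypothetical server names and values:

```python
# Sketch of the overview-to-detail relationship: the overview sheet has
# one row per server (unique Display Names), and selecting a server in
# the bar chart filters the detail rows to that server's history.
overview = [
    {"display_name": "FileServer.Domain.com", "diff": -60.9},
    {"display_name": "SQLServer.Domain.COM", "diff": -52.3},
]
detail = [
    {"display_name": "FileServer.Domain.com", "full_date": "2013-12-01", "mbytes": 1024},
    {"display_name": "FileServer.Domain.com", "full_date": "2013-12-10", "mbytes": 400},
    {"display_name": "SQLServer.Domain.COM", "full_date": "2013-12-10", "mbytes": 300},
]

def history_for(server):
    """All detail rows for one server, as clicking its bar would show."""
    return [r for r in detail if r["display_name"] == server]

print(len(history_for("FileServer.Domain.com")))  # 2
```

This is why the overview query must return only unique Display Names: if a server appeared twice on the "one" side, the relationship in the Tabular model would fail.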

 

Power View Report

Now what we are going to do is to create our Power View Report based on our SSAS Tabular Project.

 

1.       All the steps below are based on our SCOM Example, which you could incorporate into your solution.

2.       Log into your SharePoint Server and create your connection file to your SSAS Tabular database.

3.       Then click on your connection file to create your Power View Report.

4.       First we configured it from the SCOM_Available_Memory_Overview

a.        clip_image016

b.       Next we configured the overview for the Available Mbytes as a bar chart, using the fields shown below.

c.        clip_image018

d.       Once the bar chart had been configured we then changed the sorting as shown below:

                                                                                       i.      clip_image020

e.       NOTE: The reason for this is because you want to see the systems that have had the largest drop at the top.

f.         Once completed it looked like the following:

                                                                        clip_image022

5.       Next we create the detailed view below using the SCOM_Available_Memory_Detailed

a.        clip_image024

b.       Next we configured this to use the Line Chart

                                                                                       i.      clip_image026

c.        This is what it will look like once completed.

      clip_image028

6.       Finally the trick that makes this work so well is that if you click on the bar chart at the top it will then only show the relevant line chart with the details below.

a.        So for example if you clicked on the FileServer.Domain.com you can see that it has the largest Memory drop as shown below.

                                                                                     clip_image030 

b.       The thing to note is that SQLServer.Domain.COM on the first chart looks like there was not a large drop in memory. But if you click on the bar chart for SQLServer.Domain.COM, the detailed view shows a much more meaningful picture.

                                                                                       i.      NOTE: Look at the Orange line in step 5 above.

                                                                               clip_image032

7.       Now Save your Power View Chart.

8.       And finally if required create an SSIS Package to refresh your tabular data as required.