BI-NSIGHT – SQL Server 2016 CTP 3.0 (SSAS, SSRS, SSIS) – Power BI (Chiclet Visual, SparkPost Content Pack, Weekly Service Update, Personal Gateway Update, Tiles in SharePoint)

I expected this week to be a really interesting one with SQL PASS happening, as I was sure we would see some really good and interesting updates from Microsoft, and it is certainly living up to that.

There has been a lot of information on Twitter and on other blogs, so here is my take on the developments.

SQL Server 2016 CTP 3.0 (SQL Server Database Engine, SQL Server Analysis Services, SQL Server Reporting Services, SQL Server Integration Services)

There was a whole host of updates in SQL Server 2016 CTP 3.0, which is great to see, as well as some announcements of what we can expect in subsequent releases.

I am just going to highlight below what I think is relevant in the BI space. But there will be links below where you can find the related blog posts, which have more information from the Microsoft teams.

SSAS

With regards to SSAS, it is good to see how much effort and work is going into the Tabular model, which is what I thought would be the case.

I think that it is really great to see that they have changed the underlying structure from XMLA to JSON. The way I see it, this is how they have implemented Power BI, with the SSAS database sitting in memory in Azure. Without a doubt they have learnt a lot from that, and they can now bring those lessons into the on-premise product. We all know how fast it is online!

The MDX support for DirectQuery is also a great update. I can see a lot of people leveraging this, and when you partner it with APS you can pretty much start to enable real-time analytics, which can be a real game changer.

Most of the other updates coming to SSAS have already appeared in either Power BI Desktop or Excel 2016, so it is great to see them in the server product, which will go a long way to ensuring it can scale and perform for enterprise workloads.

SSRS

I have been eagerly waiting to see what was going to happen in the SSRS space. Whilst I had already seen some of the now-released information, it is great to see it announced to the general public, as well as how well it has been received.

Pinning SSRS reports into Power BI is a really smart move, and the ability to also refresh that report in Power BI is pure genius. What this means is that you can leverage both your on-premise and cloud investments, and to the users it will be seamless.

What I also really like is that you often create really detailed SSRS reports, while the executives and high-level managers do not need to see the details; they just want an overview. By surfacing it all in Power BI, it becomes their one-stop shop!

SSIS

There does not seem to have been a lot of love for SSIS, but to be honest it is already a stable and really good product.

What I did see is the Control Flow Template, and I am hoping that it is something similar to what you can currently do with BIML; at least that is how I perceived it. I am hoping that you can create different control flow templates for different patterns. For example, you could create a control flow template for an SCD Type 2 load, and once you have it designed the way you want, other developers can then utilize it. This would go a long way in enterprises where you want to standardize the way of doing things.

You can read about all of the above here:

Power BI – Chiclet Visual Slicer

The one thing that I have been struggling with in Power BI was how to get a slicer to work so that it also looked good.

And lo and behold, there is a new visualization which can now do this. Having it support images as well is really smart, as people love to click on images.

Another great announcement was from James Phillips that Microsoft would be releasing a new visualization every month, indefinitely. This is really great and I am sure that we will see some really interesting and useful visualizations in the future.

You can read all about it here: Visual Awesomeness Unlocked: The Chiclet Slicer

Power BI – SparkPost Content Pack

This week there is another great content pack, this time for SparkPost, which you can now use to monitor your email campaigns.

You can read about it here: Monitor Your SparkPost data with Power BI

Power BI – Weekly Service Update

Not only was there a host of announcements at SQL Pass, there was the weekly Power BI Service update.

Once again I am going to quickly highlight what there is in this week’s update.

They have made quite a few improvements to the way we can share dashboards in Power BI. All of these updates make it a lot easier to share a dashboard and to let people see how good Power BI is. The additions are: sharing dashboards with AD groups, a people picker, and sharing with a large number of email addresses.

Along with this is the ability to start passing parameters in the URL. I have no doubt that URL parameter support will keep on growing and give additional flexibility in the Power BI service.

You can read about it here: Power BI Weekly Service Update

Power BI – Personal Gateway Update

There was an update late last week for the Power BI Personal Gateway, and it is mostly around bug fixes and performance improvements, which is great to see, because we often want it to run as smoothly and quickly as possible.

You can find more information here: New version of Personal Gateway is now live!

Power BI – Tiles in SharePoint

And finally the guys from DevScope have now created a Power BI Tile for SharePoint.

I think that this will work really well, because it will give the ability to showcase all the work done in your Power BI reports, as well as not having to re-create reports over and over again.

If you want to find more details and pricing, you can find it here: Power BI Tiles for SharePoint

BI-NSIGHT – Power BI (Desktop Update, Service Updates, API Updates, Mobile App Update, Visual Contest Results, Content Pack – Stripe) – Office 2016 (Excel Updates, Power Query Update, Excel Predictions) – SQL Server Analysis Service 2012 Tabular Update – SQL Server Analysis Services 2016 Extended Events

So this week there was once again a whole host of updates with Power BI, as well as finally the official release of Office 2016.

It sure is a busy time to be in the Business Intelligence space, especially in the Microsoft ecosystem.

So let’s get into it…

Power BI – Desktop Update

So I woke up this morning to see that there has been a massive release in the Power BI Desktop application. I immediately downloaded and installed the update. I have already used some of the features today.

And there was a whole host of updates, too many to go through all of them here, but I would just like to highlight the ones which I think are really great additional features.

I am really enjoying the report authoring features. I have mentioned it before, but the drill up and drill down features are really great and allow for more detail in the report than you initially see at face value.

Then, under the data modelling section, I have to say that I am not a DAX guru, but I do appreciate how powerful it is and how you can really extend your data with so many DAX functions. Of particular interest is the calculated table, which Chris Webb has already blogged about with some great information here: Calculated Tables In Power BI

And there are some great new features with regards to data connectivity, data transformations and Query Editor improvements, which all form part of Power Query. This once again enables report authors to enrich the data, which in turn makes for great visualizations.

You can find out all about all 44 updates here: 44 New Features in the Power BI Desktop September Update

Power BI – Service Updates

So yet another great update on the Power BI platform.

I think that finally being able to customize the size of the tiles is really good, so that you can fit more meaningful information on your dashboard.

Another great service update is being able to share read-only dashboards with other users. This is great because you often create dashboards and reports that you want to share with users so they can interact with the data, but not make any changes.

There are also more sample content packs, which are a great way to showcase how powerful Power BI is.

You can read about it and all other updates here: Power BI Weekly Service Update

Power BI – API Updates

There is no pretty picture for the API updates, but there are some great new features.

The one that I think is really good is the ability to embed a Power BI tile into an application. This gives you the great features of Power BI in your own application without having to go directly to Power BI.

From what I can see, there is also the ability to pass filters or parameters into the Power BI report via the URL, which is really good and extends the functionality.

These are the details in the API updates:

  • Imports API
  • Dashboards API
  • Tiles and Tile API
  • Groups API
  • Integrating Tiles into Applications
  • Filtering Tiles integrated into your Application

You can find out all about it here: Power BI API updates roundup

Power BI – Another Mobile Update

The Microsoft team must be working 24 hours a day, given the amount of updates and additions coming out of Microsoft.

They have released some additional updates to the mobile application, which are great, as we are all well aware that having a mobile application really helps showcase your solution, as well as ensuring that it gets to the right users and that they can see the related information.

You can find out about the updates for iOS, Windows and Android here: Power BI mobile Mid-September updates are here

Power BI – Visual Contest Results – People’s Choice Awards

As you can see above, this is the first People's Choice Award in the visual contest, and I can see myself using the winning visual with my existing data.

You can find out about it and the other entries here: Power BI Best Visual Contest – 1st People’s Choice Award!

Power BI – Content Pack Stripe

Once again this was another great content pack update this week.

There are a whole host of people using the Stripe payment platform for their online business, from the really small guys to the big guys. This gives everyone a really easy way to understand and visualize their data, as well as what payments they are getting.

You can find out more about it here: Monitor and Explore your Stripe data in Power BI

Office 2016

I am very happy to see that Office 2016 has been released before the actual year 2016.

I have mainly been focused on all the features in Excel, due to being in BI. But there are a whole host of updates, fixes and new additions in Office 2016.

You can read all about it and all the details here: The new Office is here

Excel 2016 – Features

With Office 2016 being released this blog post from the Office team shows a lot of the great features that are available in Excel 2016.

It does showcase a lot of great features focused on Business Analysts, as well as how people can leverage all the new features in Excel 2016.

You can read up all about it here: New ways to get the Excel business analytics features you need

Power Query Update

With so many things going on within Power BI, there has been another great release with Power Query.

As you can see from the picture above, it is great to finally have the ability to write your own custom MDX or DAX query to get the data you require from your SSAS source.

Another great feature is the ability to extract a query from one of the steps within your current Power Query query and then use it in another Power Query window. As they say, it lets you easily reuse the same code without having to write it all over again.

You can read all about it here: Power Query for Excel September 2015 update

Excel Future Prediction

I recently came across a really interesting article in which some of the industry experts within the BI space predict how they see Excel being used, as well as where they see it fitting into the BI space in the upcoming years.

There were some really insightful and interesting details, which made me think about how Excel has evolved over the years, and with the current additional investments going into Excel, how this is going to be leveraged and improved in the years to come.

You can read all about their thoughts here: 27 MICROSOFT EXCEL EXPERTS PREDICT THE FUTURE OF EXCEL IN BUSINESS INTELLIGENCE

SQL Server Analysis Service 2012 – Tabular Update

There has been a cumulative update for SQL Server 2012, and one of the great fixes relates to SSAS Tabular columns that have a high cardinality, which was a performance issue before this release.

It is great to see that this has been addressed, especially because in SSAS Tabular there will be cases where columns have a high cardinality. Even though it is often super quick, we would like everything to be as fast as possible.

You can read all about the updates to SQL Server CU 8 here: Cumulative update package 8 for SQL Server 2012

SQL Server Analysis Services 2016 – Extended Events

I think that it is great to see that we are finally getting some additional features and updates to Analysis Services.

When I read up about the Extended Events support, it is something really great to see. I have actually been busy with an exercise to monitor what has been going on on our SSAS instance, both in terms of performance and in terms of which users are accessing the cubes and what they are doing. Currently there is not a super elegant solution to achieve this.

Extended Events make this a lot easier and give you the ability to quickly get the information that you require.

I also love that you are able to watch a live stream of events, which you can use to see what is running at a specific moment, as well as to ensure that you are capturing the right events.

This is definitely something that I will be looking to use when we finally can install and use SQL 2016.

You can find out about all the details here: Using Extended Events with SQL Server Analysis Services 2016 CTP 2.3

Power BI (Visual Contest, On the Go, Hyperlinks, Drill Up, Drill Down, tyGraph Content Pack) – SQL Server 2016 – CTP 2.3 (SSAS Tabular with a dose of speed, SQL Server Data Tools (SSDT), SQL Server Reporting Services (SSRS)) – Power BI (Personal Gateway for Power BI Update)

Right once again there is a lot to get into this week, and as with every week there are a whole host of Power BI updates.

Power BI – Visual Contest

I have to say that this is both a smart and a fantastic move by Microsoft. It allows them to gain a whole host of new chart types that can be consumed within Power BI, without having to spend a lot of development time building them all.

There are also a lot of smart people with some great ideas, and this gives them a great platform to showcase those ideas, as well as get some recognition for their efforts.

As you can see in the above screenshot, this is a great KPI example. To me it looks similar to the Datazen KPIs, but I do know that it would be welcome in Power BI.

You can read all about the contest details here: Announcing the Power BI Best Visual Contest!

Power BI – On the Go (Mobile Apps)

There has been another great update to the mobile apps for Power BI, and it is across all the currently supported mobile platforms.

I like the idea that you can keep your favorites and have them on a dashboard within the Power BI mobile application. This gives a very similar experience to the web-based Power BI, which is great when you want to see data from different sources.

I also like the fact that you can enable data alert rules, which means you get alerts on your mobile device when the thresholds you have set are exceeded. This means you do not have to keep checking reports or data; it is a much more proactive way of being notified.

There are some additional details which you can read about here: Power BI on the Go

Power BI – Hyperlinks, Drill Down and Drill Up

Once again this week the Power BI team has been very busy and has a whole host of updates to the Power BI Service.

Firstly, there is the ability to put in inline hyperlinks, which I think is really great and will give the report consumer a much better and more streamlined experience.

The thing that I think is really fantastic is having the ability to drill down as well as drill up within the report. They have done another amazing job in making the process simple to implement, as well as very easy to use within the report. I know that users very often ask if there is the ability to drill down. I also like the fact that you can filter your information once you have drilled down, which makes the entire report experience easy and quick, and keeps the users really happy.

You can read all about it here: Power BI Weekly Service Update

Power BI – Content Pack tyGraph

Another week, another great Content Pack.

This time it is tyGraph, which can be used for reporting on Yammer, something I know more companies are looking to use, especially if it is part of their Office 365 subscription.

You can read all about it here: Analyze and Monitor your tyGraph Data with Power BI

SQL Server 2016 – CTP 2.3 – SSAS Tabular with Direct Query

This was a very interesting blog post to read, because in the past I never really thought of using DirectQuery mode with SSAS OLAP or Tabular, as it did not perform well.

From the blog post, they have made some significant improvements in SQL Server 2016 CTP 2.3, and it is nice to see that SSAS is finally getting some attention. Looking at this example, it would allow the report to be executed in real time: if your source data is updated at a regular interval, going via SSAS Tabular to your source SQL system means you would be able to get up-to-date data. Combined with it being so much quicker, this really has the potential to be a game changer for a lot of organizations.

You can read all about it, as well as the details here: SQL Server Analysis Service 2016 CTP 2.3 DirectQuery in action

SQL Server 2016 – CTP 2.3 – SQL Server Data Tools (SSDT)

Just a quick note that they have updated how SSDT will be working going forward.

It appears that they have unified the setup for Database as well as BI (Business Intelligence). It makes perfect sense as the two work hand in hand.

You can read about it here: SQL Server Data Tools Preview update for August 2015

SQL Server 2016 – CTP 2.3 – SQL Server Reporting Services (SSRS)

It is nice to finally see some actual changes and new things happening in SSRS. I am sure I am not the only one who agrees that it has been almost too long since SSRS got an update. I personally was getting to the point where I was using SSRS as a last resort, but with all the upcoming changes I am sure this will again become a real option as a reporting platform.

As you can see from above, it looks to me as if it will be along similar lines to Power BI Desktop, which I think is great, so that report developers as well as people using Power BI get a similar reporting experience.

It is also good to see that, due to the way they have built the new version of SSRS, it is supported in pretty much any browser.

You can find out more about it here: What’s New in Reporting Services in SQL Server 2016 CTP 2.3

Power BI – Personal Gateway Update

It is great to see another update to the Power BI Personal Gateway, along with the updates for SSAS OLAP (Multidimensional & Tabular) and support for custom ODBC drivers, which means that you can connect to almost any database source.

This is a great way to ensure that the data in your Power BI reports is up to date and relevant. As I wrote earlier, if your users are on the Power BI mobile app, it also means that they can get notifications on the fly, which enables them to make the right decisions at the right time.

You can find out more information here: New version of Personal Gateway now available!

BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen in pictures on Twitter or read about, and since the preview has not even been released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing. Even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What this means, in my mind, is that you can now get data scientists to interact with and test their R scripts against the data as it sits in the transactional environment, and from there this could be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic and will mean now that virtually any data source can be consumed into SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has needed an overhaul for some time, and it seems that we are finally going to get it. It would appear that the parameter panel is going to be updated; I do hope that it will be more interactive and react in a way similar to the slicers in Excel. We are also getting new chart types, and here again I am going to assume that they will be similar to what we have in Power BI!

And finally, there was also mention that reports will render quicker, as well as having a better design. I am also hoping that they will build this on HTML5, so that it can be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. Putting this into the SQL Server engine means that it is a whole lot simpler to query unstructured data. Using T-SQL to query this data means that the many people who have invested time and effort into SQL Server can now leverage that investment using PolyBase. Along with this, you do not have to extract the data from Hadoop or another format into a table to query it; you can query it directly and then insert the rows into a table, which means development time is that much quicker.
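
To make this concrete, below is a minimal, hedged T-SQL sketch of the pattern described above. The external table name and columns are hypothetical, and it assumes the external data source, file format and external table have already been created with the usual PolyBase CREATE EXTERNAL statements.

-- Query the Hadoop-backed external table directly with plain T-SQL
SELECT TOP (100) *
FROM dbo.ext_WebClickStream;

-- Persist only the rows you need into an ordinary SQL Server table
SELECT SessionID, PageUrl, ClickDateKey
INTO dbo.Staging_WebClickStream
FROM dbo.ext_WebClickStream
WHERE ClickDateKey >= 20160101;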

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to leverage their existing work on In-Memory OLTP and the columnstore index. They have mentioned that in their testing this amounts to a 30x improvement with In-Memory OLTP, as well as up to 100x for the in-memory columnstore. This is really amazing and makes everything run that much quicker.
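
As a hedged illustration of that combination (the table and column names below are made up, and the database needs a memory-optimized filegroup first), a SQL Server 2016 table can be both memory-optimized and carry a clustered columnstore index, which is what enables the real-time operational analytics scenario:

CREATE TABLE dbo.SalesOrder_InMemory
(
    SalesOrderID  int    NOT NULL PRIMARY KEY NONCLUSTERED,
    OrderDateKey  int    NOT NULL,
    CustomerKey   int    NOT NULL,
    SalesAmount   money  NOT NULL,
    -- Columnstore index over the same rows for the analytical queries
    INDEX ccix_SalesOrder CLUSTERED COLUMNSTORE
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);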

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. What it mentions is that, firstly, 85% of current SAP customers are not likely to deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that making the change would require companies to almost start the implementation again. In any business where time is money, this is a significant investment.

Compare this to what Microsoft has done with in-memory tables and columnstore indexes, which can be used together: while some additional work is required, on the whole it is quick and easy to make the changes. Couple this with what Microsoft has been doing with Microsoft Azure, and it makes it easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that at least SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting upgrades in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS, there are also some new features, namely they are also going to be integrating Power Query into SSIS. This is once again wonderful news, as it means now that SSIS can also get data from virtually any source!

Power BI

Once again the guys within the Power BI team have been really busy and below is what I have seen and read about in terms of what has been happening within Power BI

Office 365 Content Pack

I would say that there are a lot of businesses using Office 365 in some form or other, so it makes perfect sense for Microsoft to release a content pack for Office 365 administration.

As you can see from the screenshot below, the dashboard gives a quick overview of what activity is happening, as well as details of other services. I am sure that this will make an at-a-glance view of your Office 365 systems really easy, and any potential issues can also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online. I personally do not currently use it, but it does look great for once again getting a great overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, this is what you can view once you have it set up within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I do not think there are many other software companies that are doing this currently, let alone rolling it out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great as it will allow the report authors to have more control in terms of how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know personally from working with Power BI reports that sometimes you just want to see what a different layout looks like, or try adding another element. The Undo / Redo buttons save that little bit of time and make the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter, which someone must have taken at the Microsoft Ignite Conference. As you can see it is not very clear, but it does show the next update in the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that in the Power BI service, there will be support for SSRS files, namely the .rdl files! Here is another picture taken from Microsoft Ignite.

 

Then there is the news that many-to-many relationships and bi-directional cross filtering will be supported in SQL Server 2016 tabular models, which I am sure will also be included in the new Power BI backend, as this is where it stores all the data.

 

For hybrid BI, there will be support for live querying of SSAS; currently this is already in place for SSAS Tabular.

 

It also looks like there will be a scheduled refresh for SQL Server databases as a source, which is great for people who do not have SSAS cubes, or who just want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related to BI, in that with Azure Stack you can get the Azure functionality on your own hardware, which is fantastic. I am sure that for a lot of businesses this will be welcomed.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter

 

SharePoint 2013 – Refreshing an Excel workbook with a direct connection to a SQL Server Analysis Services (SSAS) cube

I have not blogged in a while, due to moving countries and starting a new job. So I do hope that this post will help someone, or at least show how easy it is to use SharePoint 2013 to refresh data in an Excel spreadsheet.

What I needed to do was use an existing Excel spreadsheet which connected directly to an SSAS cube, and then have SharePoint manage the refreshing of the data. In the past I thought that this was only applicable to Power Pivot Excel workbooks, but after today I realized that you can do this directly against your SSAS cube.

 

Getting the location of where you will store your Data Connections in SharePoint

The first thing that you need to do is get the location in SharePoint where your connection file will be stored, so that you can reference it from your Excel workbook.

You will also need to have a Data Connections library created in your SharePoint site.

 

Example:

·         In our Example we are going to be connecting our Existing Excel Workbook, to a SQL Server Analysis Services (SSAS) cube, using a data connection that is stored within SharePoint.

·         Once we have created our connection, we are then going to use the PowerPivot Refresh within SharePoint to refresh our Excel Workbook from the cube.

 

Assumptions:

·         This is based on SharePoint 2013 Enterprise Edition and SSAS 2012

·         We are going to assume that you have already got your SharePoint site set up.

·         We are also going to assume that you have already created your Excel Workbook, which connects to a cube.

·         We are also going to assume that you have either created a Documents Library, or are going to use an existing Documents Library.

·         And then you have uploaded your Excel Workbook to your documents library.

 

 

NOTE: You will be required to have Owner rights to do the following below within your SharePoint site.

 

1.       Log into your SharePoint site and click on Site Settings

2.       Then click on Data Connections

a.       clip_image001

3.       Once this opens you will need to copy everything before the /Forms/AllItems.aspx

4.       As with our example we copied the following:

http://SharePointBIWebSite/sites/wcsa/Data%20Connections

5.       Now either save the above link or copy the link which will be used in the next steps.

 

Changing our Excel File to use the stored connection within SharePoint

1.       Open your Excel File from your SharePoint location

a.       NOTE: The easiest way is to navigate to where you uploaded your Excel file and then select Open in Windows Explorer

2.       Then open it in Excel

3.       Then click on Data, and click on Connections

a.       clip_image002

b.      Then click on Properties

4.       Once the Connection Properties Window opens click on Definition

5.       Next, where it says Connection Name, change this to something more meaningful and possibly shorter than the default.

a.       We changed ours to the following name:

b.      clip_image003

6.       Now at the bottom where it says Export Connection File click on the button

a.       clip_image004

7.       This will then open the File Save Window

a.       Now at the top where it asks you the location of where you want to save the file click on the Drop down in the Address Bar and put in the URL which we either saved or Copied in Step 4  above and paste it:

                                                               i.      clip_image005

                                                             ii.      Now where it says File Name you can either leave this with the default, but what I recommend is changing it to then match your Connection Name, as we did with our example:

1.       clip_image006

                                                            iii.      Then click Save

b.      This will then open the Web File Properties Window, where it asks for some more information.

                                                               i.      Once again we checked to ensure that our Title Matched our Connection File Name

                                                             ii.      clip_image007

c.       Then click Ok.

8.       Now when you go back to your Connection Properties window you will see that your connection file has changed to the location of the ODC file saved in your SharePoint Data Connections library.

9.       The next thing that you need to do is to make sure you put a tick in the box "Always use connection file".

a.       NOTE: This is so that whenever and wherever the Excel spreadsheet is used, it will always use this connection file

b.      NOTE 2: This is so that when it is uploaded or run from SharePoint it will then use the associated ODC file.

c.       clip_image008

10.   The next thing that you need to do, is to configure the Excel Services Connection.

a.       NOTE: This is required so that SharePoint can use an Excel Services connection to authenticate against the SSAS Cube and actually refresh the data.

b.      NOTE 2: You will have to ensure that the person responsible for your SharePoint installation has configured the Secure Store Service (SSS), as well as that the domain account that is linked to the SSS has access to the SSAS cubes.

c.       Click on the Authentication settings.

                                                               i.      Now in the Excel Service Authentication Settings screen configure it with the following below

1.       clip_image009

2.       NOTE: The name of our SSS ID is ExcelDataConnection

                                                             ii.      Then click Ok.

11.   Now when you click OK it is going to go and refresh all your sheets within your current Excel Workbook.

12.   You can now save and close your Excel Workbook.

 

Configuring your Data Refresh in SharePoint for your Excel Workbook

In the steps below we will now configure our Excel Workbook to have a scheduled refresh, so that when people open it up it will have the latest data based on the refresh.

 

1.       Now go to SharePoint where you have got your Excel spreadsheet uploaded.

2.       Click on the Open Menu Ellipses, then the Ellipses again, and finally Manage PowerPivot Data Refresh

a.       clip_image010

3.       This will then open the Manage Data Refresh web page.

a.       You can then configure it with the following:

b.      Under Data Refresh, you must Enable it.

                                                               i.      clip_image011

c.       You can then select  your Schedule Details based on your requirements

                                                               i.      NOTE: If you want to test this now, you must select the Also refresh as soon as possible tick box

                                                             ii.      clip_image013

d.      For the Earliest Start time select when you want the data to be refreshed.

                                                               i.      clip_image014

e.      For the E-mail notifications, this is for people you want to notify if the refresh fails.

                                                               i.      clip_image015

f.        Under Credentials, this is where you MUST specify the same SSS that you configured in your Excel Authentication settings

                                                               i.      As with our example we put in the following:

                                                             ii.      clip_image016

g.       Then finally for the Data Source, it should be configured with your Existing Data source you configured earlier

                                                               i.      clip_image017

h.      Then click Ok.

4.       Now if you selected step 3c above, you can wait a few minutes and see if it worked by going back into Manage PowerPivot Data Refresh and you should see the following:

a.       clip_image018

5.       Now you have completed the data refresh when connecting to an SSAS Cube via SharePoint

SQL Server Analysis Services (SSAS) – Updating Project with Partition information

I am sure that this has happened to someone else before. You are making a change to your SSAS cube, and within your SSAS project you have created your initial partitions, but on your production server you have programmatically added additional partitions. Now, by mistake or just not thinking, you deploy your project, and when it prompts to overwrite your current database, you click YES.

 

Now your production SSAS cube has all the wrong partitions, so you have to go about creating them again and processing them again.

 

So below are the steps that I do, before I make changes to my SSAS project, so that if I happen to deploy it by mistake I will not have to recreate the partitions. You will still have to process them again, but it does save the hassle of having to re-create them all.

 

Example:

·         Our current Internet Sales Partition has the following partitions created on our Production Server

o    clip_image002

·         We are going to manually create a new Partition called:

o    Internet_Sales_2009

·         Then we are going to go through the manual steps to get this partition information into our existing SSAS Project.

o    So that when we are finished we will see our Internet_Sales_2009 Partition within our SSAS Project.

o    Currently the Project looks like this:

o    clip_image004

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

NOTE: We are using SQL Server 2014, and SSDT for Visual Studio 2013

 

Creating new Partition on Server

1.       What we did was to script out our current partition and then modify it to create one for the year 2009

2.       Below is a snippet of where we made the changes

a.        clip_image006

3.       Once we ran this we then could see our Partition for the year 2009

a.        clip_image008

 

Creating new SSAS Project and importing SSAS Database

In the steps below we are going to create a new SSAS Project and then import our SSAS database into our Project.

 

1.       Within SSDT we are going to create a new Project with the following:

a.        clip_image010

2.       Give your project a name.

a.        As with our example we gave it the name of Adventure Works – Production Import

3.       This will then start the Import Analysis Services Database Wizard

a.        Click Next on the first screen

4.       On the Source Database screen put in the details to your Server and select your database as with our example shown below:

a.        clip_image012

b.       Click Next

5.       This will then import everything from your server.

6.       And once complete it will look like the following below:

a.        clip_image014

b.       Click Finish

7.       If you now go to our Adventure Works Cube, click on Partitions you should see the following under the Internet Sales Measure Group

a.        clip_image016

8.       When it first loads it does not update the Adventure Works.partitions file

9.       You need to do the following to put the XML data into the Adventure Works.partitions file

a.        Click on Build and then Build Adventure Works – Production Import

b.       Once this is done you will then see that your Adventure Works.cube has an asterisk and needs to be saved:

c.        clip_image018

d.       Click Save.

10.    Now you can verify that your Adventure Works.partition file has the information within the file by its file size:

a.        clip_image020

11.    Now you can close this project down.

 

Changing the Partition information on our current SSAS Project

What we are going to do below is take the partition file from the project we created above (Adventure Works – Production Import) and swap it into our current SSAS Project, so that when we open our current SSAS Project it will reflect the additional partition (Internet_Sales_2009).

 

1.       Go to the location where your current SSAS Project is.

2.       Then make sure you go into the details where you can actually see all your project files.

3.       IN our example it would be in the following location:

a.       C:\Users\DomainUser\My Documents\Projects\Adventure Works DW 2012\Adventure Works DW 2012

4.       And it will look like the following:

a.        clip_image022

b.       NOTE: You will see above the partition information stored in the Adventure Works.partitions

c.        NOTE II: Every cube that you create will always have a .partitions file, even if you have not created any partitions

5.       Now rename your Adventure Works.partitions file to Adventure Works.partitions.Backup_20140723

a.        NOTE: This is so that we know when we made the change.

b.       It will now look like the following:

c.        clip_image024

6.       Now go to the location where you created your Import project (Adventure Works – Production Import)

7.       In our example it would be in the following location:

a.       C:\Users\DomainUser\My Documents\Projects Adventure Works – Production Import\Adventure Works – Production Import

b.       In this folder copy the Adventure Works.partitions file

c.        NOTE: You will see it should be larger than our screenshot in step 4 above:

d.       clip_image026

8.       Now go back to your folder location of your current SSAS Project. (which we have in step 3 above)

a.        Then paste the Adventure Works.partition file into the folder.

b.       NOTE: You should be able to paste it without any issues due to renaming the current partition file in step 5

9.       Now open your current SSAS Project, go into the Adventure Works cube, go to Partitions, and check that you can see the new partition.

a.        clip_image028

 

Now, if you do deploy your project by mistake, at least the partition information is up to date.

SSIS – Dropping Partitions in SQL Server Analysis Services (SSAS) with XMLA and Analysis Services DDL

Below are the steps that we have integrated into SSAS using SSIS so that we can then drop our old SSAS Partitions using SSIS and XMLA.

 

Example:

·         We are going to drop our oldest partition from the Measure Group called Fact Internet Sales 1, which is in our Adventure Works cube.

·         The actual Cube partition name is called:

o    Internet_Sales_2005

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

 

Getting SSAS Partition Details

You can reference the following section in Analysis Services Quick Wins, which explains how to extract and insert our SSAS Partition details:

                SSAS (SQL SERVER ANALYSIS SERVICES) – GETTING ALL PARTITION INFORMATION FROM SSAS DATABASE

 

Getting Partition Name into Variable in SSIS

Below is how we will then drop our oldest SSAS Partition as per our example above.

 

1.       The first thing that we need to do is to find out our oldest Partition for our Measure Group called:

a.       Fact Internet Sales 1

2.       Once we have this we are then going to put this into a query, which we will then put into a variable in SSIS

3.       This is the query that we are going to use

Select top 1 ID as CubePartitionID
  FROM [AdventureWorksDW2012].[dbo].[Mart_TD_SSAS_PartitionDetails] with (nolock)
  Where [Parent_MeasureGroupID] = 'Fact Internet Sales 1'
  Order by ID  -- oldest partition first (assumes the IDs sort chronologically)

a.        NOTE: For your particular Partition scheme you might have to change your query to get back the partition ID you expect

b.       As per above we get the following result in SQL Server Management studio (SSMS)

c.        clip_image002[4]

4.       Next in SSIS we will create the following variable as shown below:

a.        clip_image004[4]

5.       Next what we do is to assign our query from step 3 above to our variable.

a.        NOTE: Ideally you would want to put your query into a Stored Procedure (a hedged sketch of one is shown at the end of this section).

b.       We drag in an Execute SQL Task, and we rename it to the following:

                                                               i.      clip_image006[4]

c.        Then we configured the General Tab with the following

                                                               i.      clip_image008[4]

d.       Then click on the Result Set on the left hand side and configure it with the following Result Name and Variable Name as shown below:

                                                               i.      clip_image010[4]

e.       Then click Ok.

f.         Now you can run and test to make sure that you can get your variable correctly.
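
As mentioned in the NOTE in step 5a, the query would ideally live in a stored procedure. Below is a minimal sketch of what that could look like; the procedure name is hypothetical and it simply wraps the query from step 3:

CREATE PROCEDURE [dbo].[usp_GetOldestCubePartitionID]
    @MeasureGroupID varchar(300)
AS
BEGIN
    SET NOCOUNT ON;

    -- Returns the oldest partition ID for the given measure group,
    -- assuming the partition IDs sort chronologically by name
    SELECT TOP 1 ID AS CubePartitionID
    FROM [AdventureWorksDW2012].[dbo].[Mart_TD_SSAS_PartitionDetails] WITH (NOLOCK)
    WHERE [Parent_MeasureGroupID] = @MeasureGroupID
    ORDER BY ID;
END
GO

The Execute SQL Task would then simply call EXEC [dbo].[usp_GetOldestCubePartitionID] 'Fact Internet Sales 1' and map the result set to the CubePartitionID variable exactly as in the steps above.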

 

Getting XMLA to drop SSAS Partition and put into SSIS

In the following steps we will generate our XMLA and then using this put it into SSIS so that we can then automate this.

 

1.       Go into SSMS and go to your SSAS Cube.

2.       As with our example this was our Adventure works cube.

3.       We then navigated to the following as shown below:

a.         clip_image012[4]

4.       Now right click on the Internet_Sales_2005, Script Partition as, Delete To, New Query Editor Window as shown below.

a.        clip_image014[4]

5.       You will now see the following in SSMS

a.        clip_image016[4]

6.       Next we now need to go into SSIS and create the following variable

a.        clip_image018[4]

7.       Next we need to take our XMLA delete statement and put this into a TSQL Query syntax so that we can then use this to populate our variable (XMLAQuery_DropSSASPartition)

8.       This is how we did it:

a.        The first thing to do, with your script from step 5 above, is to go into Find and Replace and do the following:

                                                               i.      clip_image020[4]

b.       NOTE: This is so that when we put this into SSIS and load it as an expression it will not invalidate it due to the double quotation.

c.        Now we put in our TSQL Query syntax as shown below

image
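
Since the screenshot above may be hard to read, here is a hedged reconstruction of the kind of T-SQL that the XMLAQuery_DropSSASPartition variable resolves to once the SSIS expression has been evaluated (the Database and Cube IDs are illustrative): a SELECT that returns the scripted XMLA Delete command as a single string, with the CubePartitionID value substituted into the PartitionID element.

SELECT
'<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW 2012</DatabaseID>
    <CubeID>Adventure Works</CubeID>
    <MeasureGroupID>Fact Internet Sales 1</MeasureGroupID>
    <PartitionID>Internet_Sales_2005</PartitionID>
  </Object>
</Delete>' AS XMLAScript_DropSSASPartition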

9.       Now go into your SSIS package and next to the variable XMLAQuery_DropSSASPartition click on the Ellipses button

a.        Now we configured our expression with the following shown below:

image

b.       As you can see above we have encapsulated this in our SSIS Expression

c.        What we have also done is to insert our CubePartitionID variable into our expression

                                                               i.      It is highlighted in RED above.

d.       You can click on Evaluate Expression to ensure that everything is correct.

e.       Click Ok to insert our expression

10.    Next, we need to get the result of the query above into another variable, so that it can then be passed to an Analysis Services Execute DDL Task to actually drop the partition, by doing the following below:

a.        First we need to create a variable which will hold our XMLA syntax once it has been populated from step 9 above.

b.       We gave it the following name:

c.        clip_image022[4]

d.       Next what we then need to do is to use an Execute SQL Task to then get our XMLA script populated.

e.       We dragged in an Execute SQL Task and gave it the following name:

                                                               i.      clip_image024[4]

f.         Then we configured our Execute SQL Task with the following to get our data from our variable in step 9 above.

                                                               i.      clip_image026[4]

                                                              ii.      NOTE: AS you can see we set the SQLSourceType to a variable.

1.       And then used our variable name from step 9

g.        Then click on Result Set and we configured it with the following:

                                                               i.      clip_image028[4]

h.       Then click Ok.

11.    Now the final part is to drag in our Analysis Services Execute DDL Task and configure it to connect to our cube, and then use our script from step 10 above.

12.    Now we just need to configure the Analysis Services Execute DDL Task by doing the following.

a.        Drag in the Analysis Services Execute DDL Task, double click to go into the Properties

                                                               i.      Click on DDL

b.       On the right-hand side, where it says Connection, you are going to have to create a connection to your SSAS server and database

                                                               i.      Click on New Connection

c.        As with our example we created our connection, which you can configure to your SSAS Cube.

d.       Now where it says SourceType, click on the Drop down and select Variable

e.       Then next to Source click on the Drop down and select the Variable that we populated in the previous steps.

f.         As with our example we selected the following:

                                                               i.      User::XMLAScript_DropSSASPartition

g.        It should look like the following:

h.       clip_image030[4]

i.         Then click Ok.

Order of Control Flow Items

The final step is to now order everything correctly in our Control Flow.

 

1.       Below is how we have ordered our SSIS Package

2.       clip_image032[4]

3.       NOTE: You will see that we are truncating our Mart_TD_SSAS_PartitionDetails table.

a.        This is because we want to keep it up to date.

4.       NOTE 2: You will see that even though we started with how to get our PartitionDetails, we still put this at the end, so that once our SSAS Partition has been dropped we have the correct details.

5.       Finally, run your SSIS package and it will drop your oldest partition.

SSAS (SQL Server Analysis Services) – Getting all partition information from SSAS Database

I had a requirement where I wanted to find out and keep a constant record of exactly how my partitions were set up and created within SQL Server Analysis Services (SSAS). So below is a script that I found somewhere (if I find the source I will put it in here), along with how I inserted the results into a SQL Server table so that I can use them for the creation and dropping of SSAS Partitions.

 

Example: We are going to be getting back all our SSAS Partition information from our AdventureWorksDW2012 cube.

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Installation of ASSP for Analysis Services

1.       If you do not have it installed already, you will first need to install the ASSP (Analysis Services Stored Procedures) assembly for Analysis Services.

2.       You can get the files from here:

https://asstoredprocedures.codeplex.com/releases/view/79180

3.       And then in order to install you can follow these easy to install instructions:

https://asstoredprocedures.codeplex.com/wikipage?title=Installation%20Instructions

 

Configuration and getting the partition data

 

1.       The first thing that we did was to check to ensure that when running our MDX query it would return the required results from our cube.

2.       In SQL Server Management Studio (SSMS) we went into our Analysis Services, then went to our AdventureWorksDW2012 cube.

a.       Then right click, select New Query and then MDX

b.      clip_image001

c.       Once this opens run the following query:

call assp.DiscoverXmlMetadata("Partition")

d.      Now execute the query and you should see the following as shown below:

e.      clip_image002

f.        NOTE: There are a whole host more columns with a lot of valuable information.

                                                               i.      Below is often what I am most interested in, and the columns are on the right hand side

g.       clip_image003

3.       Then in order to store this information we are then going to use SSIS to Insert data from an MDX query into a SQL Server table.

a.       You can use this blog post below

b.      SSIS 2012 – INSERTING DATA INTO A SQL SERVER TABLE FROM AN MDX QUERY

4.       Then we created our table with the following syntax in order to get the data into our SQL Server Table as our destination.

CREATE TABLE [Mart].[TD_SSAS_PartitionDetails](
       [Name] [varchar](300) NULL,
       [ID] [varchar](300) NULL,
       [CreatedTimeStamp] [varchar](300) NULL,
       [LastSchemaUpdate] [varchar](300) NULL,
       [Description] [varchar](300) NULL,
       [LastProcessed] [varchar](300) NULL,
       [State] [varchar](300) NULL,
       [Type] [varchar](300) NULL,
       [AggregationPrefix] [varchar](300) NULL,
       [StorageMode] [varchar](300) NULL,
       [CurrentStorageMode] [varchar](300) NULL,
       [StringStoresCompatibilityLevel] [varchar](300) NULL,
       [CurrentStringStoresCompatibilityLevel] [varchar](300) NULL,
       [ProcessingMode] [varchar](300) NULL,
       [ProcessingPriority] [varchar](300) NULL,
       [StorageLocation] [varchar](300) NULL,
       [RemoteDataSourceID] [varchar](300) NULL,
       [Slice] [varchar](300) NULL,
       [EstimatedRows] [varchar](300) NULL,
       [AggregationDesignID] [varchar](300) NULL,
       [EstimatedSize] [varchar](300) NULL,
       [Parent_MeasureGroupID] [varchar](300) NULL,
       [Parent_CubeID] [varchar](300) NULL,
       [Parent_DatabaseID] [varchar](300) NULL
) ON [PRIMARY]
GO

a.        Once completed our SSIS Data Flow task looked like the following:

b.       clip_image004

5.       Now once we run the above SSIS package we see our data in our SQL Server table from step 4 above.

6.       NOTE: If you want to keep a historical record you could then take this data and put it into a Fact style table.

a.        But for our purposes we just truncated the above table and inserted our data again daily.

 

Usage for SSAS Partition Details

·         The main reason for us getting our SSAS Partition details into a SQL Server table, is so that we could then get a list of our current partitions for our SSAS database and cube.

·         We could then use our Partition details to find out our Max and Min Partitions, and also how many partitions we have (see the sketch after this list).

·         Another use, based on our Max Partition, is to ensure that we can create partitions for the data that we are going to load going forward.

·         Likewise we could also find out our Min Partition to drop older partitions of data.

·         And finally because all our data is stored in a SQL Server Table, we could then use this data within SSIS using XMLA to dynamically create partitions.

o    You can use this blog post below as a reference:

SSIS – CREATING NEW PARTITIONS IN SQL SERVER ANALYSIS SERVICES (SSAS) WITH XMLA AND ANALYSIS SERVICES DDL
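
As a small, hedged example of finding the Max and Min partitions mentioned above, once the partition details are in the table, the oldest and newest partitions per measure group can be found with a simple aggregate (this assumes that the partition IDs sort chronologically by name, e.g. Internet_Sales_2005 through Internet_Sales_2009):

SELECT [Parent_MeasureGroupID],
       MIN([ID]) AS OldestPartitionID,
       MAX([ID]) AS NewestPartitionID,
       COUNT(*)  AS PartitionCount
FROM [Mart].[TD_SSAS_PartitionDetails] WITH (NOLOCK)
GROUP BY [Parent_MeasureGroupID];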

SSIS – Creating new Partitions in SQL Server Analysis Services (SSAS) with XMLA and Analysis Services DDL

What I wanted to do was create a process in SSIS that creates new SSAS Partitions using XMLA, without writing any code.

 

NOTE: This is quite a long blog post, but it has all the details from start to finish to get this completed.

 

Example:

·         We will be using the AdventureWorks2012DW data warehouse and Cube.

·         We will be creating new Partitions starting from 01 Jan 2010 on our Internet Sales Measure Group.

o    When creating our Partitions we will be using query binding, because our underlying SQL Server table will be partitioned by DateKey (a sketch of the resulting query filter follows this list).

§  NOTE: Our SQL Server Table would also have a Clustered Index created on the DateKey.

o    For our Analysis Services Partition query we will be using the first day and the last day of the month in order to get the required data into our Partition.

o    Along with this, when we create our Partition we will also set the Slice property to further help Analysis Services query the data as quickly as possible.

o    So our Start DateKey will be 20100101

o    And our End DateKey will be 20100131

·         Our Partition naming scheme will be the Measure Group name plus the year and month

o    So for our example it will be Internet Sales 2010-01
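As a rough sketch (assuming the FactInternetSales query binding that gets scripted later in this post), the query behind the Internet Sales 2010-01 partition simply restricts the fact table by the Start and End DateKeys above:

-- Sketch of the query-binding filter for the Internet Sales 2010-01 partition.
SELECT *
FROM [dbo].[FactInternetSales]
WHERE OrderDateKey BETWEEN 20100101 AND 20100131;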

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Creating our Partitioned Table and starting point.

Below are the steps where we will create our table and our starting point for our example above.

 

1.       Below is the script that we will use on our AdventureWorksDW2012 SQL Server database; each section is explained below:

-- =======================================================================
-- 1. This will create our Table
-- =======================================================================
CREATE TABLE [dbo].[Mart_TD_Max_SSAS_PartitionDetails](
       [YearOfLastPartition] [int] NOT NULL,
       [MonthNameOfLastPartition] [varchar](30) NOT NULL,
       [YearMonthOfLastPartition] [varchar](30) NOT NULL,
       [DateTimeInsertedIntoTable] [datetime] NOT NULL,
       [PartitionName] [varchar](300) NOT NULL,
       [SSAS_DatabaseName] [varchar](300) NOT NULL
) ON [PRIMARY]

GO

-- =======================================================================
-- 2. Then we insert our starting point into our Partitioned Table
-- =======================================================================
Insert into [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
Select 2009 as [YearOfLastPartition]
      ,'December' as [MonthNameOfLastPartition]
      ,'2009-12' as [YearMonthOfLastPartition]
      ,getdate() as [DateTimeInsertedIntoTable]
      ,'Internet Sales - 2009-12' as [PartitionName]
      ,'Adventure Works' as [SSAS_DatabaseName]

-- =======================================================================
-- 3. Then we do a select to ensure that the details inserted are correct
-- =======================================================================
Select *
from [dbo].[Mart_TD_Max_SSAS_PartitionDetails]

a.        As you can see above, Section 1 is where we create our table.

b.       Section 2 is where we insert our starting data, which is easier to explain with the screenshot below:

                                                               i.      clip_image001

                                                              ii.      NOTE: We will be using the above columns to generate the values required when creating our next partition.

1.       You can view this in a later section of this document.

                                                            iii.      As you can see above, because we want our next partition to be for Jan 2010, we need to start our details one month earlier, i.e. December 2009.

 

Getting details from a stored procedure to be used in SSIS to create our next partition

In the steps below we are going to show the stored procedure and explain how we get our details for our next partition.

We will then use these details and in the next step put this all into SSIS to create our partition.

 

1.       Below is the entire script and each section is explained in the comments section

a.        NOTE: As a quick overview, based on the information in Mart_TD_Max_SSAS_PartitionDetails, we are getting next month's values for the following:

                                                               i.      Our Start and End DateKeys, the Year-Month for the Partition name, as well as the Year and Month name which are used in the proc.

b.       clip_image002

USE [AdventureWorksDW2012]
GO

/****** Object:  StoredProcedure [dbo].[prc_GetNextPartitionValues_ForSSAS_SQL]    Script Date: 2014-04-09 10:46:59 AM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- ===============================================================================================================
-- Disclaimer
-- The sample scripts are provided AS IS without warranty of any kind.
-- I further disclaim all implied warranties including, without limitation, any implied warranties of
-- merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance
-- of the sample scripts and documentation remains with you.
-- In no event shall I, its authors, or anyone else involved in the creation, production, or delivery of
-- the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business
-- profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use
-- of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the
-- possibility of such damages.
-- ===============================================================================================================

-- ===============================================================================================================
-- Author:       Gilbertq
-- Create date: 09 Apr 2014
-- Description:  This is where we get the data from the Current or Last Partition and then get into variables
--               the next partition values to create the SSAS Partitions and SQL Partitions if required in SSIS
-- ===============================================================================================================
CREATE PROCEDURE [dbo].[prc_GetNextPartitionValues_ForSSAS_SQL]

AS
BEGIN

   SET NOCOUNT ON;

-- ==================================================================================================================
-- 1. We are getting the Year Value of the last partition that was created from Mart_TD_Max_SSAS_PartitionDetails
-- ==================================================================================================================
Declare @YearOfLastPartition as Int
Set @YearOfLastPartition =
   (
          Select YearOfLastPartition
          from [dbo].[Mart_TD_Max_SSAS_PartitionDetails] with (nolock)
   )

-- ====================================================================================================================
-- 2. We are getting the Month Name Value of the last partition that was created from Mart_TD_Max_SSAS_PartitionDetails
-- ====================================================================================================================
Declare @MonthNameOfLastPartition as varchar(30)
Set @MonthNameOfLastPartition =
   (
          Select MonthNameOfLastPartition
          from [dbo].[Mart_TD_Max_SSAS_PartitionDetails] with (nolock)
   )

-- ====================================================================================================================
-- 3. Here we are getting the Actual Date from our previous values so that it can be used in the next steps.
--    Below we are always getting the last day of the month for our current values.
-- ====================================================================================================================
Declare @MaxDateForCurrentPartition as Date
Set @MaxDateForCurrentPartition =
   (
          Select top 1 [FullDateAlternateKey]
          from DimDate with (nolock)
          where CalendarYear = @YearOfLastPartition
          and [EnglishMonthName] = @MonthNameOfLastPartition
          order by [DayNumberOfMonth] desc
   )

-- ====================================================================================================================
-- 4. Now here we are getting the next date value, which will be the first day of the Next Month
-- ====================================================================================================================
Declare @MaxDateForNextPartition as int
Set @MaxDateForNextPartition =
   (
          Select DateKey
          from DimDate with (nolock)
          where [FullDateAlternateKey] = DATEADD(Day, 1, @MaxDateForCurrentPartition)
   )

-- ====================================================================================================================
-- 5. Next we are getting the Year value for our Next Month's value
-- ====================================================================================================================
Declare @YearForNextPartition as int
Set @YearForNextPartition =
   (
          Select CalendarYear
          from DimDate with (nolock)
          where DateKey = @MaxDateForNextPartition
   )

-- ====================================================================================================================
-- 6. Next we are getting the Month Name value for our Next Month's value
-- ====================================================================================================================
Declare @MonthNameForNextPartition as varchar(30)
Set @MonthNameForNextPartition =
   (
          Select [EnglishMonthName]
          from DimDate with (nolock)
          where DateKey = @MaxDateForNextPartition
   )

-- ====================================================================================================================
-- 7. Next we are getting the first day of Next Month's value
-- ====================================================================================================================
Declare @StartDateKeyForNextPartition as Int
Set @StartDateKeyForNextPartition =
   (
          Select top 1 DateKey
          from DimDate with (nolock)
          where CalendarYear = @YearForNextPartition
          and [EnglishMonthName] = @MonthNameForNextPartition
          order by [DayNumberOfMonth]
   )

-- ====================================================================================================================
-- 8. Next we are getting the last day of Next Month's value
-- ====================================================================================================================
Declare @EndDateKeyForNextPartition as Int
Set @EndDateKeyForNextPartition =
   (
          Select top 1 DateKey
          from DimDate with (nolock)
          where CalendarYear = @YearForNextPartition
          and [EnglishMonthName] = @MonthNameForNextPartition
          order by [DayNumberOfMonth] desc
   )

-- ====================================================================================================================
-- 9. Next we are getting the Year Month Value for Next Month's Partition
-- ====================================================================================================================
Declare @YearMonthForNextPartition as varchar(30)
Set @YearMonthForNextPartition =
   (
          Select top 1 convert(Varchar(4), CalendarYear) + '-' + RIGHT(REPLICATE('0', 5) + CONVERT(VARCHAR(2), MonthNumberOfYear), 2)
          from DimDate with (nolock)
          where CalendarYear = @YearForNextPartition
          and [EnglishMonthName] = @MonthNameForNextPartition
   )

-- ====================================================================================================================
-- 10. And finally we are returning our values for Next Month which we can then use in our SSIS package for multiple
--     loads
-- ====================================================================================================================
Select
           @StartDateKeyForNextPartition as StartDateKeyForNextPartition
          ,@EndDateKeyForNextPartition as EndDateKeyForNextPartition
          ,@YearMonthForNextPartition as YearMonthForNextPartition
          ,@YearForNextPartition as YearForNextPartition
          ,@MonthNameForNextPartition as MonthNameForNextPartition

   SET NOCOUNT OFF;
END

GO

2.       Run this against your SQL Server AdventureWorksDW2012 database.
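If you want to check the stored procedure on its own before moving on to SSIS, a quick test is simply to execute it; it should return a single row with the Start and End DateKeys, the Year-Month, the Year and the Month name for the next partition (January 2010 in our example):

-- Quick test of the stored procedure created above.
EXEC [dbo].[prc_GetNextPartitionValues_ForSSAS_SQL];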

 

SSIS to create new Partitions for SSAS

In the steps below we will use SSIS to create our new Partitions for our SSAS Cube.

 

1.       The first thing to do is to create the following variables which will be used for our entire SSIS Package:

a.        clip_image003

b.       NOTE: The reason that I set them all to the Data type of string is so that when they are used later in the expressions we do not have to cast the values.

                                                               i.      This is because all our expressions will be passed to either SQL Server or Analysis Services, which only accept them as strings.

                                                              ii.      We also put in defaults so that when we Evaluate the Expressions later we can see values.

2.       Next we are going to get all our variables for our stored procedure above and put them into the above variables.

a.        Drag in your Execute SQL task and as with our example we gave it the following name:

Get Variables for Next Months SSAS Partition for Internet Sales

b.       Next right click and select Properties.

c.        You will have to create your OLE DB connection to your SQL server where you have your AdventureWorksDW2012 database.

                                                               i.      clip_image004

d.       Then configure the General window with the following as shown below:

                                                               i.      clip_image005

e.       Then click on the Result Set on the left hand side and configure it with the following to map our variables from our Stored Procedure to our Variables in SSIS

                                                               i.      clip_image006

                                                              ii.      NOTE: All of the above will be used in different sections throughout the SSIS Package.

f.        Click Ok.

g.        Then right click and select Execute Task, to test that it runs correctly.

                                                               i.      You should see the following below:

                                                              ii.      clip_image007

3.       Now in the following steps I am going to explain how to manually create our Partition initially, so that we can then use it in our SSIS package in the next step.

a.        Log into your SSAS Server and go into the Measure Group and then Partition where you want to automate your partition creating.

b.       NOTE: As with our example we are going to create a new Partition on the Internet Sales Measure Group for December 2009

                                                               i.      clip_image008

c.        Right click on Partitions and select New Partition, this will open the Partition Wizard.

                                                               i.      We selected the Internet Sales Facts table as shown below and clicked Next

1.       clip_image009

                                                              ii.      On the Restrict Rows we selected Specify a query to restrict rows

1.       NOTE: We are doing this because we want to specify our start and end DateKeys.

2.       clip_image010

3.       We then scrolled to the end of the Query and put in the following as per our example

a.        clip_image011

                                                            iii.      We accepted the defaults until we got to the final window. And put in the details below.

1.       clip_image012

                                                            iv.      We then clicked Finish.

d.       You will now see the partition we created

                                                               i.      clip_image013

4.       Next is where we are going to script out our Partition so that we can then use this within SSIS

a.        Right click on the partition we created above and select Script Partition as, Create To, New Query Editor Window.

                                                               i.      clip_image014

                                                              ii.      You should see the following below (note that this is not the complete script):

1.       clip_image016

b.      Next you will first have to do a find and replace on all the double quotation marks so that they are escaped for SSIS.

                                                               i.      Press Control+H and complete the following below:

1.       clip_image017

                                                              ii.      Then click Replace All.

c.        If you have any single quotes in your Query Definition you will also have to change them to two single quotes so that the script parses correctly in TSQL.

                                                               i.      So as with our above example we also had to change the following from:

1.  + 'Line '  +

2.       To: + ''Line ''  +

d.       Now open a TSQL Query and first put in the following:

Select '' as XMLAScript_CreateSSASPartition

                                                               i.      Then take your XMLA script from above and insert it in between the single quotes above.

                                                              ii.      Now run the TSQL Select to ensure that it will run successfully

1.       clip_image018

                                                            iii.      NOTE: The reason that we parse this in TSQL is so that in SSIS we can then put in our variables as required (a heavily abbreviated sketch is shown below).
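As a heavily abbreviated, hypothetical sketch of what the wrapped statement ends up looking like (the real XMLA is the full Create Partition script from step 4a, with its double quotes already replaced by \" and any single quotes doubled):

-- Hypothetical, abbreviated sketch only; paste your full escaped XMLA between the quotes.
SELECT '<Create xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
          <ParentObject>...</ParentObject>
          <ObjectDefinition>...</ObjectDefinition>
        </Create>' AS XMLAScript_CreateSSASPartition;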

e.       Now take the entire TSQL statement from step 4d above and copy it.

f.         Now go back into SSIS and open your variables.

                                                               i.      Where it has the variable name of XMLAQuery_CreateSSASPartition click on the Ellipses under Expression

                                                              ii.      Then in the Expression Window put in a double quote at the start and insert your TSQL query from step 4d above:

1.       clip_image019

                                                            iii.      Then scroll right to the end and put in another double quote at the end

1.       clip_image020

                                                            iv.      Then click on Evaluate Expression to ensure that so far it is all correct.

g.        Now the final part in creating our script is to put in the required variables.

                                                               i.      The first section where we are going to add the variables is for the ID and name of our partition.

                                                              ii.      As with our example we put in the following:

1.       What it looked like before the change:

<ID>Fact Internet Sales 2009-12</ID>

<Name>Fact Internet Sales 2009-12</Name>

2.       And what it was with the variables inserted:

<ID>Fact Internet Sales "+ @[User::YearMonthForNextPartition]  +"</ID>

<Name>Fact Internet Sales "+ @[User::YearMonthForNextPartition]  +"</Name>

                                                            iii.      In the next section we are going to put in our Start and End DateKeys for our Query Definition.

1.       What it looked like before the change:

WHERE orderdatekey between 20091201 and 20091231</QueryDefinition>

2.       And what it looked like after the change:

WHERE orderdatekey between "+ @[User::StartDateKeyForNextPartition]  +" and "+ @[User::EndDateKeyForNextPartition]  +"</QueryDefinition>

                                                            iv.      Next we are going to add the Slice property to our Partition.

1.       So after the section below is where you will put in the Slice Property

<ProcessingMode>Regular</ProcessingMode>

2.       As you can see we have put in the Slice Property with the variables already inserted:

<ProcessingMode>Regular</ProcessingMode>

<Slice>{[Date].[Date].&amp;["+ @[User::StartDateKeyForNextPartition]  +"],[Date].[Date].&amp;["+ @[User::EndDateKeyForNextPartition]  +"]}</Slice>

                                                              v.      Now click on Evaluate Expression to ensure that everything is correct.

1.       If you scroll down you should see the following

a.        clip_image021

b.       NOTE: This is because we gave our variables default values in Step 1 above.

5.       Next we are going to be taking our script which has been populated with the details above and put this into a variable which can then be passed to our Analysis Services Execute DDL Task.

a.        Drag in an Execute SQL Task.

b.       We then gave it the following name:

                                                               i.      Get XMLA for SSAS to create new Partition

c.        Next right click and go into the Properties and configure it with the following as shown below:

                                                               i.      clip_image022

                                                              ii.      NOTE: As you can see above we are using the variable which we populated with our XMLA script in step 4 above.

                                                            iii.      We are also setting the Result Set to a Single row. This is so that we can populate our XMLA Script which will then be passed to Analysis Services.

1.       To be clear, this Execute SQL Task executes the query held in our XMLAQuery_CreateSSASPartition variable, which returns the populated XMLA.

2.       After which we want the output to be inserted into our XMLA Script which will then be used in Analysis Services.

d.       Now click on Result Set and configure it with the following below:

                                                               i.       Click On Add and select the following:

                                                              ii.      clip_image023

e.       Then Click Ok.

6.       Next we will need to configure our Analysis Services Execute DDL Task with the following steps below.

a.        Drag in the Analysis Services Execute DDL Task

b.       Right click and select Properties to go into the Properties.

c.        Under Name we gave it the following name as per our Example:

                                                               i.      Run XMLA Script to create new SSAS Partition

d.       Then click on DDL.

                                                               i.      On the right hand side, where it says Connection, you are going to have to create a connection to your SSAS server and database.

                                                              ii.      Click on New Connection

                                                            iii.      As with our example we created our connection

1.       clip_image024

                                                            iv.      Then Click Ok Twice to get back.

                                                              v.      Then next to Source click on the Drop down and select the Variable that we populated in the previous steps.

                                                            vi.      As with our Example we are going to select the variable that we populated in step 5 above.

                                                           vii.      So once complete it will look like the following:

1.       clip_image025

                                                         viii.      Then Click Ok.

7.       Next we need to create our Update TSQL Statement for our Mart_TD_Max_SSAS_PartitionDetails table so that we can dynamically update this after the partition above has been created.

a.        Go into your variables and click on the ellipses button next to Query_Update_Mart_TD_Max_SSASPartitionDetails

b.       Then we put in the following for our Update Statement in the Expression Window:

"Update [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
 Set         [YearOfLastPartition] = "+ @[User::YearForNextPartition]  +"
                                ,[MonthNameOfLastPartition] = '"+ @[User::MonthNameForNextPartition]   +"'
                                ,[YearMonthOfLastPartition] = '"+ @[User::YearMonthForNextPartition]  +"'
                                ,[DateTimeInsertedIntoTable] = getdate()
                                ,PartitionName = 'Internet Sales "+ @[User::YearMonthForNextPartition]  +"'
                                ,SSAS_DatabaseName = 'Adventure Works'"

                                                               i.      Then click on Evaluate Expression to ensure that it is valid and you should see the following:

1.       clip_image026
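For reference, with the January 2010 example the evaluated expression should produce T-SQL close to the following sketch:

-- Sketch of the evaluated Update statement for the January 2010 example.
UPDATE [dbo].[Mart_TD_Max_SSAS_PartitionDetails]
SET  [YearOfLastPartition]       = 2010
    ,[MonthNameOfLastPartition]  = 'January'
    ,[YearMonthOfLastPartition]  = '2010-01'
    ,[DateTimeInsertedIntoTable] = getdate()
    ,PartitionName               = 'Internet Sales 2010-01'
    ,SSAS_DatabaseName           = 'Adventure Works'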

c.        Then click Ok.

8.       The next part is to update our Mart_TD_Max_SSAS_PartitionDetails with our last Partition that was created in the steps above. This is so that we have a starting point for the next time this runs.

a.        Drag in an Execute SQL Task and give it the following name:

                                                               i.      Update Mart_TD_Max_SSAS_PartitionDetails with Last Partitions created

b.       Then configure it with the following:

                                                               i.      clip_image027

                                                              ii.      NOTE: The variable that we selected is called: Query_Update_Mart_TD_Max_SSASPartitionDetails

c.        Then click Ok

9.       Now the final part is to send an email if the create partition script fails, so that the person responsible is made aware of the error.

a.        Drag in your Send Mail Task and drag it under your Run XMLA Script to create new SSAS Partition

b.       Drag the precedence constraint to your Send Mail Task and change it from success to failure.

                                                               i.      clip_image028

c.        Then configure your Send Mail Task as you require for the person responsible.

10.    So once complete it looks like the following:

a.        clip_image029

11.    Now that it is complete you can run the package.

a.        So when we looked at our Partitions before we ran the package it looked like the following:

                                                               i.      clip_image030

                                                              ii.      NOTE: The reason we already had the Fact Internet Sales 2009-12 partition was so that we could create our script.

b.       Now after running the package it completes as shown below:

                                                               i.      clip_image031

c.        Now if we look at our Partitions again we will see the following after refreshing it:

                                                               i.      clip_image032

d.       And then finally if we look at our Mart_TD_Max_SSAS_PartitionDetails it should have the details of our Partition we created above:

                                                               i.      clip_image033
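If you would rather check this in Management Studio than in the table viewer, a quick query against the table will show the same thing:

-- After a successful run the single row should now reflect the Internet Sales 2010-01 partition.
SELECT *
FROM [dbo].[Mart_TD_Max_SSAS_PartitionDetails];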

 

 

SSIS – Using SQL Server Analysis Services (SSAS) MDX query into Variable in Execute SQL Task

What I wanted to do was take the output from my MDX query and put it into an SSIS variable, which could then be used later in the SSIS package.

 

Example:

·         We are going to get the Internet Sales Amounts for 2 days.

o    01 June 2008 – $44,650.70

o    02 June 2008 – $35,496.03

·         Then we are going to put the two values into a variable.

·         Then we are going to use a Precedence Constraint to compare the values, and if the Internet Sales amount is lower, continue on to the next step.

o    So if the value on 02 June 2008 is lower than the value on 01 June 2008 continue to the next step.

o    So as from the above values it should continue onto the next step.

o    If the amount is higher then do not do anything.

·         NOTE: The reason that we are using an MDX query is so that we can get our data back as quickly as possible.

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

1.       The first thing that we are going to do below is get our MDX query which connects to the AdventureWorksDW2012 SSAS Cube.

a.        This query is for Internet Sales for 01 June 2008

Select nonempty {[Measures].[Internet Sales Amount]} on 0
,nonempty {(
                     [Date].[Date].&[20080601]
                     )} on 1
from [Adventure Works]


b.       This query is for Internet Sales for 02 June 2008

Select nonempty {[Measures].[Internet Sales Amount]} on 0
,nonempty {(
                     [Date].[Date].&[20080602]
                     )} on 1
from [Adventure Works]


2.       Now create your new SSIS Package.

3.       Next we will create our 2 variables that will be used to populate using the MDX queries above.

4.       Below is how we have created them:

a.        clip_image002[1]

b.       NOTE: We have created them as Int64 because we are getting back a numeric value.

                                                               i.      This also allows the values to be compared in our Precedence Constraint.

5.       Next we need to create an OLE DB Connection to our SSAS Cube by doing the following below:

a.        Right click in your Connection Managers section and select New OLE DB Connection

b.       When the window opens click on New

                                                               i.      Now from the drop down select the following:

1.       clip_image004[1]

2.       NOTE: You have to select the above so that we can connect to our SSAS Instance.

                                                              ii.      Now once configured you will see the following below:

1.       clip_image006[1]

                                                            iii.      If you want you can click on Test Connection to ensure that it can connect.

                                                            iv.      Then click Ok and then Ok again.

c.        You should now see the following in your Connection Managers Window:

                                                               i.      clip_image008[1]

                                                              ii.      NOTE: If required you can make this a Project Connection if your SSIS Project is in Project Deployment mode.

6.       Now what we are going to do is to assign our MDX Query to our variables using the Execute SQL Task.

7.       Drag in an Execute SQL Task and complete the following steps for One Day back

a.        As per our example I renamed the Execute SQL Task to the following:

Get Internet Sales Amount for One Day Back into Variable

b.       I went into the Execute SQL Task Properties and configured it with the following:

                                                               i.      clip_image010[1]

c.        We then selected our OLE DB connection that we created in step 5 above:

                                                               i.      clip_image012[1]

d.       Next under SQLStatement we are then using our MDX query from Step 1a above:

                                                               i.      clip_image014[1]

                                                              ii.      Then click Ok.

e.       Next click on the Result Set and click on Add

                                                               i.      Now where it says Result Name change this to the following:

[Measures].[Internet Sales Amount]

                                                              ii.      And then ensure that the Variable Name is:

User::InternetSales_OneDayBack

                                                            iii.      clip_image016[1]

f.         Then click Ok.

g.        Now right click and select Execute Task to test that the variable will be populated.

h.       clip_image018[1]

8.       Drag in an Execute SQL Task and complete the following steps for Two Days back

a.        As per our example I renamed the Execute SQL Task to the following:

Get Internet Sales Amount for Two Days Back into Variable

b.       I went into the Execute SQL Task Properties and configured it with the following:

                                                               i.      clip_image010[2]

c.        We then selected our OLE DB connection that we created in step 5 above:

                                                               i.      clip_image019[1]

d.       Next under SQLStatement we are then using our MDX query from Step 1b above:

                                                               i.      clip_image021[1]

                                                              ii.      Then click Ok.

e.       Next click on the Result Set and click on Add

                                                               i.      Now where it says Result Name change this to the following:

[Measures].[Internet Sales Amount]

                                                              ii.      And then ensure that the Variable Name is:

User::InternetSales_TwoDaysBack

                                                            iii.      clip_image023[1]

f.         Then click Ok.

g.        Now right click and select Execute Task to test that the variable will be populated.

h.       clip_image025[1]

9.       Now the final step is to define the step to go to if the value is lower.

10.    For our example we are going to put the next step as a Send Mail Task

a.        So we dragged in the Send Mail Task and configured it so that it can send an email if run.

b.       Next we dragged the success precedence constraints as shown below:

c.        clip_image027[1]

d.       NOTE: The reason it was done in this order is so that we can get both variables populated before we change the constraint options.

e.       Now double click on the success precedence constraint that goes from Get Internet Sales Amount for Two Days Back into Variable to the Send Mail Task

                                                               i.      clip_image029[1]

f.         Now in the Precedence Constraint Editor Window complete the following:

                                                               i.      Change the Evaluation operation from Constraint to Expression

                                                              ii.      clip_image031[1]

g.        Then where it says Expression click on the Ellipses button.

h.       Now, in order to continue only if the value is lower, we put in the following expression:

@[User::InternetSales_OneDayBack] <  @[User::InternetSales_TwoDaysBack]

                                                               i.      Click on Evaluate Expression to ensure that it is valid.

                                                              ii.      clip_image033[1]

i.         Then Click Ok and Ok again to go back to the Control Flow

11.    Now finally run your SSIS Package and it should complete with the following as shown below:

12.    clip_image035[1]

13.    NOTE: If you changed your Precedence Constraint to the following below and re-ran the package in order to test whether the value was higher, you would get the following result:

a.        clip_image037[1]