Automating the running of a PowerShell script which requires Administrator elevation via SSIS

UPDATE (08 AUG 2016): I have updated this post to use a SQL Server Agent job instead of the Task Scheduler, which proved not to be reliable.

What we are going to explain below is the process, via an example.

Example – We are required to run a PowerShell script which will snapshot a database using XtremIO from EMC. For this to run it has to be run as Administrator, because it has to drop and re-create the disk drives at the operating system layer.

So in order to complete this we have to complete the steps below, but using a simple PowerShell script which will get all the file names from a folder.

  • Create a BAT file which can execute the PowerShell script.
  • Create a SQL Server Agent job with an Operating system (CmdExec) step which will call the BAT file.
  • Schedule the SQL Server Agent job so that the snapshot runs as part of our load.

NOTE: The reason that we did this was to ensure that we could automate the snapshotting of the database using SSIS. This would enable us to get a snapshot of the data on a required schedule.

Changing the UAC

Below are quick steps to disable UAC, as it might otherwise affect the later steps when the BAT file tries to run the PowerShell script.

NOTE: You will need to be an Administrator on the Server to turn this off.

  1. Go into the Control Panel.
  2. Then search for UAC.
  3. Then click on the Change User Account Control settings as shown below:
  4. Change it to Never Notify as shown below:
  5. Then click Ok.
  6. It will ask you one last time if you want to make the change to the computer.
    1. Click on Yes.

Creating the BAT File which can execute the PowerShell script

In the steps below we are going to explain how we create the BAT file which executes the PowerShell script. We will also first run the PowerShell script manually to ensure that it works.

Testing the PowerShell Script

  1. The first thing to test is to ensure that the PowerShell script actually runs and completes as it should.
  2. Click on Start, type in PowerShell, and from the search results right click on Windows PowerShell ISE and select Run as Administrator.
  3. Now when the PowerShell ISE opens you can see from the window title at the top that you are running as Administrator.
  4. If you want to follow our example you can then copy and paste the following PowerShell script:

    Get-ChildItem -Path 'C:\Program Files\Microsoft SQL Server\MSSQL11.PRODBI\MSSQL' -Recurse | Export-Csv -Path 'C:\Program Files\Program Files.csv' -Encoding ascii -NoTypeInformation

    1. What we are doing above is getting the names of all the files and folders under our SQL Server MSSQL folder (which sits under Program Files).
    2. And then exporting them to a CSV.
    3. As with our example once run you should see the CSV file as shown below:
  5. Now that we know our script runs, we can save it to a location on our server.
  6. As with our example we saved it to a new folder on the C: drive.
  7. We will now use this in the next steps to create the BAT File.

Creating the BAT file which will run the PowerShell script

In the steps below we will be creating the BAT file which will run the PowerShell script.

  1. Open up Notepad or Notepad++ and copy in the following, which will be explained below:
    1. c:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy ByPass -command ". 'C:\XIOSnapshot\Export-MSSQL-Files.ps1'"
    2. What we are doing above is to first put in the full path to the PowerShell executable.
    3. Next we are putting in some parameters for PowerShell: bypassing the execution policy, and the command that is to be run.
    4. And then the final part is where we are calling the PowerShell script which we created above.
      1. NOTE: In order to run the command, we have to put it all in double quotes, and it has to start with a period (which dot-sources the script) in order for it to be executed.
    5. Now we can save this BAT file in the same location with a name.
    6. As per our example we saved it with the following:
  2. Now what we will need to do is to run the BAT file from the command line to test that it all works.
  3. You can now open command prompt and run the bat file as we have shown with our example below:
    1. NOTE: You can see that we are not logged into the command prompt as Administrator.
    2. We can also see that it completed successfully.
    3. And finally, if we look on the file system we can see the file got created.
  4. One final thing to note is that this ran when we executed the BAT file because we have administrator privileges.

Using SQL Server Agent Job to run BAT Files

Below is the process which will enable you to run your BAT files under the correct CMD.EXE context.

  1. In order to get this to work, we had to use a combination of SQL Server Agent steps and SSIS steps, which is explained below.
    1. First we needed to use our BAT file to run the PowerShell script which would snap the data.
      1. We configured this step in the SQL Server Agent job properties as shown below:
      2. As you can see from above, we changed the Type to "Operating system (CmdExec)".
        1. The reason for this is that it runs CMD.EXE in the correct context. (A T-SQL sketch of creating such a step follows after these steps.)
      3. And then we put in our BAT file which calls the PowerShell script which we created earlier.
    2. Next we ran the BAT file which modified the folder permissions.
      1. We configured this step in the SQL Server Agent job properties as shown below:
      2. As you can see from above, we changed the Type to "Operating system (CmdExec)".
        1. The reason for this is that it runs CMD.EXE in the correct context.
  2. You can then schedule your SQL Server Agent job; when I tested it, it ran successfully.
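
For reference, such a CmdExec step can also be created with T-SQL instead of through the job properties dialog. Below is a minimal sketch, assuming a job has already been created with sp_add_job; the job name and BAT file path are hypothetical:

    -- Add a CmdExec step to an existing SQL Server Agent job.
    -- Job name and BAT file path are assumptions for illustration.
    EXEC msdb.dbo.sp_add_jobstep
        @job_name  = N'XIO Snapshot',
        @step_name = N'Run snapshot BAT file',
        @subsystem = N'CmdExec',
        @command   = N'C:\XIOSnapshot\RunSnapshot.bat';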

SSIS – Creating a process which will check a condition, and if it is not ready, wait and try again

We had a requirement to create a wait-and-retry loop: if the condition was not ready it would wait and try again, repeating until it reached the maximum number of tries. We also did not want it to error whilst waiting.

Example:

  • Check to see if the database is online.
  • Wait for 1 minute before retrying.
  • Try 10 times before failing.
  1. The first thing that we need to do is to create some variables which will help with the process.
  2. From the above this is what they are for:
    1. Max_Count – This is how many times you want to retry.
    2. State – This is our check to see if the database is online or not.
    3. SuccessfulRun – This is used to see if it ran successfully.
    4. Value_Counter – This is used as the counter.
  3. Then below are the steps.
    1. The first thing is to assign a value to our Max_Count variable.
    2. This is done using an Execute SQL Task, configured with the following below:
      1. As you can see above, in the SQLStatement we have set our Max_Count to 10.
      2. Then in the ResultSet we have mapped our results to our variable.
    3. Next is where we configured our For Loop Container with the details shown below:
      1. As you can see above, we have assigned it our Value_Counter variable.
      2. You will note that in the EvalExpression we have got our Max_Count variable, as well as the SuccessfulRun variable not equal to zero. How this value is derived will be explained later.
        1. NOTE: This is also required so that the loop will exit when it is actually successful.
    4. Next in our process within the For Loop is where we start with our waiting period, as shown below, using an Execute SQL Task.
      1. All that we are doing here is using a T-SQL statement to wait for 1 minute (a sketch of the statements used in these steps follows after the list).
      2. NOTE: The reason we do this at the start is so that if it has to loop, it will wait on the second loop before checking.
    5. Next is where we have our condition (which in our example is to check if the database is online) as shown below:
      1. What our SQLStatement is doing above is checking the state of our database.
        1. If it is online it will have a state equal to zero.
      2. Then in the ResultSet we have mapped our results to our variable.
    6. Next is where we have our Script Task, which we use to populate the SuccessfulRun variable as shown below:
      1. We went in to edit the script with the following below:
      2. As you can see above, we also assigned the same variable SuccessfulRun to the ReadWriteVariables.
      3. NOTE: We used Microsoft Visual C# 2012 as the script language.
      4. From the above, all that we added was to set the variable SuccessfulRun = 1.
    7. And then we set our database back into multi-user mode after it was successfully back online, as shown below:
    8. The final thing that we had to do, which actually ensures that it loops through without erroring, is to put a precedence constraint between the Check if Database is Online and Check to see if Successful tasks, as shown below:
      1. We then configured the precedence constraint with the following, as shown below:
  4. So once this is all done it looks like the following:
  5. You can now run this, potentially changing the condition (Check if Database is Online) that you want to check for, as well as how long you want to wait before checking again.
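
For reference, below is a minimal T-SQL sketch of the statements used in the steps above; the actual statements in our package are only visible in the screenshots, so the database name here is a hypothetical placeholder:

    -- Give Max_Count its value (mapped to the variable via the ResultSet).
    SELECT 10 AS Max_Count;

    -- Wait for 1 minute at the start of each loop iteration.
    WAITFOR DELAY '00:01:00';

    -- Check the database state; in sys.databases, state = 0 means ONLINE.
    SELECT state AS State
    FROM sys.databases
    WHERE name = N'YourDatabase';

    -- Once the database is back online, set it into multi-user mode.
    ALTER DATABASE [YourDatabase] SET MULTI_USER;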

Automating the running of a PowerShell script which requires Administrator elevation via SSIS

What we are going to explain below is the process, via an example.

Example – We are required to run a PowerShell script which must run with elevated privileges.

So in order to complete this we have to complete the steps below, but using a simple PowerShell script which will get all the file names from a folder.

  • Create a BAT file which can execute the PowerShell script.
  • Create a Scheduled Task which in turn will call the BAT file.
  • Create an SSIS package which will call the Scheduled Task.

NOTE: The reason that we did this was to ensure that we could automate the snapshotting of the database using SSIS. This would enable us to get a snapshot of the data on a required schedule.

Changing the UAC

Below are quick steps to disable UAC, as it might otherwise affect the later steps when the BAT file tries to run the PowerShell script.

NOTE: You will need to be an Administrator on the Server to turn this off.

  1. Go into the Control Panel.
  2. Then search for UAC.
  3. Then click on the Change User Account Control settings as shown below:
  4. Change it to Never Notify as shown below:
  5. Then click Ok.
  6. It will ask you one last time if you want to make the change to the computer.
    1. Click on Yes.

Creating the BAT File which can execute the PowerShell script

In the steps below we are going to explain how we create the BAT file which executes the PowerShell script. We will also first run the PowerShell script manually to ensure that it works.

Testing the PowerShell Script

  1. The first thing to test is to ensure that the PowerShell script actually runs and completes as it should.
  2. Click on Start, type in PowerShell, and from the search results right click on Windows PowerShell ISE and select Run as Administrator.
  3. Now when the PowerShell ISE opens you can see from the window title at the top that you are running as Administrator.
  4. If you want to follow our example you can then copy and paste the following PowerShell script:

    Get-ChildItem -Path 'C:\Program Files\Microsoft SQL Server\MSSQL11.PRODBI\MSSQL' -Recurse | Export-Csv -Path 'C:\Program Files\Program Files.csv' -Encoding ascii -NoTypeInformation

    1. What we are doing above is getting the names of all the files and folders under our SQL Server MSSQL folder (which sits under Program Files).
    2. And then exporting them to a CSV.
    3. As with our example once run you should see the CSV file as shown below:
  5. Now that we know our script runs, we can save it to a location on our server.
  6. As with our example we saved it to a new folder on the C: drive.
  7. We will now use this in the next steps to create the BAT File.

Creating the BAT file which will run the PowerShell script

In the steps below we will be creating the BAT file which will run the PowerShell script.

  1. Open up Notepad or Notepad++ and copy in the following, which will be explained below:
    1. c:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy ByPass -command ". 'C:\XIOSnapshot\Export-MSSQL-Files.ps1'"
    2. What we are doing above is to first put in the full path to the PowerShell executable.
    3. Next we are putting in some parameters for PowerShell: bypassing the execution policy, and the command that is to be run.
    4. And then the final part is where we are calling the PowerShell script which we created above.
      1. NOTE: In order to run the command, we have to put it all in double quotes, and it has to start with a period (which dot-sources the script) in order for it to be executed.
    5. Now we can save this BAT file in the same location with a name.
    6. As per our example we saved it with the following:
  2. Now what we will need to do is to run the BAT file from the command line to test that it all works.
  3. You can now open command prompt and run the bat file as we have shown with our example below:
    1. NOTE: You can see that we are not logged into the command prompt as Administrator.
    2. We can also see that it completed successfully.
    3. And finally, if we look on the file system we can see the file got created.
  4. One final thing to note is that this ran when we executed the BAT file because we have administrator privileges.

Creating the Scheduled Task which will call the BAT File

Now we are going to create the Scheduled Task which will call the BAT File.

  1. Go into Server Manager, then click on Tools and select Task Scheduler
  2. Once this opens on the right hand side click on Create Task…
  3. The first thing that you have to give your task is a name.
    1. As with our example we are going to give it the name as shown below:
    2. Now a VERY important step: under Security options make sure to select the tick box as shown below:
      1. NOTE: This means that it will run with the highest privileges, in the Administrator context.
    3. Next, if you are going to run this via SQL Server Agent, you are going to have to put in the same domain account that SQL Server Agent runs under.
      1. NOTE: When you click Ok to create the Scheduled Task it will prompt for the password.
  4. You can leave Triggers blank, because we are going to execute this via SSIS.
  5. Click on Actions
    1. Then click New
    2. Then configure the New Action as shown below, with an explanation afterwards.
      1. Here we are just running our BAT File.
    3. Click Ok.
  6. You can now click Ok for this Scheduled Task to be created.
  7. NOTE: When you click ok for the creation of the Scheduled Task it will prompt for the password.
  8. The final step is to test the scheduled task by running it within Task Scheduler.
    1. Click on Run.
    2. You will briefly see the command prompt window pop up, which you will not see when running via SSIS.
    3. To confirm it ran, you can view the Last Run Result in Task Scheduler as shown below:
    4. As well as the file on the file system:
  9. Now we are done with the Task Scheduler calling the BAT file, which is in turn calling the PowerShell script.

Automating the running of the Scheduled Task in SSIS

In the steps below we are going to show how to automate the running of the Scheduled Task within SSIS

  1. Either create a new SSIS Project or create a new SSIS Package.
  2. Then drag in the Execute Process Task into the Control Flow.
  3. Double click to go into the Properties.
    1. We gave it the following name:
    2. Then click on Process to put in the details to call the Scheduled Task.
    3. Now what you will see below are the details to run the Scheduled Task, which will be explained afterwards:
      1. From the above, under Executable this is where the Scheduled Task executable is saved on our Windows Server 2012.
      2. Then under Arguments is where we specified for the Scheduled Task to firstly run (/run) and then the Task Name (/TN) to run.
      3. Then click Ok.
    4. You should then see the following in your SSIS package:
  4. Now to test that it all works, execute your SSIS package.
    1. It should then run successfully and you should see the following:
    2. NOTE: SSIS will come back immediately saying it was successful if the command is correct.
    3. So if you have something else that is reliant on this step, I would suggest putting in something to delay it going on to the next step.
      1. I did this by using WAITFOR DELAY in SQL Server (see the sketch after these steps).
    4. Now we can see the file being updated as shown below:
  5. The final step is to deploy this to your SSIS server, create the scheduled task, and test it.
  6. NOTE: If you have configured the Scheduled Task with the same domain account as your SQL Server Agent account it will succeed.
    1. NOTE: This domain account should not have a password that will change.
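
As a hedged example of the delay mentioned above, an Execute SQL Task placed after the Execute Process Task could run something like the following; the duration is an assumption and should be tuned to how long your task normally takes:

    -- Pause the package so the scheduled task has time to finish its work.
    WAITFOR DELAY '00:02:00';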

BI-NSIGHT – SQL Server 2016 CTP 3.0 (SSAS, SSRS, SSIS) – Power BI (Chiclet Visual, SparkPost Content Pack, Weekly Service Update, Personal Gateway Update, Tiles in SharePoint)

I expected this week to be a really interesting week with SQL PASS happening, as I was sure we would see some really good and interesting updates from Microsoft, and it is certainly living up to this.

There has been a lot of information on Twitter and on other blogs, so here is my take on the developments.

SQL Server 2016 CTP 3.0 (SQL Server Database Engine, SQL Server Analysis Services, SQL Server Reporting Services, SQL Server Integration Services)

There was a whole host of updates with SQL Server 2016 CTP 3.0, which is great to see, as well as some announcements of what we can expect in subsequent releases.

I am just going to highlight below what I think is relevant in the BI space. But there will be links below where you can find the related blog posts, which have more information from the Microsoft teams.

SSAS

With regards to SSAS, it is good to see how much effort and work is going into the Tabular model, which is what I thought would be the case.

I think that it is really great to see that they have changed the underlying structure from XMLA to JSON. The way that I see it, this is how they have implemented Power BI, in terms of having the SSAS database sitting in memory in Azure. Without a doubt they have learnt a lot from this, and they can now leverage that experience and bring it into the on-premise product. We all know how fast it is online!

The MDX support for Direct Query is also a great update. I can see a lot of people leveraging this, and when you partner it with APS you can pretty much start to enable real-time analytics, which can be a real game changer.

All the other updates that are coming into SSAS have mostly been completed either in Power BI Desktop or in Excel 2016. So it is great to see this in the Server product which will go a long way to ensure that it can scale and perform for enterprise workloads.

SSRS

I have eagerly been waiting to see what was going to happen in the SSRS space. And whilst I had seen some of the now released information it is great to see it being released to the general public. As well as how well it has been received.

The pinning of SSRS reports into Power BI is a really smart move. And the ability to also refresh this report in Power BI is pure Genius. What this means now is you can leverage both of your On Premise and cloud investments. And to the users this will be seamless.

What I also really like is that you can often create really interesting SSRS reports, and the executives and high level managers do not need to see the details. They just want an overview. And now by leveraging this all into Power BI, it becomes their one stop shop!

SSIS

There does not seem to have been a lot of love for SSIS, though to be honest it is a stable and really good product.

But what I did see is the Control Flow Template, and I am hoping that this is something similar to what you can currently do with BIML. At least that is how I perceived it to be. I am hoping that you can create different control flow templates for different control flows. So for example you could create a control flow template for an SCD Type 2. And then once you have it designed the way that you want, any other developers can then utilize it. This would go a long way in enterprises where you want to standardize the way of doing things.

You can read about all of the above here:

Power BI – Chiclet Visual Slicer

The one thing that I have been struggling with in Power BI was how to get a slicer to work, so that it looked good.

And lo and behold there is a new visualization which can now do this. And to have it with images also is really smart, as people love to click on images.

Another great announcement was from James Phillips that Microsoft would be releasing a new visualization every month, indefinitely. This is really great and I am sure that we will see some really interesting and useful visualizations in the future.

You can read all about it here: Visual Awesomeness Unlocked: The Chiclet Slicer

Power BI – SparkPost Content Pack

This week there is another interesting and great content pack, this time for SparkPost, which you can now use to monitor your email campaigns.

You can read about it here: Monitor Your SparkPost data with Power BI

Power BI – Weekly Service Update

Not only was there a host of announcements at SQL Pass, there was the weekly Power BI Service update.

Once again I am going to quickly highlight what there is in this week’s update.

They have made quite a few improvements to the way we can share dashboards in Power BI. All of these updates make it a lot easier to share a dashboard and to enable people to see how good Power BI is. The additions are sharing dashboards with AD groups, a people picker, and sharing with a large number of email addresses.

Along with this is the ability to start passing parameters into the URL. I have no doubt that passing URL parameters will keep on increasing and giving additional flexibility in the Power BI service.

You can read about it here: Power BI Weekly Service Update

Power BI – Personal Gateway Update

There was an update late last week for the Power BI Personal Gateway; it is mostly bug fixes and performance improvements. Which is great to see, because I do know that we often want it to run as smoothly and quickly as possible.

You can find more information here: New version of Personal Gateway is now live!

Power BI – Tiles in SharePoint

And finally the guys from DevScope have now created a Power BI Tile for SharePoint.

I think that this will work really well, because it will give the ability to showcase all the work done in your Power BI reports, as well as not having to re-create reports over and over again.

If you want to find more details and pricing, you can find it here: Power BI Tiles for SharePoint

SSIS – Using SSIS Variable within a SQL Script with Temp Tables – Error at Data Flow Task : No Column information was returned by the SQL Command

Below details what I was trying to do within SSIS and how I got it to work.

 

Overview

  1. I was using a SQL script which contained some temp tables that were used in the result set.
  2. When I then put this into an OLE DB Source I would get the error shown below:

 

  1. I then tried to see if it would work when I created a SQL Command from a variable.
    1. And this resulted in the same error as above.
  2. If I clicked on OK, I would then see the following in the Columns window as shown below:
  3. So below is a solution which I found to work to get the data from a SQL query that uses temp tables.

 

Example Data

For our solution below we are going to use the following sample data.

  • The first variable that we are going to be using will be for a DayNumber.
    • NOTE: This is because we want to go back in time, looping through the past 10 days.
  • The second variable that we are going to be using is our actual SQL query.
    • So this will be very simply put with the following below (see the hypothetical sketch after this list):
    • NOTE: This is a very simple query and I want to just use it as an example. I am certain that in most of your scenarios it will be a much more complex query that is giving you the above error within SSIS.
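
Since the actual query is only shown in a screenshot, below is a hypothetical example of the kind of temp-table query that triggers this error, written against the AdventureWorksDW-style dbo.FactInternetSales table (the table and filter value are assumptions). Because the result set comes from a temp table, SSIS cannot derive the column metadata up front, which is what produces the "No Column information was returned" error:

    -- Stage rows in a temp table before returning them.
    SELECT SalesOrderNumber, OrderDateKey
    INTO #RecentSales
    FROM dbo.FactInternetSales
    WHERE OrderDateKey >= 20050701;  -- placeholder for the DayNumber-based filter

    SELECT SalesOrderNumber, OrderDateKey
    FROM #RecentSales;

    DROP TABLE #RecentSales;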

 

Solution

  1. The first thing that you will need to do is to create an ADO.NET Source.
  2. Next you will need to ensure that you have two variables created:
    1. The first variable is the one that you want to pass into your SQL statement.
    2. The second variable is your SQL query.
  3. As with our example above, we created the following variables within SSIS:
    1. From the above you will see that we have created some additional variables, as explained below:
      1. DayNumber is the day number variable that we want to pass to our SQL query.
      2. StartNumber is the starting variable in our For Loop Container.
      3. EndNumber is the ending variable in our For Loop Container.
      4. Query_GetDayNumber is the query which is going to extract our day number and put it into a variable.
      5. Query_ExtractData is the SQL query where we are going to extract our data, including passing the variable.
  4. Next we will create and configure our For Loop Container with the following:
    1. NOTE: This is so that we can know when to exit out of our For Loop Container.
    2. NOTE II: Typically you will have your StartNumber and EndNumber populated by an Execute SQL Task so that they can always be dynamic.
  5. Next we will configure our variable for the Query_GetDayNumber so that it gets populated with the correct details as it loops through each time.
    1. Click on Variables, then where we have the Query_GetDayNumber click on the ellipsis button on the right hand side under Expression.
    2. This will then open the Expression Builder window.
    3. Now as with our example, we know that our number will always start at 1 and the DayNumber will always be the same as the StartNumber.
      1. NOTE: There will be situations where you will need to run an actual SQL query against your data to extract the required information.
    4. And we put in the following:
      1. NOTE: The reason for converting it to a string is because our StartNumber variable is defined as an Int32.
      2. You can then click on Evaluate Expression to make sure that it is working.
    5. Then click Ok.
  6. Next we will need to configure our variable for the Query_ExtractData, so that we can then pass our above variable within our SQL query, by doing the following:
    1. Click on Variables, then where we have the Query_ExtractData click on the ellipsis button on the right hand side under Expression.
    2. Now we will put in our SQL query from above, along with the variable, as shown below:
    3. NOTE: If you have a look at the above syntax you will see that we have put our variable DayNumber into our Expression.
      1. You can then click on Evaluate Expression to make sure that it is working.
    4. Then click Ok.
  7. Next create an Execute SQL Task and configure it with the following below, so that it will be used to populate our DayNumber variable.
    1. Go into the Properties and configure the General page with the following below:
    2. NOTE: A few quick things to note on the above:
      1. We have set the Result Set to Single Row.
        1. This is to allow our variable to be returned into a result set.
      2. Our SQLSourceType has been set to Variable.
        1. This is so that every time the For Loop Container runs and loops through, it will get the new value.
      3. SourceVariable
        1. This has been selected from the drop down and will be our query which we created earlier.
        2. On each execution of the For Loop Container it will go and get the values we require.
    3. Then click on the Result Set and put in the following:
    4. Then click Ok.
    5. Now drag this into your For Loop Container.
  8. Next drag in a Data Flow Task and rename it to Extract Data.
  9. Then double click and go into your Data Flow Task.
  10. Now drag in an ADO NET Source.
    1. Then ensure that you have not selected the ADO NET Source, but have clicked on the design surface within SSIS, and go into the properties.
    2. Next to Expressions, click on the ellipsis button.
    3. Under Property, click on the drop down and select the following as shown below:
    4. Then click on the ellipsis next to Expression.
    5. Now in the Expression window put in the following variable as shown below:
      1. NOTE: From the above you will see that this is our variable query that we created and configured in step 6 above.
    6. Click on Evaluate Expression and you will then see the actual query below in the Evaluated value:
      1. NOTE: In the above it has got the number 12 from the default value in our variable for DayNumber.
    7. Then click Ok.
    8. Now go back into your ADO NET Source.
      1. Ensure that you have got the correct connection to your data source.
      2. Then under Data access mode, change this to SQL Command.
      3. Then you should see your query populated below, as with our example:
      4. NOTE: It got this query information from our Expression, which in turn got the information from our variable.
    9. You can then click on Columns to see that the query runs and gets the required information.
    10. Then click Ok.
  11. Then the final step is to link it to your destination table.
    1. NOTE: You can use either an OLE DB Destination or an ADO NET Destination.
  12. Next make sure that you drag your Data Flow Task into your For Loop Container.
    1. Then ensure that you place the Success precedence constraint between your Execute SQL Task and your Data Flow Task as shown below.
  13. Now you can run your SSIS package and it should work successfully.


BI-NSIGHT – Power BI (SweetIQ Content Pack, Summer Release) – SSIS Feature Pack for Azure – Excel 2016 New Chart Options (Treemap, Sunburst, Histogram, Box & Whisker, Waterfall)

Well, this week there were some updates, and fortunately not too many, so here we go!

Power BI – SweetIQ

This week there was yet another release of a content pack for Power BI, this time relating to a company called SweetIQ.

They position themselves as a provider that can leverage your local search and increase your visibility on the web. Looking at the blog post, I have to say that this is once again a great way for customers to get quick and easy access to their data, and it enables the customers to interact and see how it is performing.

You can read about the post from Microsoft here: Analyze and Monitor your SweetIQ Data with Power BI

Summer Release of Power BI

I am not going to go into too much detail around this, because all the information that I could find was on the blog post from Microsoft. But it does bode well for the next release of the Power BI Designer.

And for me, I think this is where I would start looking at the Power BI Designer over, say, using Excel as a source or using the web-based version to author reports.

As you can see from the above screenshot, it appears that there is going to be a whole host of things that you can customize within the Power BI Designer. This will take their existing reports, which already look really good, from good to standout. Especially since, from what I can see above, I have not seen this in other products of a similar nature.

You can read the blog post from Microsoft here: We’re on for a great Designer Summer…

SSIS Feature Pack for Azure

As you can see from the above screenshot, these are the items that are available in the SSIS Feature pack for Azure.

I personally think that this is great, as in the past it has been a bit tricky to get data from on-premise into Azure. Personally I know of a few people who have had some struggles, as well as a lot of people who do know and understand SSIS really well, so this would fit in with their existing skills.

I also did see that Varigence have created a webinar for this here: Moving ETL Loads to Azure Part 1 Webinar Content

I have personally not tested it yet, but I am sure that it will be a lot easier to get data into Azure!

You can read up all about it on MSDN here: Azure Feature Pack

As well as downloading the SSIS Feature Pack here: Microsoft SQL Server 2014 Integration Services Feature Pack for Azure | Microsoft SQL Server 2012 Integration Services Feature Pack for Azure

Excel 2016 – New Charting Options (Treemap, Sunburst, Histogram, Box & Whisker, Waterfall)

Every week or so, I check to see if there are any Office 2016 updates, and this week there was!

So one of the things that I have been looking for since watching the Ignite videos is the new chart types. And finally this week, we have the new chart types in Excel 2016!

One of the things that I can currently note, and I do hope that this will not be the case once the final product is released, is that it says that it cannot be used with a Pivot Table. I might not totally understand the dynamics behind how the chart is created, but how awesome would it be if we could use it with our data from Power Pivot, or our SSAS cubes!

I have also noticed that they have changed the multi select button on your slicer to look a little more attractive as shown below.

There might be a few other changes, but currently I have not seen them or found them out!

That is it for this week!

BI-NSIGHT – SQL Server 2016 – Power BI Updates – Microsoft Azure Stack

Well I have to admit that it seems that the Microsoft machine has been working flat out to get out new products and updates.

If my memory serves me, this is the third week in a row that Microsoft has released new products and updates. I am really enjoying it! But hopefully this will slow down, so that we can catch our breath and actually play with some of the new products and features.

So let’s get into it. There is quite a lot to go over!!

SQL Server 2016

So with Microsoft Ignite happening this week, the wonderful guys from Microsoft have started to announce what we can expect to be in the next version of SQL Server.

I am going to focus mainly on the BI (Business Intelligence) features, and there are quite a few! Some of what I am detailing below I have only seen pictures of on Twitter, or I have read about it. And with the preview not even released yet, I am sure that there will be some changes down the line.

SQL Server Reporting Services

It finally appears that Microsoft has been listening to our cries and requests for updates in SSRS (SQL Server Reporting Services)!

Built-in R Analytics

I think that this is really amazing. Even though I am not a data scientist, I think it is a smart move by Microsoft to include this. What this means in my mind is that you can now get the data scientists to interact and test their R scripts against the data as it sits in the transactional environment. And from there, this could then be used to create amazing possibilities within SSRS.

Power Query included in SSRS

From what I have seen people tweeting about, as well as what I have read, it would appear that Power Query will be included in SSRS. This is fantastic and will mean that virtually any data source can be consumed into SSRS.

New Parameter Panel, chart types and design

I am sure we can all agree that SSRS has needed an overhaul for some time, and it seems that finally we are going to get this. It would appear that the parameter panel is going to be updated; I do hope that it will be more interactive and react in a similar way to the slicers in Excel. We are also getting new chart types; here again I am going to assume that they will be similar to what we have in Power BI!

And finally there was also mention that reports will be rendered quicker, as well as having a better design. I am also hoping that they will deploy this using HTML5, so that it can then be viewed natively on any device. It is going to be interesting to see what Microsoft will incorporate from their acquisition of Datazen.

SQL Server Engine

Built-in PolyBase

Once again, this is an amazing feature which I think has boundless potential. By putting this into the SQL Server Engine this means that it is a whole lot simpler to query unstructured data. Using TSQL to query this data means that for a lot of people who have invested time and effort into SQL Server, now can leverage this using PolyBase. And along with this, you do not have to extract the data from Hadoop or another format, into a table to query it. You can query it directly and then insert the rows into a table. Which means development time is that much quicker.

Real-time Operational Analytics & In-Memory OLTP

Once again the guys at Microsoft have been able to leverage off their existing findings with regards to In Memory OLTP and the column store index. And they have mentioned in their testing that this amounts to 30x improvement with In-memory OLTP as well as up to 100x for In-Memory Column store. This is really amazing and makes everything run that much quicker.

On a side note, I did read this article today with regards to SAP: 85% of SAP licensees uncommitted to new cloud-based S/4HANA

I do find this very interesting if you read the article. What it mentions is that, firstly, 85% of current SAP customers will not likely deploy to the new S/4HANA cloud platform, which in itself does not bode well for SAP.

But what I found very interesting is that to make the change would require companies to almost start again for this implementation. In any business where time is money, this is a significant investment.

Compare this to what Microsoft has done with the in-memory tables and column store indexes, which can be used interchangeably with only some additional work required; on the whole it is quick and easy to make the changes. Then you couple this with what Microsoft has been doing with Microsoft Azure, and it makes it easy to make the smart choice!

SSAS (SQL Server Analysis Services)

I am happy to say that at least SSAS is getting some attention too! There were not a lot of details, but what I did read is that SSAS will be getting an upgrade in performance, usability and scalability.

I am also hoping that there will be some additional functionality in both SSAS OLAP and Tabular.

SSIS (SQL Server Integration Services)

Within SSIS, there are also some new features, namely they are also going to be integrating Power Query into SSIS. This is once again wonderful news, as it means now that SSIS can also get data from virtually any source!

Power BI

Once again the guys within the Power BI team have been really busy and below is what I have seen and read about in terms of what has been happening within Power BI

Office 365 Content Pack

I would say that there are a lot of businesses that are using Office 365 in some form or other. So it makes perfect sense for Microsoft to release a content pack for Office 365 Administration

As you can see from the screenshot below, it gives a quick overview on the dashboard to see what activity is happening, as well as details of other services. I am sure that this will make a quick overview of your Office 365 systems really easy to see, and if there are any potential issues these could also be highlighted!

Visual Studio Online Content Pack

Another content pack that is about to be released is for people who use Visual Studio Online, I personally do not currently use this. But it does look great for people to once again have a great overview of what is going on.

You can read more about it here: Gain understanding and insights into projects in Visual Studio Online with Power BI

And as you can see below, what you can view once you have got it setup within Power BI.

Power BI planned updates

Below are the updates that I had previously voted for in Power BI. It is great to see that the Microsoft team is actively listening to their customers and implementing some of the ideas. I have to say that I do not think that there are many other software companies that are doing this currently, and also being able to roll it out as quickly as Microsoft is.

Set Colors and Conditional Formatting in visuals

  • This is great as it will allow the report authors to have more control in terms of how their reports look.

Undo / Redo button in browser & designer

  • This might seem like a small update, but I know personally from working with the Power BI reports, that sometimes you just want to see what a different report looks like. Or adding another element. And with the Undo / Redo buttons, it just saves that little bit of time, as well as to make the report authoring experience that much more enjoyable.

Power BI Announcements from Microsoft Ignite

Below are some announcements that I have read up about either on Twitter or one of the blogs that I follow. It is really great to see so many things in the pipeline.

This means that there is a lot to look forward to, as well as ensuring that we have new and wonderful things to show.

I got this picture via Twitter, which someone must have taken at the Microsoft Ignite Conference. As you can see it is not very clear, but it does show the next update in the Power BI Designer.

You can also see the undo and redo buttons.

It also appears that in the Power BI service, there will be support for SSRS files, namely the .rdl files! Here is another picture taken from Microsoft Ignite.

 

Then there are the many-to-many relationships and bi-directional cross filtering that will be supported in SQL Server 2016 tabular models, which I am sure will also be included in the new Power BI backend, as this is where it stores all the data.

 

For Hybrid BI, there will be support for live querying for SSAS, currently this is already in place for SSAS Tabular.

 

It also looks like there will be scheduled refresh for SQL Server databases as a source, which is great for people who either do not have SSAS cubes, or want to get some of their data into Power BI.

Microsoft Azure Stack

While this is not exactly BI, it is related to BI, in that with Azure Stack you can get the Azure functionality on your own hardware, which is fantastic. And I am sure that for a lot of businesses this will be welcomed.

You can read more about it here: Microsoft Brings the Next Generation of Hybrid Cloud – Azure to Your Datacenter

 

SSIS – Configuring SSIS Package to run job after data loading has completed

  • In our example below, we are going to be using a SharePoint 2013 Data Alert job which was created in SQL Server Agent, and insert this into an existing SSIS project.
  • The reason for doing this is so that every time our SSIS project runs, as part of the process it can then fire off a SQL Server job.
  • Essentially, in the past the SQL Server Agent jobs ran individually and were not tied to when our data had loaded.
  • By doing it in this manner, it enables the related job to be run after every data load, which then enables the business to get the relevant data at the relevant times.

     

NOTE: You can modify this for any SQL Server Job.

Find the related SQL Server Agent Job Name

Below is how to find the related SQL Server Agent job name. Typically, when you create a SharePoint Data Alert, or an SQL Server Reporting Services (SSRS) data-driven subscription or email subscription, the job is created with a GUID as its name.

This makes it rather difficult to find the correlating Job Name.

  1. The easiest way to find the SQL Server Agent job name, once you have created your SharePoint Data Alert or SSRS (data-driven or email) subscription, is to go directly into the SQL Server Agent where the job was created.
  2. Then open the Job Activity Monitor.
  3. Then scroll through the list until you find a job with the following properties:
    1. The name will be in a GUID format:
    2. The Last Run Outcome is set to Unknown.
      1. And the Last Run is set to Never.
    3. NOTE: If there is more than one job with the above properties, then look to see when you scheduled the start time of your job.
      1. If that still does not help, you might have to go into the job properties and into the schedules to see how often it runs, to find the correct job.
  4. If this is in a test environment, you can try running the job to ensure that you have the correct job.
  5. Make a note of the name of the job.
    1. In our example it was: 68CAE336-3D38-42A6-9526-F32F4D501AA4

 

Modifying the SQL Server Agent Job

Below are the steps to modify the SQL Server Agent job, so that it will not run on a schedule.

NOTE: The reason we are doing this is because we want the job to be run from within our SSIS Project after it has loaded our relevant data.

  1. Go into the properties of your SQL Server Agent job and click on the Schedules page.
  2. Then in the schedule list, click on the schedule and select Remove.
  3. Once completed there will no longer be any schedules for the job.
  4. Then click Ok.

 

Creating and configuring your SSIS Package to run the SQL Server Agent Job for you

Below are the steps to create and configure your SSIS package to run the SQL Server Agent Job as part of the SSIS Project.

  1. Create your new SSIS package.
  2. Drag in an Execute SQL Task.
  3. We renamed our Execute SQL Task to the following: Run SharePoint Data Alerts
  4. Then we went into the properties.
    1. We then put in our connection to our SQL Server database, which is the same as where the SQL Server Agent job is.
    2. Then under the SQL Statement we put in the following:

      EXEC msdb.dbo.sp_start_job N'68CAE336-3D38-42A6-9526-F32F4D501AA4';

      1. NOTE: The thing to note here is that we are using the msdb stored procedure sp_start_job to start our job.
        1. And that we are using the job name which we noted in the previous steps.
  5. Now that you have completed your Execute SQL Task, it is time to test it.
  6. Right click on the task and select Execute Task.
  7. If successful it should come back looking like the following below:

  8. As well as actually getting the email in your Inbox, as is the case with our SharePoint Data Alert. (A sketch for verifying that the job ran follows below.)
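
If you want to double-check that the job actually ran after starting it, one option (a sketch using the standard msdb procedure; the GUID is the job name from our example) is:

    -- Check the job's current status and last run outcome.
    EXEC msdb.dbo.sp_help_job
        @job_name   = N'68CAE336-3D38-42A6-9526-F32F4D501AA4',
        @job_aspect = N'JOB';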

SSIS – Error Handling and Error emailing for Packages

The steps below set up error handling for your SSIS packages, and then, within the error handling, enable the exact error to be emailed to a set of email addresses, so that people are notified when there is an error and what the error was.

 

Example:

  • In our example we are going to be using a package named: SSAS-DroppingPartitions
  • This is what will be referenced below.

 

Enabling Error Handling in an SSIS Package

  1. Open your SSIS package.
  2. Then click on Event Handlers at the top.
  3. Then click on:
    1. Click here to create an 'OnError' event handler for executable 'Package Name'
    2. E.g. Click here to create an 'OnError' event handler for executable 'SSAS-DroppingPartitions'
  4. Now you will see that your package has the event handler at the top.
  5. Now your package is ready for specific event handlers to be added.
  6. Below is what it will look like with the event handler enabled.

Enabling your package error to be emailed

Adding an SMTP – Connection Manager for the Send Mail Task

  1. The first thing that you need to do is to set up a new connection which will enable you to send the actual emails, by doing the following below.
  2. Right click in the Connection Managers window and select New Connection.
  3. Then click on SMTP Connection manager for the Send Mail task.
  4. Then click Add.
  5. This will then open the SMTP Connection Manager Editor window, and you will need to fill in the following:
    1. Name: SMTP-Mail.Mailserver.com
      1. NOTE: This is the name of the mail server which will send the email.
    2. Description: SMTP-Mail.Mailserver.com
    3. SMTP Server: Mail.Mailserver.com
      1. NOTE: This is the actual DNS address for your SMTP server.
    4. Tick Use Windows authentication.
      1. NOTE: This is because in our current setup the mail server requires Windows Authentication in order to send out emails.
    5. Then click Ok.
  6. Now you will see your SMTP connection in the Connection Managers window.

 

Adding the Send Mail Task and configuring the Send Mail Task

  1. What we are going to do is to configure the Send Mail Task to send email to the people who require it, and then configure what gets sent out as part of the email.
  2. Click in the Toolbox and drag the Send Mail Task into your Event Handler window.
  3. Right click on the Send Mail Task and click on Edit.
  4. Under General put in the following for the Name and Description: Send email if Task Fails
  5. Then click on Mail on the left hand side.
    1. Where it says SmtpConnection, click on the drop down and select the SMTP connection that you created in the section called Adding an SMTP – Connection Manager for the Send Mail Task above.
      1. E.g. SMTP-Mail.Mailserver.com
    2. Where it says From:
      1. This must be a valid FROM address that the mail server receiving the email to relay will accept.
      2. E.g. user@domain.com
      3. NOTE: In some mail server setups you will have to ensure that the from email address is allowed to relay via your mail server.
    3. Where it says To:
      1. This is to whom you want to send the emails.
        1. E.g. usertoemail@domain.com
    4. Where it says Subject, I put in the following: SSIS Error: Package Name
      1. E.g. SSIS Error: SSAS-DroppingPartitions
    5. Where it says MessageSourceType, this must be left as the default, which is Direct Input.
    6. We will configure the Message Source in the next step.
    7. So once complete for now it will look like the following:

  6. Next is where we are going to dynamically input our error message and details per SSIS package, using system variables and expressions.
  7. To configure the Message Source, click on Expressions on the left hand side.
    1. Click on the plus sign next to Expressions.
    2. Then click on the ellipsis button; this will open the Property Expressions Editor.
      1. Click under Property and click on the drop down button.
      2. Select the following: Message Source
      3. Then once again click on the ellipsis button.
    3. This will open the Expression Builder.
    4. Now this is where you will add the following into your Expression Builder, with an explanation below:

      "Package:                              " + (DT_WSTR, 50) @[System::PackageName] + ".
      Time:                                      " + (DT_WSTR, 50) @[System::StartTime] + ".
      Task:                                       " + (DT_WSTR, 50) @[System::SourceName] + ".
      Error Description:                " + (DT_STR, 2000, 1252) @[System::ErrorDescription]

      1. NOTE: If you want to format your text, use Notepad and then copy and paste it from Notepad into your Expression window.

    5. What the above does is the following:
      1. It starts with the package name, using the PackageName variable.
      2. Then the time of the error, using the StartTime variable.
      3. Next is the task name, using the SourceName variable.
      4. Finally the error description, using the ErrorDescription variable.
        1. NOTE: Because the error is text, you have to change the data type from the default, which is DT_WSTR, to DT_STR.
          1. With this you then need to add the length, and also the code page.
            1. In our example the code page which works is 1252.
    6. Then click on Evaluate Expression.
      1. This should then come back with the expression in the Expression Value window above the Evaluate Expression button.
      2. Click Ok.

g.        Now to add an expression for our subject click on Property drop down in the Property Expressions Editor.

                                                               i.      Select Subject

1.       Then once again click on the Ellipses button.

                                                              ii.      This will open the Expression Builder

                                                            iii.      Now this is where you will add the following into your Expression builder, an explanation will be below.

“SSIS Error: ” +  (DT_STR, 50,1252)  @[System::PackageName]

1.       What this does is just takes the Package Name and puts it into the subject line in our email.

h.       Then click on Evaluate Expression

                                                               i.      This should then come back with the expression in the Expression Value window above the Evaluate Expression button.

                                                              ii.      As shown below is what it looks like

                                                            iii.      clip_image008[4]

i.         Then click Ok 3 times to get out of the Send Mail Task Editor.

8. Once complete it will look like the following:

a. [Screenshot: the completed Send Mail Task configuration]

Testing your Send Mail Task

1. The final step is to test to make sure that the Send Mail Task works.

a. NOTE: I would only do this for the first Send Mail Task, because if this works it should work for all other packages as well.

2. Go into your Control Flow and deliberately change a task so that it fails.

a. NOTE: In our example I changed an Execute SQL Task, changing the Stored Procedure name so that I knew it would fail because the Stored Procedure does not exist, as sketched below.
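b. A minimal sketch of this idea: point the Execute SQL Task at a statement like the following, where the procedure name is hypothetical and chosen purely so that the statement cannot succeed.

EXEC [dbo].[usp_DoesNotExist]; -- hypothetical procedure name; it does not exist, so the task fails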

3. Once you have saved your change, run the package in Debug mode.

4. When the package runs it will FAIL, which is what we want.

a. Now if you go and look in the Event Handlers window you should see your Send email if Task Fails task as green.

b. [Screenshot: the Event Handler showing the Send Mail Task completed successfully]

5.       You should also receive the email with the error as you configured above.

SSIS – Using Date and Time Cache Lookups for faster inserts into Fact Tables

When doing inserts on larger data sets, I found that it would take a long time to complete the date and time joins. If I excluded these joins, the data was returned a lot quicker.

 

So my solution to the problem was to use the Cache Transform and then the Lookup Transform in the SSIS Data Flow Task. This enabled the data to be returned a lot quicker. For context, a sketch of the kind of join this replaces is shown below.
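The per-row joins being replaced would look something like the following sketch (dbo.StagingInternetSales is an assumed staging table name; DimDate is a real AdventureWorksDW2012 table, and dbo.DimTime is the hypothetical time table described later). The functions wrapped around the join columns are what make this expensive on large data sets:

SELECT s.*
      ,d.[DateKey]
      ,t.[TimeKey]
FROM dbo.StagingInternetSales AS s
JOIN [dbo].[DimDate] AS d
    ON CONVERT(date, s.[OrderDate]) = CONVERT(date, d.[FullDateAlternateKey])
JOIN [dbo].[DimTime] AS t
    ON CAST(CONVERT(varchar(5), CONVERT(time(0), s.[OrderDate])) + ':00' AS time(0)) = t.[Time];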

 

Example:

·         We are first going to create our Cache Transform for our Date and Time

·         Then we are going to insert data into our Fact table from the following Adventure Works table below using the Lookup Transformation

o    dbo.FactInternetSales

·         We will be using SSDT and SQL Server 2014

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Creating the Cache Transform for Date and Time

NOTE: We inserted a time table into the AdventureWorksDW2012 database, which has a row for every minute of the day and therefore returns 1440 rows.
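Such a time table is not part of AdventureWorksDW2012, so you will need to create one. Below is a minimal T-SQL sketch for generating it; the table name dbo.DimTime and the columns TimeKey and Time are assumptions for illustration, not names from the sample database:

;WITH Minutes AS
(
    -- 1440 sequential numbers, one per minute of the day (0 to 1439)
    SELECT TOP (1440)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS MinuteOfDay
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
SELECT MinuteOfDay + 1 AS TimeKey
      ,CAST(DATEADD(MINUTE, MinuteOfDay, '00:00') AS time(0)) AS [Time]
INTO dbo.DimTime
FROM Minutes;
-- 24 hours x 60 minutes = 1440 rows, from 00:00:00 through 23:59:00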

 

NOTE II: You will only have to create the Date and Time Cache Transforms once, unless your Date table changes every time you load your data. This is because in the later steps you will see how you simply select the existing cache file.

 

1. We will first start with setting up the Date Cache Transform.

a. Go into SSIS and create a new SSIS Package.

b. Then drag in a Data Flow Task, which we renamed to Date Cache Transform.

c. Then double click to go into the Date Cache Transform Data Flow Task.

d. Now in the SSIS Toolbox, drag in the ADO NET Source.

i. NOTE: The reason we use the ADO NET Source is because it is better at getting the correct data types for our Cache Transform.

e. We then configured it as shown below:

i. [Screenshot: the ADO NET Source configuration for the Date cache]

ii. NOTE: The reason that we converted our FullDateAlternateKey to Date is because when we do our lookup later, we want it to have JUST the Date data type.
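iii. The source query was along these lines; a sketch only, since the exact query is in the screenshot above (DateKey and FullDateAlternateKey are the actual DimDate columns):

SELECT [DateKey]
      ,CONVERT(date, [FullDateAlternateKey]) AS FullDate
FROM [AdventureWorksDW2012].[dbo].[DimDate];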

f. If we went into the Advanced Editor for the ADO NET Source, we can verify that the FullDate has the data type of "database date":

i. [Screenshot: the Advanced Editor showing FullDate as database date]

g. Now in the SSIS Toolbox, click under Other Transformations and drag in the Cache Transform.

i. [Screenshot: the Cache Transform in the SSIS Toolbox]

h. Then drag the blue constraint to your Cache Transform.

i. Then double click your Cache Transform to go into the Properties.

j. Now when it opens you will see that there is no Connection Manager. Click on New.

i. [Screenshot: the Cache Transformation Editor with no connection manager selected]

k. This will then open the Cache Connection Manager Editor.

l. We then configured it as shown below:

i. [Screenshot: the Cache Connection Manager Editor for the Date cache]

ii. As you can see above, we put in the name.

iii. Then we also selected Use file cache.

1. Then we configured the File name with a UNC path.

2. NOTE: The reason that we did this is so that if you execute this package from another location it will still find the cache file.

3. NOTE II: You must ensure that the account running the SSIS job has access to the folder location and file on the server.

m. Then click on Columns. This is where you will configure which column will be used for the actual lookup.

i. In our example we want to do the lookup on the FullDate, so next to FullDate we put in the Index Position of 1.

ii. NOTE: You can have more than one lookup column, and each additional Index Position will increment in number.

iii. [Screenshot: the Columns page with FullDate given Index Position 1]

n. Then click Ok to go back to the Cache Transformation Editor.

o. Now click on Mappings on the left hand side.

p. You should now see the following:

q. [Screenshot: the Cache Transformation Editor Mappings page]

r. NOTE: From above you can see that because we created the Index Position of 1 for the FullDate, it has the magnifying glass, which indicates that it can be used in the Lookup later.

s. Click Ok.

t. You can now run this Data Flow Task to populate your Date Cache Transform.

u. [Screenshot: the Date Cache Transform data flow after a successful run]

2. Next we will set up the Time Cache Transform.

a. Go into SSIS and create a new SSIS Package.

b. Then drag in a Data Flow Task, which we renamed to Time Cache Transform.

c. Then double click to go into the Time Cache Transform Data Flow Task.

d. Now in the SSIS Toolbox, drag in the ADO NET Source.

i. NOTE: The reason we use the ADO NET Source is because it is better at getting the correct data types for our Cache Transform.

e. We then configured it as shown below:

i. [Screenshot: the ADO NET Source configuration for the Time cache]
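ii. The source query was along these lines; a sketch only, assuming the hypothetical dbo.DimTime table from the earlier note:

SELECT [TimeKey]
      ,[Time]
FROM [AdventureWorksDW2012].[dbo].[DimTime];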

f. If we went into the Advanced Editor for the ADO NET Source, we can verify that the Time has the data type of "database time with precision":

i. [Screenshot: the Advanced Editor showing Time as database time with precision]

g. Now in the SSIS Toolbox, click under Other Transformations and drag in the Cache Transform.

i. [Screenshot: the Cache Transform in the SSIS Toolbox]

h. Then drag the blue constraint to your Cache Transform.

i. Then double click your Cache Transform to go into the Properties.

j. Now when it opens you will see that there is no Connection Manager. Click on New.

i. [Screenshot: the Cache Transformation Editor with no connection manager selected]

k. This will then open the Cache Connection Manager Editor.

l. We then configured it as shown below:

i. [Screenshot: the Cache Connection Manager Editor for the Time cache]

ii. As you can see above, we put in the name.

iii. Then we also selected Use file cache.

1. Then we configured the File name with a UNC path.

2. NOTE: The reason that we did this is so that if you execute this package from another location it will still find the cache file.

3. NOTE II: You must ensure that the account running the SSIS job has access to the folder location and file on the server.

m. Then click on Columns. This is where you will configure which column will be used for the actual lookup.

i. In our example we want to do the lookup on the Time, so next to Time we put in the Index Position of 1.

ii. NOTE: You can have more than one lookup column, and each additional Index Position will increment in number.

iii. [Screenshot: the Columns page with Time given Index Position 1]

n. Then click Ok to go back to the Cache Transformation Editor.

o. Now click on Mappings on the left hand side.

p. You should now see the following:

q. [Screenshot: the Cache Transformation Editor Mappings page]

r. NOTE: From above you can see that because we created the Index Position of 1 for the Time, it has the magnifying glass, which indicates that it can be used in the Lookup later.

s. Click Ok.

t. You can now run this Data Flow Task to populate your Time Cache Transform.

u. [Screenshot: the Time Cache Transform data flow after a successful run]

3. Now, if you are using SSIS 2012 or 2014, we will convert the Date and Time Cache Connections to Project Connections.

a. This is so that later we can select them from our other SSIS Packages.

b. Right click on each Cache Connection Manager and select Convert to Project Connection.

i. [Screenshot: the Convert to Project Connection context menu option]

c. They will now look like the following in the Connection Managers:

d. [Screenshot: the Date and Time cache connections shown as project connections]

 

Creating your TSQL Query so that it can be used for the Lookup Transformation

In the steps below we need to ensure that we create the correct data types in our TSQL Query so that when we do the lookup it will map correctly.

 

NOTE: If the data types within SSIS are not an exact match between your TSQL Query and the Lookup Transformation it will not allow you to create the mapping.

 

1.       Below is the TSQL Example from our dbo.FactInternetSales table.

SELECT TOP 1000 [ProductKey]
      ,[OrderDateKey]
      ,[DueDateKey]
      ,[ShipDateKey]
      ,[CustomerKey]
      ,[PromotionKey]
      ,[CurrencyKey]
      ,[SalesTerritoryKey]
      ,[SalesOrderNumber]
      ,[SalesOrderLineNumber]
      ,[RevisionNumber]
      ,[OrderQuantity]
      ,[UnitPrice]
      ,[ExtendedAmount]
      ,[UnitPriceDiscountPct]
      ,[DiscountAmount]
      ,[ProductStandardCost]
      ,[TotalProductCost]
      ,[SalesAmount]
      ,[TaxAmt]
      ,[Freight]
      ,[CarrierTrackingNumber]
      ,[CustomerPONumber]
      ,[OrderDate]
      ,[DueDate]
      ,[ShipDate]
      ,[DistinctCustomers]
      ,CONVERT(date, [OrderDate]) AS DateForCache
      ,CAST(CONVERT(varchar(5), CONVERT(time(0), [OrderDate])) + ':00' AS time(0)) AS TimeForCache
FROM [AdventureWorksDW2012].[dbo].[FactInternetSales] WITH (NOLOCK)

2. As you can see above, we have converted our OrderDate to the Date data type.

3. We have also converted our OrderDate to show only the Time, using the data type of Time(0).

a. NOTE: If this is not done in this way, when the time goes past 12 PM it would display as 01 PM instead of 13:00.
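b. To see why the cast chain above produces the 24-hour format, here it is traced step by step with a sample value (the timestamp is purely illustrative):

SELECT CAST(CONVERT(varchar(5), CONVERT(time(0), '2010-12-15 13:45:12')) + ':00' AS time(0));
-- CONVERT(time(0), ...)    -> 13:45:12 (the 24-hour clock is preserved)
-- CONVERT(varchar(5), ...) -> '13:45'
-- + ':00' and CAST         -> 13:45:00, which matches the time(0) column in the Time cache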

4.       Now we can use the above query in our data flow task to Insert into our fact table.

 

Using the Lookup Transformation for inserting data into our Fact table

Below we will now use our TSQL Query, as well as our Lookups to our Date and Time Cache Transforms.

 

1. Go into SSIS, and go to your SSIS Package where you want to insert your data into the Fact table.

2. Then drag in your Data Flow Task.

a. If needed, rename your Data Flow Task.

3. Now drag in an ADO NET Source.

a. NOTE: The reason for the ADO NET Source is so that it will get the correct data types from our query.

b. I have found in the past that the OLE DB Source at times does not find the correct data types for the Lookup Transform.

4. We then put in the query from step 1 in the section above.

a. NOTE: We went into the Advanced Editor to ensure that the data types for the DateForCache and TimeForCache columns are correct.

b. [Screenshot: the Advanced Editor showing the DateForCache data type]

c. [Screenshot: the Advanced Editor showing the TimeForCache data type]

5. Now we need to create our first Lookup, for our Date Cache Transform.

a. In the SSIS Toolbox, drag in the Lookup from under the Common section.

i. [Screenshot: the Lookup transformation in the SSIS Toolbox]

b. We then renamed the Lookup to Date Lookup.

c. Then drag the blue constraint to our Date Lookup.

d. Double click on the Date Lookup to go into the Properties.

e. On the General section we configured it with the following:

i. [Screenshot: the Lookup Transformation Editor General page, set to use the Cache connection manager]

ii. NOTE: The reason we chose the Cache Connection Manager is so that we can use the cache file we created earlier.

f. Then click on Connection.

g. If you configured your Date and Time Cache Connections as Project Connections, you will see them in the Cache Connection Manager.

i. Click on the drop down and select Date Cache Connection.

ii. [Screenshot: the Connection page with Date Cache Connection selected]

h. Then click on Columns, and we configured it with the following:

i. [Screenshot: the Columns page mapping DateForCache to FullDate]

j. As you can see above, we used our input column DateForCache and mapped it to our lookup column FullDate, which we configured earlier.

k. We then selected the DateKey as our Output Alias.

l. Then click Ok.

6. Next we configured our Lookup for our Time Cache Transform.

a. In the SSIS Toolbox, drag in the Lookup from under the Common section.

i. [Screenshot: the Lookup transformation in the SSIS Toolbox]

b. We then renamed the Lookup to Time Lookup.

c. Then drag the blue constraint to our Time Lookup.

d. When you do this it will pop up with the Input Output Selection window.

i. For the Output, select the Lookup Match Output.

ii. [Screenshot: the Input Output Selection window]

iii. Click Ok.

e. Double click on the Time Lookup to go into the Properties.

f. On the General section we configured it with the following:

i. [Screenshot: the Lookup Transformation Editor General page, set to use the Cache connection manager]

ii. NOTE: The reason we chose the Cache Connection Manager is so that we can use the cache file we created earlier.

g. Then click on Connection.

h. If you configured your Date and Time Cache Connections as Project Connections, you will see them in the Cache Connection Manager.

i. Click on the drop down and select Time Cache Connection.

ii. [Screenshot: the Connection page with Time Cache Connection selected]

i. Then click on Columns, and we configured it with the following:

j. [Screenshot: the Columns page mapping TimeForCache to Time]

k. As you can see above, we used our input column TimeForCache and mapped it to our lookup column Time, which we configured earlier.

l. We then selected the TimeKey as our Output Alias.

m. Then click Ok.

7. Now finally drag in your ADO NET Destination.

a. Drag the blue constraint to your ADO NET Destination.

b. When you do this it will pop up with the Input Output Selection window.

i. For the Output, select the Lookup Match Output.

ii. [Screenshot: the Input Output Selection window]

iii. Click Ok.

c. Then go into your ADO NET Destination Properties and select your destination Fact table.

d. Then ensure that your mappings are correct.

e. If you scroll to the bottom you should see the mappings from your Lookups above to your Fact table:

f. [Screenshot: the ADO NET Destination mappings, including the DateKey and TimeKey lookup columns]

8.       Then finally run your SSIS Package and you should see the following: