SSIS – Creating a process which will check a condition, and if it is not ready, wait and try again.

We had a requirement to create a waiting loop: if the condition was not ready it would wait, try again, and if still not ready wait and try again, until it got to the maximum number of tries. We also did not want it to error whilst waiting.

Example:

  • Check to see if the database is online.
  • Wait for 1 minute to retry.
  • Try 10 times before failing.
  1. The first thing that we need to do is to create some Variables which will help with the process.
  2. This is what they are for:
    1. Max_Count – This is how many times you want to retry
    2. State – This is our check to see if the databases are online or not.
    3. SuccessfulRun – This is used to see if it ran successfully
    4. Value_Counter – This is used as the counter.
  3. Then below are the steps.
    1. The first thing is to assign a value to our Max_Count Variable.
    2. This is done using an Execute SQL Task, configured with the following below:
      1. As you can see above, in the SQLStatement we have set our Max_Count to 10.
      2. Then in the ResultSet we have mapped our result to our Variable.
    3. Next is where we configured our For Loop Container with the details shown below:
      1. As you can see above, we have assigned it our Value_Counter Variable.
      2. You will note that in the EvalExpression we have our Max_Count variable, as well as a check on the SuccessfulRun variable. How this value is derived will be explained later.
        1. NOTE: This is also required so that the loop will exit when it is actually successful.
    4. Next in our process, within the For Loop, is where we start with our waiting period, using an Execute SQL Task as shown below:
      1. All that we are doing here is using a T-SQL statement to wait for 1 minute.
      2. NOTE: The reason we do this at the start is so that if it has to loop, it will wait on the second loop before checking again.
    5. Next is where we have our condition (which, based on our example, is to check if the database is online) as shown below:
      1. What our SQLStatement is doing above is checking the State of our database.
        1. If it is online it will have a state equal to zero.
      2. Then in the ResultSet we have mapped our result to our Variable.
    6. Next is where we have our Script Task, which we use to populate the SuccessfulRun variable as shown below:
      1. We went in to edit the script with the following below:
      2. As you can see above, we also assigned the variable SuccessfulRun to the ReadWriteVariables.
      3. NOTE: We used Microsoft Visual C# 2012 as the script language.
      4. From the above, all that we added was to set the variable SuccessfulRun = 1.
    7. And then we set our database back into Multi User Mode after it was successfully back online, as shown below:
    8. The final thing that we had to do, which actually ensures that it loops through without erroring, is to put in a precedence constraint between the Check if Database is Online and Check to see if Successful tasks, as shown below:
      1. We then configured the precedence constraint with the following, as shown below:
  4. So once this is all done it looks like the following:
  5. You can now run this, potentially changing the condition (Check if Database is Online) that you want to check for, as well as how long you want to wait before checking again.
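Since the configuration screenshots are not reproduced here, the statements behind the steps above might look like the following. This is a sketch: the variable and task names are taken from the steps, but the exact queries and the database name are assumptions.

```sql
-- Execute SQL Task (step 1): assign Max_Count, mapped to the Max_Count
-- variable via the ResultSet (Single row).
SELECT 10 AS Max_Count;

-- Execute SQL Task inside the For Loop (step 4): wait for 1 minute
-- before checking.
WAITFOR DELAY '00:01:00';

-- Execute SQL Task (step 5): check the State of the database, mapped to
-- the State variable. A state of 0 means the database is online.
SELECT state FROM sys.databases WHERE name = 'YourDatabase';

-- Execute SQL Task (step 7): put the database back into Multi User Mode
-- once it is back online.
ALTER DATABASE [YourDatabase] SET MULTI_USER;
```

The For Loop's EvalExpression would then be along the lines of `@[User::Value_Counter] < @[User::Max_Count] && @[User::SuccessfulRun] == 0`, and the Script Task body would simply do `Dts.Variables["User::SuccessfulRun"].Value = 1;` so that the loop exits on the next evaluation.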

Automating the running of a PowerShell script which requires Administrator elevation via SSIS

Below we are going to explain how to do this via an example.

Example – We are required to run a PowerShell script which will run with elevated privileges.

In order to complete this, we have to do the following, using a simple PowerShell script which will get all the file names from a folder.

  • Create a BAT file which can execute the PowerShell script.
  • Create a Scheduled Task which in turn will call the BAT file.
  • Create an SSIS Package which will call the Scheduled Task

NOTE: The reason that we did this was to ensure that we could automate the snapshotting of the database using SSIS. This would enable us to get a snapshot of the data on a required schedule.

Changing the UAC

Below is a quick step to disable the UAC, as I think that this might affect the later steps when the BAT file is trying to run the PowerShell script.

NOTE: You will need to be an Administrator on the Server to turn this off.

  1. Go into the Control Panel.
  2. Then search for UAC.
  3. Then click on the Change User Account Control settings as shown below:
  4. Change it to Never Notify as shown below:
  5. Then click Ok.
  6. It will ask you one last time if you want to make the change to the Computer.
    1. Click on Yes

Creating the BAT File which can execute the PowerShell script

In the steps below we are going to explain how we create the BAT file which executes the PowerShell script. We also show that we can first run the PowerShell script manually, to ensure that it works.

Testing the PowerShell Script

  1. The first thing to test is to ensure that the PowerShell script actually runs and completes as it should.
  2. Click on Start and then type in PowerShell and from the search results right click on Windows PowerShell ISE and select Run as Administrator
  3. Now when the PowerShell ISE opens you can see from the Window at the top that you are running as Administrator
  4. If you want to follow our example you can then copy and paste the following PowerShell script:

    Get-ChildItem -Path 'C:\Program Files\Microsoft SQL Server\MSSQL11.PRODBI\MSSQL' -Recurse | Export-Csv -Path 'C:\Program Files\Program Files.csv' -Encoding ascii -NoTypeInformation

    1. What we are doing above is getting the file names for all the files and folders under our SQL Server MSSQL folder.
    2. And then exporting them to a CSV.
    3. As with our example, once run you should see the CSV file as shown below:
  5. Now that we know our script runs, we can save it to a location on our Server.
  6. As with our example, we saved it to a new folder on the C: drive.
  7. We will now use this in the next steps to create the BAT File.

Creating the BAT file which will run the PowerShell script.

In the steps below we will be creating the BAT file which will run the PowerShell script.

  1. Open up Notepad or Notepad++ and copy in the following, which will be explained below:
    1. c:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy ByPass -command ". 'C:\XIOSnapshot\Export-MSSQL-Files.ps1'"
    2. What we are doing above is to first put in the full path to the PowerShell executable.
    3. Next we are putting in some parameters for PowerShell: bypassing the execution policy, and the command that is to be run.
    4. And then the final part is where we are calling the PowerShell script which we created above.
      1. NOTE: In order to run the command, we have to put it all in double quotes, and it has to be preceded by a period (the dot-sourcing operator) in order for it to be executed.
    5. Now we can save this BAT File in the same location, with a suitable name.
    6. As per our example we saved it with the following:
  2. Now what we will need to do is to run the BAT file from the command line to test that it all works.
  3. You can now open command prompt and run the bat file as we have shown with our example below:
    1. NOTE: You can see that we are not logged into the Command prompt as Administrator.
    2. We can also see that it completed successfully.
    3. And finally if we look on the file system we can see the file got created.
  4. One final thing to note is that this ran when we executed the BAT file because we have administrator privileges.

Creating the Scheduled Task which will call the BAT File

Now we are going to create the Scheduled Task which will call the BAT File.

  1. Go into Server Manager, then click on Tools and select Task Scheduler
  2. Once this opens on the right hand side click on Create Task…
  3. The first thing that you have to give your task is a name
    1. As with our example we are going to give it the name as shown below:
    2. Now a VERY important step: under Security options, make sure to select the tick box as shown below:
      1. NOTE: This means that it will run with the highest privileges, under the Administrator context.
    3. Next, if you are going to run this via SQL Server Agent, you are going to have to put in the same domain account that SQL Server Agent runs under.
  4. You can leave Triggers blank, because we are going to execute this via SSIS.
  5. Click on Actions
    1. Then click New
    2. Then configure the New Action as shown below, with an explanation afterwards.
      1. Here we are just running our BAT File.
    3. Click Ok.
  6. You can now click Ok for this Scheduled Task to be created.
  7. NOTE: When you click ok for the creation of the Scheduled Task it will prompt for the password.
  8. The final step is to then test the scheduled task by running it within Task Scheduler.
    1. Click on Run
    2. You will briefly get the command prompt Window popping up, which you will not see when running via SSIS.
    3. To confirm this you can view the Last Run Result in Task Scheduler as shown below:
    4. As well as the file on the file system:
  9. Now we are done with the Task Scheduler, calling the BAT file, which is then calling the PowerShell script.

Automating the running of the Scheduled Task in SSIS

In the steps below we are going to show how to automate the running of the Scheduled Task within SSIS.

  1. Either create a new SSIS Project or create a new SSIS Package.
  2. Then drag in the Execute Process Task into the Control Flow.
  3. Double click to go into the Properties.
    1. We gave it the following name:
    2. Then click on Process to put in the details to call the Scheduled Task
    3. Now what you will see below are the details to run the Scheduled Task, which will be explained afterwards:
      1. From the above, under Executable is where the Scheduled Task exe is saved on our Windows Server 2012.
      2. Then under Arguments is where we specified for the Scheduled Task to run (/run), and then the Task Name (/TN) to run.
      3. Then click Ok.
    4. You should then see the following in your SSIS Package:
  4. Now to test that it all works, execute your SSIS Package.
    1. It should then run successfully and you should see the following:
    2. NOTE: SSIS will come back immediately saying that it was successful, as long as the command is correct.
    3. So if you have something else that is reliant on this step, I would suggest putting in something to delay it going on to the next step.
      1. I did this by using WAITFOR DELAY in SQL Server.
    4. Now we can see the file being updated as shown below:
  5. The final step is to then deploy this to your SSIS Server, create the schedule, and test it.
  6. NOTE: If you have configured the Scheduled Task with the same domain account as your SQL Server Agent account, it will succeed.
    1. NOTE: This domain account should not have a password that will change.
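Putting the Execute Process Task settings above together, the configuration would look something like this. This is a sketch: the task name is a placeholder, since the actual name is only shown in the screenshots.

```
Executable: C:\Windows\System32\schtasks.exe
Arguments:  /Run /TN "YourScheduledTaskName"
```

And because the task returns immediately, an Execute SQL Task after it with something like WAITFOR DELAY '00:02:00' can hold the package back until the PowerShell script has had time to finish.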

BI-NSIGHT – Power BI Desktop (Date Hierarchy) – Power BI(Timeline Slicer Visual, Pin Excel Range to Dashboard, Power BI User Groups, Weekly Service Update, Visual Studio Application Insights Content Pack) – Microsoft BI Reporting Road Map – SQL Server 2016 SSIS Updates

With the PASS conference completed last week, there has been a lot of information about SQL Server 2016, and the future does indeed look bright for BI within the Microsoft BI stack. And I personally think that in the future they will be leading in most of the BI areas.

I am not going to go into all the details as this has been covered in a whole host of other blogs that I follow. And I am sure that a lot of the people reading my blog have already found out all the new amazing news.

So here we go with all updates from the past week.

Power BI Desktop – Date Hierarchy

The above screenshot was taken from Jen Underwood, and is showing how a future version of Power BI Desktop will have the ability to create the Date Hierarchy for you. I am sure it will be in a not too distant release.

Power BI – Timeline Slicer Visual

As promised at PASS last week, here is another great Custom Visual available in Power BI.

I can see this being used a lot, as I have used it in Excel in the past and it does allow users the ability to slice their data by Month to Date, Year to Date, Quarter to Date etc…

You can find all the custom Visualizations here: Power BI Custom Visualizations

Power BI – Pin Excel Range to Dashboard

Another feature that will be coming to the Power BI Service is the ability to Pin an Excel Range into your Power BI dashboard.

I think that this will be really useful, because Excel does some things really well. And often it can show you a lot more information, which can easily be digested, instead of trying to replicate it in Power BI.

Power BI – User Groups

This is another great initiative, as I think that as the momentum grows with Power BI this will be a great way to network with like-minded people.

As well as learn from other people, who I have no doubt will have some amazing ideas and experience to share.

Here are the details if you are interested to Sign up or see if there are people in your area: Power BI User Groups are here!

Power BI – Weekly Service Update

There were some interesting updates this week, one of which is that you now get a guide on what you want to do in Power BI.

As well as this, individuals can now also sign up for Power BI. I do think that this is a very clever move, as there are a lot of people who might use it at work and then want to use it for their personal projects. Along with this, you can get the general public to start using the service. And often this can attract a larger crowd than the number of people that would otherwise be exposed to the Power BI service. Which in turn could get Power BI into a company!

And finally there is the duplication of an existing report, which often can help when you want a report that is very similar and do not want to re-create it all from scratch!

You can find out all the details here: Power BI Weekly Service Update

Power BI – Visual Studio Application Insights Content Pack

This week’s Content Pack is about Visual Studio applications. It can give you insight into the applications that you have created, and show you potentially where you have issues.

You can find out how to use the content pack and more details here: Explore your Application Insights data with Power BI

Microsoft BI Reporting Road Map

As has been blogged quite extensively, this is the first time since I started my career in BI that there has actually been a roadmap for BI from Microsoft.

I have to say it is great that we now have this visibility, because it means we can plan for what is coming. And incorporate some of the new changes into our existing and to be delivered projects. Which means we will be in a position to show the people in our business something that is new and fresh.

And the way that I see it, people like to see things change. Not everyone in the business, but at times even if the charts just change slightly, or there is something additional, it can mean that there is greater adoption. It also shows that it is not something that has been developed and never looked at again!

I do feel that they are focusing a lot on SQL Server Analysis Services Tabular. And for good reason, this product is playing catch up. It is also being used in Power BI, which we all really love and are using more often. And I can see that we are also starting to get the best of both worlds. And by that I mean we are getting a lot of the functionality from SQL Server Analysis Services Multidimensional, as well as from TSQL. Which means that we can leverage the best of both.

You can find out all the information about the BI Reporting Roadmap here: Microsoft Business Intelligence – our reporting roadmap

SQL Server 2016 – Integration Services Update

The link below covers, via Wolfgang Strasser (w|t), all the great updates that are coming to SSIS 2016.

I am looking forward to seeing the Package Control Flow Templates, as the way I see it, this will mean that you can create a template once and then reuse it again and again. So, for example, if you create a Package Control Flow Template for a Slowly Changing Dimension Type 2, you can then use this in your framework for all your other developers.

You can read his blog post here: SQL Server 2016 Integration Services (SSIS) –Summary of SQL Pass Summit 2015 session

The future for me

I was reading through the email from The Database Weekly and I saw an article from Paul Randal with regards to Paul being your mentor. This really got me thinking about where my future is within Business Intelligence (BI).

I really love what I do. I get great satisfaction and enjoyment when I can tell people a story about their data. Or show them insights into their data for the first time.

I find it easy to grasp new concepts and get it all working. I work entirely within the Microsoft BI toolset.

With the new Office 365 and Power BI offerings there is a whole new world to get my head around. It really excites me.

I have also started working for a BI consulting company, and it is equally exciting times here. They are looking to expand and really get into the Microsoft BI space. That’s where I come into the picture.

I feel that I am at the right place at the right time. In terms of using my existing knowledge within the Microsoft space, as well as enabling the consultancy to grow from strength to strength going forward.

This is where I feel that being mentored by Paul Randal could get me onto the next level, which is where I want to go and where I will eventually get to. Being mentored by someone as knowledgeable and experienced means I can get there a little quicker!

So where do I see myself in the future?

I see myself learning and going to that next level: enabling businesses to gain insights into their data faster than ever, as well as driving the BI consultancy to become the go-to consultancy for Microsoft BI.

Relocating to Australia – BI Job opportunities in Queensland

I thought I would let you guys know that I am about to relocate to Australia from South Africa. I have had an amazing time in South Africa, and learnt a whole lot whilst working at my past employer.

So this is a plug to anyone who has any leads or potential BI work in Queensland, Australia. I would really appreciate it; if you have anything, please email me.

I will be in Australia from 08 August 2014.

Below is a link to my CV, as well as my contact details.

CV-Gilbert Quevauvilliers-2014

Thanks

Gilbert

SSIS – Automating SQL Server Backups, copying files and checking that they were copied

I had an issue where I lost all my backups. So what I have now created is an automated SQL Server backup process, which backs up all the databases, verifies the backups, and copies them to the required location. Finally it emails the list of backup files copied, to ensure that they are all there.


Backing up the databases

1. Below is the script, which is created on the master database.

2. This backs up all the databases except the TempDB to the desired location.

3. NOTE: WE ARE ALWAYS USING THE SAME FILENAMES AND OVERWRITING THE CURRENT BAK FILE, BECAUSE WE ONLY NEED THE LATEST COPY.

  a. THIS ALSO SAVES SPACE FOR THE BACKUP FILES.

4. Here is the example of the script to run to create it, with some notes afterwards:

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_BackupDatabases]    Script Date: 2013-06-06 01:58:41 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 June 2013
-- Description:
-- =============================================
CREATE PROCEDURE [dbo].[prc_BackupDatabases]
AS
BEGIN

    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names (all except TempDB).
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE [dbid] <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- Make sure that the below path exists.
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Build the backup SQL statement.
        SET @backupQuery = 'backup database ' + @dbName + ' to disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'' WITH COMPRESSION, INIT'

        -- Print the SQL statement.
        PRINT @backupQuery

        -- Execute the backup script.
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get the next database.
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END

GO

a. NOTE: You will have to modify the following in the above script for it to run on your environment:

  i. @backupPath

    1. Change this to where you want to back up your SQL BAK files.

  ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.

    1. This was because we were backing up multiple BAK files from multiple servers.

b. This is what it looks like in SSIS:

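With the procedure created, the SSIS Execute SQL Task only needs to call it. Assuming the procedure was created with the defaults above, the SQLStatement would simply be:

```sql
EXEC [master].[dbo].[prc_BackupDatabases];
```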

 

Verify the SQL Server Backups

1.       The next step is to then verify that the backups are consistent and can be restored.

2.       Again we created this script on the Master database

3.       Here is the create script:

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_VerifyDatabases]    Script Date: 2013-06-06 02:09:05 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 Jun 2013
-- Description:
-- =============================================
CREATE PROCEDURE [dbo].[prc_VerifyDatabases]
AS
BEGIN

    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names (all except TempDB).
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE [dbid] <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- Make sure that the below path exists.
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Build the verify SQL statement.
        SET @backupQuery = 'Restore VERIFYONLY from disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'''

        -- Print the SQL statement.
        PRINT @backupQuery

        -- Execute the verify script.
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get the next database.
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END

GO

a. NOTE: You will have to modify the following in the above script for it to run on your environment:

  i. @backupPath

    1. Change this to where you backed up your SQL BAK files.

  ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.

    1. This was because we were backing up multiple BAK files from multiple servers.

b. This is what it looked like in SSIS:

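As a worked example of what the cursor above builds (the database name SalesDB here is hypothetical), the printed verify statement would come out as:

```sql
Restore VERIFYONLY from disk = 'C:\SQLBackups\SERVERNAME-SalesDB.bak'
```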

 

Copying files to backup location on the network

1.       Here we are then copying our backup files to a network location where they are then backed up.

2.       It is a simple File System Task, where we are copying all the files from one folder to another folder.

a.        The only thing to NOTE is that we are overwriting the files when we copy them over.

3.       This is what it looks like, simple to setup and create.       


 

 

Getting the file list and emailing it to the required users

1. The final part is where I want to verify that the backups were backed up and copied over to the network location.

2. The first part is using a PowerShell script to get the file listing into a CSV.

  a. NOTE: I created a mapped drive to our backup location in order to make it easier in the script.

    i. And also to ensure that there are no authentication issues.

3. All the files were saved in a folder called PowerShell.

4. This is the PowerShell script that was used to get the file listing and export it to CSV:

Get-ChildItem r:\*.*  -include SERVERNAME*.* | select name,length,LastWriteTime  | Export-Csv C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.csv

  a. This was saved with the following filename:

    i. Filelisting_SERVERNAME.ps1

5. Next, this is how we configured SSIS to run the PowerShell script:

  a. Drag in an Execute Process Task and configure it with the following:

    i. We gave it the following name:

      1. PowerShell to get Directory Listing for SERVERNAME Files

    ii. Then click on Process on the left hand side.

    iii. This is where you actually configure how you will run the Execute Process:

      1. Where it says Executable you have to put in the location of your executable.

      2. In our example this is the location of PowerShell:

C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe

      3. Next are the Arguments that you want to pass to your executable.

      4. In our example we put in the PowerShell script to call:

-ExecutionPolicy ByPass -command ". 'M:\SQLBackups\Powershell\Filelisting_SERVERNAME.ps1'"

  b. You can leave all the other defaults so that it looks like the following below:


6. The next step is to then go through the process of importing the data from the CSV file into the table called Staging.tb_BackupFileListing.

  a. NOTE: In this table we store when the file was modified, as well as when the data was imported into the table.

7. We then import this data into the fact table, called TF_BackupFileListing_Converted, so that we have the history stored.

8. We then export just the current backup file listing data into an Excel spreadsheet, which is saved locally in the PowerShell folder.

9. The final step is where we then attach the exported spreadsheet and email it to the required people.


10. This is what this package looks like:
