How many times has my Power BI dataset been refreshed?

Have you ever wondered how many times your Power BI dataset has been refreshed? If you are using Power BI Pro, you are limited to 8 refreshes a day.

In this week’s blog post I am going to demonstrate how you can use the Power BI API to view all the data refreshes for a particular dataset.

If you need a recap, here is a link to my previous blog post, Using the Power BI API with PowerShell scripts, in which I demonstrated how to refresh your Power BI dataset using PowerShell and the Power BI API.

If you are looking for information on how to get the GroupID, DatasetID and ClientID, please refer to that same post.

Example

For this particular example, I am busy working on a presentation for SQL Saturday Sydney (which I would recommend going to, if you are in Sydney on 07 July 2018).

Part of what I am doing is refreshing the data using an Azure Function, Microsoft Flow and Power Apps (which will be another blog post), and I wanted to know how many times I had refreshed the dataset.

I need to know if I am going to hit my 8 refreshes a day limit when testing! This PowerShell script and Power BI report came in handy, as they let me see whether I was about to exceed my refresh limit.

As with all examples you can download the PowerShell file here: PUBLIC – Power BI Dataset Refresh with Output to CSV.ps1

NOTE: Before starting, you will need to update the PowerShell script with your own GroupID, DatasetID and ClientID.

Updating PowerShell Script for Data Refreshes

The only actual change that I made was to put in the location where I wanted to store my CSV file.

This is on line 93 of the script.
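For reference, after my change line 93 looked roughly like the following (the variable name and the folder and file location are placeholders, not the script's actual values):

    # Hypothetical example of the Export-Csv line (line 93) with an updated folder and file location
    $refreshData | Export-Csv -Path 'C:\Temp\PowerBI-RefreshHistory.csv' -NoTypeInformation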

Once I had updated this to my folder and file location, I ran the script.

Below is what it looked like once I opened the CSV file:

It is interesting to see that it also tells you what Refresh Type was used. In this instance the refresh was triggered by an Azure Function making the API call.

Power BI Report with output from CSV

I then saved the CSV file to my OneDrive for Business folder, which enabled me to load it into an existing Power BI report.

As you can see below, I now know how many refreshes I have completed on my dataset.

Wrapping Up

I have demonstrated how I leveraged the Power BI API to download the refresh history for my dataset.

Being able to report on this then allowed me to know how many refreshes I had left.

Once again here is the PowerShell Script that I used: PUBLIC – Power BI Dataset Refresh with Output to CSV.ps1

If there are any comments or suggestions, please leave them in the comments below.

Bonus – If you are interested in what the PowerShell script is doing

Below are additional details on how I got the data down from the Power BI API call; a rough sketch of what these lines do follows the list.

  • Lines 82 – 83
    • This is where I am creating the URI and sending it to the Power BI API.
  • Line 85
    • This is where I am getting the output from the API call and converting it to Json.
  • Line 87
    • I am then converting it back from Json. This might seem a bit counterintuitive, but my logic is that once it has been converted back from Json I am able to query it.
  • Line 89
    • Because I have converted it from Json, I am now able to select specific Json properties.
    • On line 89 I am selecting the "value" property, which is the data that is returned from the Power BI API call.
  • Line 91
    • I am selecting the specific items from the "value" property.
    • This has all the details that I require for my data refreshes.
  • Line 93
    • I am then exporting the data to a CSV file.
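To make the above concrete, below is a rough sketch of what that section of the script does (variable names such as $authHeader, the selected columns and the output path are assumptions; the downloadable script above is the reference):

    # Build the URI for the dataset's refresh history and send it to the Power BI API
    $uri = "https://api.powerbi.com/v1.0/myorg/groups/$groupID/datasets/$datasetID/refreshes"
    $result = Invoke-RestMethod -Uri $uri -Headers $authHeader -Method GET

    # Convert the output to Json and back again so that the Json can be queried
    $resultJson  = $result | ConvertTo-Json -Depth 10
    $refreshData = ($resultJson | ConvertFrom-Json).value

    # Select the specific items required and export them to a CSV file
    $refreshData |
        Select-Object refreshType, status, startTime, endTime |
        Export-Csv -Path 'C:\Temp\PowerBI-RefreshHistory.csv' -NoTypeInformation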

Using the Power BI API with PowerShell scripts – Refreshing your dataset

I know when I started trying to use the Power BI APIs, I was unsure of what I was doing, and how I could get it working.

Not only that, but I could see that, as with building any cloud platform, the APIs have to be there before the vendor can put the skin on top and make it easy to use.

There are always new features or functionality available via the Power BI APIs.

In this series of Blog posts, I am going to show how you can use additional features in Power BI by using the API and PowerShell.

Let me start by saying that I am no coding or scripting expert.

My goal in this blog post is to show that using the PowerShell scripts can be fairly easy, once someone (me, choose ME) has shown you how to put in the required details.

Example

As always, I like to use an example, which I find makes it easier to follow.

In this example, I am going to be using the Power BI Refresh API to refresh a dataset using PowerShell.

My dataset will be “Fourmoo Google Analytics”

Getting the details required for the PowerShell Script

I got the details on how to refresh a Power BI dataset from the documentation below, which explains what the Power BI API expects to receive in order to refresh the dataset.

https://msdn.microsoft.com/en-us/library/mt784652.aspx

I then downloaded the PowerShell Script from here: HTTP

Would you believe that there are only 3 pieces of information required in order to update the PowerShell Script and get it running?

Below I will explain how to get this data and put it into the PowerShell script.

Registering for a ClientID

The first step was to register for a ClientID.

  • To do this I went to the following URL: https://dev.powerbi.com/apps
  • If I was not logged in, I would be prompted to log in with my Power BI account.
  • I put in the details shown below in order to get the ClientID that I need in my PowerShell script.
  • I then clicked on Register App.
  • Once that completed, I got my ClientID.
    • It is displayed in the ClientID section below, which I then copied and stored in a secure location.

GroupID

NOTE: Group is what it was called in Power BI before it was changed to App Workspaces. So when anything refers to Group, it is actually now referring to an App Workspace.

This is the Group that your dataset is currently in.

To find the GroupID, complete the following steps.

  • I logged into the Power BI Service and then went to the App Workspace where I have the dataset that I want to refresh.
  • In my example I went into the “Site Analytics” Workspace.
  • Then in the URL you will see that it says groups, and the section immediately after that is the GroupID.
    • As you can see above, my GroupID starts with 0084.
    • Make a note of the GroupID.

DatasetID

Next is where I needed to find the Dataset ID, which I show in the steps below.

  • I made sure I was already in my Site Analytics App Workspace.
  • Then I clicked on the Settings (gear) icon and then on Settings.
  • I then clicked on Datasets, and then clicked on the dataset that I want to refresh.
    • As in my example, it is called “Fourmoo Google Analytics”.
  • Now in the URL I had my DatasetID, which is the section after datasets, highlighted as shown below (see the example URLs after this list).
  • I then made a note of the DatasetID.
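To illustrate where these IDs sit, the URLs look roughly like the following (the placeholders are just examples, and the exact URL shape may differ slightly in your tenant):

    https://app.powerbi.com/groups/<GroupID>/dashboards
    https://app.powerbi.com/groups/<GroupID>/settings/datasets/<DatasetID>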

Updating the PowerShell Script

Now I am ready to update the PowerShell script with the details I got from above.

NOTE: The PowerShell script does have a lot of documentation in it, so if you get lost there are plenty of comments to assist you.

  • I opened the PowerShell ISE
  • I then opened my PowerShell script.
  • Now where it says $groupID I put in my GroupID that I had noted earlier.
  • Then where it said $datasetID, I put in my DatasetID that I had noted earlier.
  • The final piece is to put in the $clientID, which I had noted earlier
  • Then I saved the PowerShell script file (a sketch of what this section of the script ends up looking like is shown below).
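As a rough illustration only, the top of the script ends up looking something like this (the GUIDs below are placeholders, not real values):

    # Placeholders only - replace these with the values noted in the steps above
    $clientID  = "11111111-2222-3333-4444-555555555555"   # ClientID from the app registration
    $groupID   = "22222222-3333-4444-5555-666666666666"   # App Workspace (Group) ID from the URL
    $datasetID = "33333333-4444-5555-6666-777777777777"   # Dataset ID from the URL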

Testing the PowerShell script and my dataset refresh

The final step is to run the PowerShell script, ensure that it runs, and check that my dataset actually gets refreshed (a sketch of the API call it makes is shown after the steps below).

  • As you can see, my last refresh is shown below.
  • I then ran the PowerShell script.
  • I got a prompt asking for the App Permissions.
    • I clicked Accept.
  • Then I got the following result showing it was successfully executed.
  • I could now see that the refresh was In progress.
  • And finally, it was successfully refreshed.
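For context, below is a rough sketch of the API call that the script makes once it has authenticated ($authHeader stands in for the authentication header the script builds, so the name is an assumption):

    # Refresh the dataset by POSTing to the refreshes endpoint
    $refreshUri = "https://api.powerbi.com/v1.0/myorg/groups/$groupID/datasets/$datasetID/refreshes"
    Invoke-RestMethod -Uri $refreshUri -Headers $authHeader -Method POST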

Conclusion

Whilst this has been a bit of a longer blog post, I do hope you can see that, by following the steps, you can leverage the Power BI APIs for your datasets.

I do plan on doing additional blog posts on the different Power BI APIs. If there are any that you are interested in, please leave a comment below.

And as I say in every blog post, please leave any comments or questions.

Link to the PowerShell Script here: ManageRefresh.ps1

Automating running PowerShell script which required Administrator elevation via SSIS

UPDATE (08 AUG 2016): I have updated this to use a SQL Server Agent Job instead of the Task Scheduler, which proved not to be reliable.

Below we are going to explain the following via an example.

Example – We are required to run a PowerShell script which will snapshot a database using XtremIO from EMC. For this to work it has to be run as Administrator, because it has to drop and re-create the disk drives at the Operating System layer.

So in order to complete this, we will go through the steps below, but using a simple PowerShell script which gets all the file names from a folder.

  • Create a BAT file which can execute the PowerShell script.
  • Create a Scheduled Task which in turn will call the BAT file.
  • Create an SSIS Package which will call the Scheduled Task

NOTE: The reason that we did this above, was to ensure that we could automate the snapshotting of the database using SSIS. This would enable us to get a snapshot of the data on a required schedule.

Changing the UAC

Below is a quick step to disable the UAC, as I think that this might affect the later steps when the BAT file is trying to run the PowerShell script.

NOTE: You will need to be an Administrator on the Server to turn this off.

  1. Go into the Control Panel.
  2. Then search for UAC.
  3. Then click on Change User Account Control settings as shown below:
  4. Change it to Never Notify as shown below:
  5. Then click Ok.
  6. It will ask you one last time if you want to make the change to the computer.
    1. Click on Yes.

Creating the BAT File which can execute the PowerShell script

In the steps below we are going to explain how we create the BAT file which executes the PowerShell script, and also show how to first run the PowerShell script manually to ensure that it works.

Testing the PowerShell Script

  1. The first thing to test is to ensure that the PowerShell script actually runs and completes as it should.
  2. Click on Start and then type in PowerShell and from the search results right click on Windows PowerShell ISE and select Run as Administrator
  3. Now when the PowerShell ISE opens you can see from the Window at the top that you are running as Administrator
  4. If you want to follow our example you can then copy and paste the following PowerShell script

    Get-ChildItem -Path 'C:\Program Files\Microsoft SQL Server\MSSQL11.PRODBI\MSSQL' -Recurse |
        Export-Csv -Path 'C:\Program Files\Program Files.csv' -Encoding ascii -NoTypeInformation

    1. What we are doing above is getting the file names for all the files and folders under our SQL Server MSSQL folder in Program Files.
    2. And then exporting them to a CSV file.
    3. As with our example, once run you should see the CSV file as shown below:
  5. Now that we know our script runs, we can save it to a location on our server.
  6. As with our example, we saved it to a new folder on the C: drive.
  7. We will now use this in the next steps to create the BAT file.

Creating the BAT file which will run the PowerShell script.

In the steps below we will be creating the BAT file which will run the PowerShell script.

  1. Open up Notepad or Notepad++ and copy in the following, which is explained below:

    c:\windows\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy ByPass -command ". 'C:\XIOSnapshot\Export-MSSQL-Files.ps1'"

    1. What we are doing above is first putting in the full location of the PowerShell executable.
    2. Next we are putting in some parameters for PowerShell, which bypass the execution policy and specify the command that is to be run.
    3. And the final part is where we are calling the PowerShell script which we created above.
      1. NOTE: In order to run the command, we have to put it all in double quotes, and it has to start with a period (the dot-source operator) in order for it to be executed.
    4. Now we can save this BAT file in the same location with a name.
    5. As per our example we saved it with the following:
  2. Now we need to run the BAT file from the command line to test that it all works.
  3. You can now open a command prompt and run the BAT file as we have shown with our example below:
    1. NOTE: You can see that we are not logged into the command prompt as Administrator.
    2. We can also see that it completed successfully.
    3. And finally, if we look on the file system we can see the file got created.
  4. One final thing to note is that this will run when we execute the BAT file, as we have administrator privileges.

Using SQL Server Agent Job to run BAT Files

Below is the process which will actually enable you to run your BAT files under the correct CMD.EXE Context

  1. In order to get this to work, we had to use a combination of SQL Server Agent steps and SSIS steps, which is explained below.
    1. First we needed to use our BAT file to run the PowerShell script which would snap the data.
      1. We configured this step in the SQL Server Agent Job properties as shown below:
      2. As you can see from above, we change the Type to "Operating system (CmdExec)".
        1. The reason for this is that it runs CMD.EXE in the correct context.
      3. And then we put in our BAT file which calls the PowerShell script we created earlier.
    2. Next we ran the BAT file which modified the folder permissions.
      1. We configured this step in the SQL Server Agent Job properties as shown below:
      2. As you can see from above, we again change the Type to "Operating system (CmdExec)" so that CMD.EXE runs in the correct context.
  2. You can then schedule your SQL Server Agent job; when I tested it, it ran successfully.

Below is the original approach, using a Scheduled Task which is called from an SSIS package (this is what the UPDATE note at the top of this post refers to).

Creating the Scheduled Task which will call the BAT File

Now we are going to create the Scheduled Task which will call the BAT File.

  1. Go into Server Manager, then click on Tools and select Task Scheduler
  2. Once this opens on the right hand side click on Create Task…
  3. The first thing that you have to give your task is a name
    1. As with our example we are going to give it the name as shown below:
    2. Now a VERY important step: under Security options, make sure to select the tick box as shown below:
      1. NOTE: This means that it will run with the highest privileges, in the Administrator context.
    3. Next, if you are going to run this via SQL Server Agent, you will have to put in the same domain account that SQL Server Agent runs under.
      1. NOTE: When you click Ok for the creation of the Scheduled Task it will prompt for the password.
  4. You can leave Triggers blank, because we are going to execute this via SSIS.
  5. Click on Actions
    1. Then click New
    2. Then configure the New Action as shown below, with an explanation afterwards.
      1. Here we are just running our BAT File.
    3. Click Ok.
  6. You can now click Ok for this Scheduled Task to be created.
  7. NOTE: When you click ok for the creation of the Scheduled Task it will prompt for the password.
  8. The final step is to then test the scheduled task by running it within Task Scheduler.
    1. Click on Run
    2. You will briefly get the command prompt Window popping up, which you will not see when running via SSIS.
    3. To confirm this you can view the Last Run Result in Task Scheduler as shown below:
    4. As well as the file on the file system:
  9. Now we are done with the Task Scheduler, calling the BAT file, which is then calling the PowerShell script.

Automating the running of the Scheduled Task in SSIS

In the steps below we are going to show how to automate the running of the Scheduled Task within SSIS

  1. Either create a new SSIS Project or create a new SSIS Package.
  2. Then drag in the Execute Process Task into the Control Flow.
  3. Double click to go into the Properties.
    1. We gave it the following name:
    2. Then click on Process to put in the details to call the Scheduled Task
    3. Now what you will see below is the details to run the Scheduled Task which will be explained afterwards:
      1. From the above, under Executable is where the Task Scheduler executable is saved on our Windows Server 2012.
      2. Then under Arguments is where we specified for the Scheduled Task to firstly run (/run) and then the Task Name (/TN) to run (see the sketch after this list).
      3. Then click Ok.
    4. You should then see the following in your SSIS Package:
  4. Now to test that it all works, execute your SSIS Package.
    1. It should then run successfully and you should see the following:
    2. NOTE: SSIS will come back immediately saying that it was successful if the command is correct.
    3. So if you have something else that is reliant on this step, I would suggest putting in something to delay it going on to the next step.
      1. I did this by using WAITFOR DELAY in SQL Server.
    4. Now we can see the file being updated as shown below:
  5. The final step is to deploy this to your SSIS Server, create the scheduled task for it, and test it.
  6. NOTE: If you have configured the Scheduled Task with the same domain account as your SQL Server Agent account, it will succeed.
    1. NOTE: This domain account should not have a password that will change.
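For clarity, what the Execute Process Task ends up running is roughly the following (the task name below is a placeholder; schtasks.exe is the Windows command-line tool for Task Scheduler):

    # Rough equivalent of the Execute Process Task configuration, run from PowerShell
    # Executable: C:\Windows\System32\schtasks.exe
    # Arguments:  /run /TN "XIOSnapshot"   (the task name is a placeholder)
    & 'C:\Windows\System32\schtasks.exe' /run /TN "XIOSnapshot"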

SSIS – Automating SQL Server Backups, copying files and checking that they were copied

I had an issue where I lost all my backups. So what I have now created is an automated SQL Server backup process, which will back up all the databases, verify the backups, copy them to the required location, and finally email a listing of the copied backup files to ensure that they are all there.

Backing up the databases

1. Below is the script, which is created on the master database.
2. This backs up all the databases except TempDB to the desired location.
3. NOTE: WE ARE ALWAYS USING THE SAME FILENAMES AND OVERWRITING THE CURRENT BAK FILE, BECAUSE WE ONLY NEED THE LATEST COPY.
  a. THIS ALSO SAVES SPACE FOR THE BACKUP FILES.
4. Here is an example of the script to run to create it, with some notes afterwards:

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_BackupDatabases]    Script Date: 2013-06-06 01:58:41 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 June 2013
-- Description: Backs up all databases except TempDB
-- =============================================
CREATE PROCEDURE [dbo].[prc_BackupDatabases]
AS
BEGIN

    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names.
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE ([dbid]) <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- make sure that the below path exists
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Backup SQL statement
        SET @backupQuery = 'backup database ' + @dbName + ' to disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'' WITH COMPRESSION, INIT'

        -- Print SQL statement
        PRINT @backupQuery

        -- Execute backup script
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get next database
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END

GO

  a. NOTE: You will have to modify the following in the above script for it to run in your environment:
    i. @backupPath
      1. Change this to where you want to back up your SQL BAK files.
    ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.
      1. This was because we were backing up multiple BAK files from multiple servers.
  b. This is what it looks like in SSIS:


Verify the SQL Server Backups

1. The next step is to verify that the backups are consistent and can be restored.
2. Again, we created this script on the master database.
3. Here is the create script:

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_VerifyDatabases]    Script Date: 2013-06-06 02:09:05 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 Jun 2013
-- Description: Verifies the backups taken by prc_BackupDatabases
-- =============================================
CREATE PROCEDURE [dbo].[prc_VerifyDatabases]
AS
BEGIN

    SET NOCOUNT ON;

    DECLARE UserDatabases_CTE_Cursor CURSOR
    FOR
    -- Selecting user database names.
    SELECT name AS DatabaseName
    FROM sys.sysdatabases
    WHERE ([dbid]) <> 2

    OPEN UserDatabases_CTE_Cursor

    DECLARE @dbName varchar(100);
    DECLARE @backupPath varchar(100);
    DECLARE @backupQuery varchar(500);

    -- make sure that the below path exists
    SET @backupPath = 'C:\SQLBackups\'

    FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    WHILE (@@FETCH_STATUS <> -1)
    BEGIN
        -- Verify SQL statement
        SET @backupQuery = 'Restore VERIFYONLY from disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'''

        -- Print SQL statement
        PRINT @backupQuery

        -- Execute verify script
        --Select (@backupQuery)
        EXEC (@backupQuery)

        -- Get next database
        FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
    END

    CLOSE UserDatabases_CTE_Cursor
    DEALLOCATE UserDatabases_CTE_Cursor

    SET NOCOUNT OFF;
END

GO

  a. NOTE: You will have to modify the following in the above script for it to run in your environment:
    i. @backupPath
      1. Change this to where you backed up your SQL BAK files.
    ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.
      1. This was because we were backing up multiple BAK files from multiple servers.
  b. This is what it looked like in SSIS:
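If you want to test the two stored procedures outside of SSIS, one option is to call them from PowerShell with Invoke-Sqlcmd (this assumes the SQL Server PowerShell module is installed; the server name is a placeholder):

    # Run the backup procedure and then verify the backups (placeholder server name)
    Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database 'master' -Query 'EXEC dbo.prc_BackupDatabases;' -QueryTimeout 600
    Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database 'master' -Query 'EXEC dbo.prc_VerifyDatabases;' -QueryTimeout 600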


Copying files to backup location on the network

1. Here we are copying our backup files to a network location where they are then backed up.
2. It is a simple File System Task, where we are copying all the files from one folder to another folder.
  a. The only thing to NOTE is that we are overwriting the files when we copy them over.
3. This is what it looks like, and it is simple to set up and create (a rough PowerShell equivalent is sketched below):

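For reference, a rough PowerShell equivalent of what this File System Task does is the following (the source and destination folders are placeholders):

    # Copy all backup files to the network backup location, overwriting any existing files
    Copy-Item -Path 'C:\SQLBackups\*' -Destination '\\BACKUPSERVER\SQLBackups\' -Force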

Getting the file list and emailing it to the required users

1. The final part is where I want to verify that the backups were taken and copied over to the network location.
2. The first part is using a PowerShell script to get the file listing into CSV.
  a. NOTE: I created a mapped drive to our backup location in order to make it easier in the script.
    i. And to also ensure that there are no authentication issues.
3. All the files were saved in a folder called PowerShell.
4. This is the PowerShell script that was used to get the file listing and export it to CSV:

Get-ChildItem r:\*.*  -include SERVERNAME*.* | select name,length,LastWriteTime  | Export-Csv C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.csv

  a. This was saved with the following filename:
    i. Filelisting_SERVERNAME.ps1
5. Next, this is how we configured SSIS to run the PowerShell script:
  a. Drag in an Execute Process Task and configure it with the following:
    i. We gave it the following name:
      1. PowerShell to get Directory Listing for SERVERNAME Files
    ii. Then click on Process on the left hand side.
    iii. This is where you actually configure how you will run the Execute Process Task.
      1. Where it says Executable you have to put in the location of your executable.
      2. In our example this is the location of PowerShell:
         C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe
      3. Next are the Arguments that you want to pass to your executable.
      4. In our example we put in the PowerShell script to call:
         -ExecutionPolicy ByPass -command ". 'M:\SQLBackups\Powershell\Filelisting_SERVERNAME.ps1'"
  b. You can leave all the other defaults so that it looks like the following below:


6. The next step is to import the data from the CSV file into the table called Staging.tb_BackupFileListing.
  a. NOTE: In this table we store when the file was modified as well as when the data was imported into the table.
7. We then import this data into the fact table, TF_BackupFileListing_Converted, so that we have the history stored.
8. We then export just the current backup file listing data into an Excel spreadsheet, which is saved locally in the PowerShell folder.
9. The final step is where we attach the exported spreadsheet and email it to the required people.


10. This is what this package looks like:
