SSIS – Configuring SSIS Package to run job after data loading has completed

  • In our example below, we are going to use a SharePoint 2013 Data Alert job which was created in SQL Server Agent, and insert it into an existing SSIS Project.
  • The reason for doing this is so that every time our SSIS Project runs, it can fire off a SQL Server job as part of the process.
  • Essentially, in the past the SQL Server Agent jobs ran individually and were not tied to when our data had loaded.
  • Doing it in this manner enables the related job to be run after every data load, which in turn enables the business to get the relevant data at the relevant times.

     

NOTE: You can modify this for any SQL Server Job.

Find the related SQL Server Agent Job Name

Below are the details of how to find the related SQL Server Agent job name. Typically, when you create a SharePoint Data Alert, or a SQL Server Reporting Services (SSRS) data driven subscription or email subscription, it creates the job name as a GUID.

This makes it rather difficult to find the correlating Job Name.

  1. The easiest way to find the SQL Server Agent job name, once you have created your SharePoint Data Alert or SSRS subscription (data driven or email), is to go directly to the SQL Server instance where the job was created.
  2. Then open the Job Activity Monitor.
  3. Then scroll through the list until you find a job with the following properties:
    1. The name will be in a GUID format.

    2. The Last Run Outcome is set to Unknown.
      1. And the Last Run is set to Never.

    3. NOTE: If there is more than one job with the above properties, then look at when you scheduled the start time of your job.
      1. If that still does not help, you might have to go into the job properties and into the schedules to see how often it runs, to find the correct job.
  4. If this is a test environment, you can try running the job to ensure that you have the correct job.
  5. Make a note of the name of the job.
    1. In our example it was: 68CAE336-3D38-42A6-9526-F32F4D501AA4
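
Rather than scrolling through the Job Activity Monitor, you can also query msdb directly. The following is a sketch, assuming you have permission to read the msdb job tables; the LIKE pattern for a GUID-shaped name is only a heuristic:

```sql
-- Sketch: list jobs whose name looks like a GUID and that have never run.
-- Assumes read access to msdb; the LIKE pattern is a heuristic for GUID names.
SELECT j.name, j.date_created
FROM msdb.dbo.sysjobs AS j
WHERE j.name LIKE '________-____-____-____-____________'   -- 8-4-4-4-12 GUID shape
  AND NOT EXISTS (SELECT 1
                  FROM msdb.dbo.sysjobhistory AS h
                  WHERE h.job_id = j.job_id)               -- no history = never run
ORDER BY j.date_created DESC;
```

The most recently created job in this list is usually the subscription or Data Alert you just set up.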

 

Modifying the SQL Server Agent Job

Below are the steps to modify the SQL Server Agent job, so that it will not run on a schedule.

NOTE: The reason we are doing this is because we want the job to be run from within our SSIS Project after it has loaded our relevant data.

  1. Go into the properties of your SQL Server Agent job and click on the Schedules page.

  2. Then in the Schedule list, click on the schedule and select Remove.
  3. Once completed, there will no longer be any schedules for the job.

  4. Then click Ok.
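
The same change can also be scripted instead of done through the UI. Below is a sketch using the msdb stored procedure sp_detach_schedule; the schedule name shown is hypothetical, so check your job's Schedules page for the real one:

```sql
-- Sketch: detach the schedule from the job instead of removing it in the UI.
-- The schedule name below is hypothetical; adjust both names to your environment.
EXEC msdb.dbo.sp_detach_schedule
     @job_name               = N'68CAE336-3D38-42A6-9526-F32F4D501AA4',
     @schedule_name          = N'SharePointDataAlertSchedule',
     @delete_unused_schedule = 1;  -- also drop the schedule if no other job uses it
```
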

 

Creating and configuring your SSIS Package to run the SQL Server Agent Job for you

Below are the steps to create and configure your SSIS package to run the SQL Server Agent Job as part of the SSIS Project.

  1. Create your new SSIS Package.
  2. Drag in an Execute SQL Task.
  3. We renamed our Execute SQL Task to the following:

    Run SharePoint Data Alerts


  4. Then we went into the properties.
    1. We then put in our connection to our SQL Server database, which is the same instance as where the SQL Server Agent job is.
    2. Then under the SQL Statement we put in the following:

      EXEC msdb.dbo.sp_start_job N'68CAE336-3D38-42A6-9526-F32F4D501AA4';

      1. NOTE: The thing to note here is that we are using the msdb stored procedure sp_start_job to start our job, with the job name which we noted in the previous steps.
  5. Now that you have completed your Execute SQL Task, it is time to test it.
  6. Right click the Execute SQL Task and select Execute Task.
  7. If successful it should come back looking like the following below:

  8. You should also actually get the email in your Inbox, as was the case with our SharePoint Data Alert.
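
One caveat worth noting: sp_start_job is asynchronous, so the Execute SQL Task succeeds as soon as the start request is queued, not when the job finishes. If a later step in your SSIS Project depends on the job having completed, a sketch like the following can be used to poll the job's activity in msdb (assumes read access to msdb):

```sql
-- Sketch: check whether the most recent run of the job is still executing.
-- stop_execution_date is NULL while the job is running.
SELECT TOP (1)
       ja.run_requested_date,
       ja.stop_execution_date
FROM msdb.dbo.sysjobactivity AS ja
JOIN msdb.dbo.sysjobs AS j ON j.job_id = ja.job_id
WHERE j.name = N'68CAE336-3D38-42A6-9526-F32F4D501AA4'
ORDER BY ja.run_requested_date DESC;
```
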

For SQL Server 2014 – Use the following link below to use an OData Source to get data from SharePoint List into your table

Below are the steps if you want to use SQL Server 2014 to get data from a SharePoint list into your database.

The URL below explains how to download and install the OData Source for SQL Server

1.       Click on the following link below

a.       http://www.microsoft.com/en-us/download/details.aspx?id=42295

2.       Then once you load the page click on Download

a.       This will then bring up which components you want to download.

b.       Scroll down and select the following:

c.        clip_image001

3.       Then download and install the component.

a.       You can just accept all the defaults in the Wizard.

4.       Then if you now open SSDT, and drag in a new Data Flow Task, then click on the Common Section you should see the following:

a.       clip_image002

5.       NOTE: We would want to use the x86 version for our testing in SSDT, due to SSDT being an x86 application.

6.       NOTE II: You will also have to install the component on your SSIS server and configure your SSIS package to run in 32-bit mode, so that when it is run as part of the job in SSIS it will function correctly.

Configuring your OData Source in SSIS

NOTE: You might have to create your SQL Table, to get the data from your SharePoint List into the table before you complete the steps below. Or you can create the table after step 2 below.

 

1.       Once you have followed the steps above, the one thing to note is how to get the correct URL so that you can create your connection to your lists.

a.       With the example below our URL was:

http://Server/sites/testingdemo

b.       Now in order to get the URL for the lists, we had to put in the following:

                                                               i.      clip_image003

                                                             ii.      NOTE: In your Site, you just need to add the following:

/_vti_bin/Listdata.svc

                                                            iii.      Also ensure the account that you use has access to the SharePoint List.

c.        And then you have your connection completed.

d.       After which you can then select your list from the drop down in your OData Source.

2.       And then we configured our OData Source with the following:

a.       clip_image004

b.       Under Use collection or resource path, ensure that you select Collection.

c.        Under Collection, click on the drop down, and ensure that you select your SharePoint List that has been created in SharePoint

                                                               i.      In our example it was called: CrimeReportingSubscriptions

d.       You can then click on Preview to see that you are getting the expected data.

e.       Then if you click on Columns you will then see all the SharePoint related columns, as well as the columns from your List

f.         clip_image005

g.       NOTE: Ensure that you do have some data already in your list.

h.       Then click Ok to have your OData Source completed.

3.       Then once completed it will look like the following in your Data Flow

SSRS – Microsoft SQL Server filter table/matrix with 2nd row of data

What I wanted to do was to create multiple tables/matrices, but to have them separate, with 5 of them in total showing the top 5 values. So I wanted the first one to show the top value, then the top 2, top 3, top 4 and top 5.

This is what the result looks like:

clip_image002[5]

1. Create your first table/matrix which will be filtered as shown below:

a. clip_image004[5]

2. Then copy and paste your first table/matrix and complete the filter like this to show ONLY the second one or top 2

a. clip_image006[5]

b. The thing to note here is that for the Top N value I put 2

c. And then for the Bottom N I put 1

3. Then to do this for the third or top 3, do the following:

a. Copy the above table/matrix and complete the filter as follows:

b. clip_image008[5]

c. The thing to note here is that the Top N was changed to 3

4. NOTE: For any additional table/matrix change the top N to the value you want.

Error: An item with the same key has already been added – Unable to load the tables in the Power Pivot Window

NOTE: First thing to do is to make a copy of your Excel Workbook, before making any changes to your Power Query Source. If you do NOT, and change your Power Query source, your Power Pivot Model could become unusable!

1.      I had an issue where, if I changed the Power Query source from a CSV after I had made changes to my data in my Data Model, then once I saved the Excel workbook and went back in, I would get the following error:

a.      clip_image002

2.      So what I did to try and fix it was to do the following

a.      Get the data into Power Query correctly, so that it was working as expected.

3.      Then went into the Loading of the Power Query data to the Data Model.

a.      Right click and selected Load To:

                                                    i.     clip_image004

b.      Then unticked both options

                                                    i.     clip_image006

                                                   ii.     You will then get prompted with the following below:

                                                  iii.     clip_image008

                                                 iv.     Click Continue

c.      It will then come back saying that the Load is Disabled:

                                                    i.     clip_image010

d.      Now go back and again Right click and selected Load To:

                                                    i.     clip_image004

e.      This time select the following:

                                                    i.     clip_image012

f.       Now this will load your data into the Data Model.

4.      Next you can open Power Pivot

5.      What I did next was to rename my sheet, which I hoped would break the association with the old sheet that was in the Data Model.

a.      Renamed it from:

                                                    i.      clip_image014

b.      To:

                                                    i.      clip_image016

6.      I then went and saved my Excel Workbook, closed it. Opened it again and went into Power Pivot.

a.      So far so good.

7.      I then went back and renamed the sheet in the Power Pivot Data model, back to the Original Name

8.      And I got the same error:

a.      clip_image017

9.      So currently the only way I can see to get this to work, is to keep my new Name.

10.   Which means having to re-do or add my fields to the existing reports.

11.   NOTE: Reading further down the error, it talks about Synonyms, so my educated guess is that it has something to do with the Synonym data, not allowing changes to be made within the Data Model.

SSIS – Using Date and Time Cache Lookups for faster inserts into Fact Tables

What happened is that I found that when doing inserts on larger data sets, it would take a long time to complete the date and time joins. If I excluded these joins, the data was returned a lot quicker.

 

So my solution to the problem was to use the Cache Transform and then the Lookup Transform in the SSIS data flow task. This enabled the query to return a lot quicker.

 

Example:

·         We are first going to create our Cache Transform for our Date and Time

·         Then we are going to insert data into our Fact table from the following Adventure Works table below using the Lookup Transformation

o    dbo.FactInternetSales

·         We will be using SSDT and SQL Server 2014

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

Creating the Cache Transform for Date and Time

NOTE: We inserted a time table into the AdventureWorksDW2012 database, which has every minute for a day which will return 1440 rows.

 

NOTE II: You will only have to create the Date and Time Cache Transform once. Unless your Date table changes every time you load your data. This is because in the later steps you will see how you just select your file.
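
The time table mentioned in the first NOTE above can be generated with a short T-SQL script. This is a sketch only; the table and column names are assumptions, as the actual time table used in this example is not shown:

```sql
-- Sketch: generate one row per minute of the day (1440 rows) for a time table.
-- Table/column names are assumptions; adjust to your schema before inserting.
;WITH Numbers AS
(
    SELECT TOP (1440) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS n
    FROM sys.all_objects AS a CROSS JOIN sys.all_objects AS b
)
SELECT CAST(DATEADD(MINUTE, n, '00:00') AS time(0)) AS [Time]
FROM Numbers
ORDER BY n;
```
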

 

1.       We will first start with setting up the Date Cache Transform.

a.        Go into your SSIS and create a new SSIS Package.

b.       Then drag in a data flow task, which we renamed to Date Cache Transform

c.        Then double click to go into the Date Cache Transform data flow task.

d.       Now in the SSIS Toolbox, drag in the ADO NET Source

                                                               i.      NOTE: The reason we use the ADO NET Source, is because it is better at getting the correct Data Types for our Cache Transform.

e.       We then configured it with the following as shown below

                                                               i.      clip_image002

                                                              ii.      NOTE: The reason that we converted our FullDateAlternateKey to Date, is because when we do our lookup later, we want it to have JUST the Date Data Type

f.         If we went into the Advanced Editor for the ADO NET Source, we can verify that the FullDate is the data type of "database date".

                                                               i.      clip_image004

g.        Now in the SSIS Toolbox, click under Other Transformations and drag in the Cache Transform

                                                               i.      clip_image006

h.       Then drag the blue constraint to your Cache Transform.

i.         Then double click your Cache Transform to go into the Properties

j.         Now when it opens you will see that there is no Connection Manager. Click on New

                                                               i.      clip_image008

k.        This will then open the Cache Connection Manager Editor

l.         We then configured it as shown below:

                                                               i.      clip_image010

                                                              ii.      As you can see above we put in the name.

                                                            iii.      Then we also selected to Use file cache.

1.       Then we configured the File name with a UNC

2.       NOTE: The reason that we did this is so that if you execute this package from another location it will still find the location.

3.       NOTE II: You must ensure that the account running the SSIS job has access to the folder location and file on the server.

m.      Then click on Columns. This is where you will configure which column will be used for the actual lookup.

                                                               i.      In our example we want to do the lookup on the FullDate, so next to FullDate we will put the Index Position of 1

                                                              ii.      NOTE: You can have more than one lookup, and each additional Index Position will increment in number.

                                                            iii.      clip_image012

n.       Then click Ok to go back to the Cache Transformation Editor.

o.       Now click on the Mappings in the left hand side

p.       You should now see the following:

q.       clip_image014

r.        NOTE: From above you can see because we created the Index Position of 1 for the FullDate, it has got the magnifying glass. Which represents that this can be used in the Lookup later.

s.        Click Ok.

t.         You can now run this data flow task to populate your Date Cache Transform.

u.       clip_image016

2.       Next will be setting up the Time Cache Transform.

a.        Go into your SSIS and create a new SSIS Package.

b.       Then drag in a data flow task, which we renamed to Time Cache Transform

c.        Then double click to go into the Time Cache Transform data flow task.

d.       Now in the SSIS Toolbox, drag in the ADO NET Source

                                                               i.      NOTE: The reason we use the ADO NET Source, is because it is better at getting the correct Data Types for our Cache Transform.

e.       We then configured it with the following as shown below

                                                               i.      clip_image018

f.         If we went into the Advanced Editor for the ADO NET Source, we can verify that the Time is the data type of "database time with precision".

                                                               i.      clip_image020

g.        Now in the SSIS Toolbox, click under Other Transformations and drag in the Cache Transform

                                                               i.      clip_image006[1]

h.       Then drag the blue constraint to your Cache Transform.

i.         Then double click your Cache Transform to go into the Properties

j.         Now when it opens you will see that there is no Connection Manager. Click on New

                                                               i.      clip_image021

k.        This will then open the Cache Connection Manager Editor

l.         We then configured it as shown below:

                                                               i.      clip_image023

                                                              ii.      As you can see above we put in the name.

                                                            iii.      Then we also selected to Use file cache.

1.       Then we configured the File name with a UNC

2.       NOTE: The reason that we did this is so that if you execute this package from another location it will still find the location.

3.       NOTE II: You must ensure that the account running the SSIS job has access to the folder location and file on the server.

m.      Then click on Columns. This is where you will configure which column will be used for the actual lookup.

                                                               i.      In our example we want to do the lookup on the Time, so next to Time we will put the Index Position of 1

                                                              ii.      NOTE: You can have more than one lookup, and each additional Index Position will increment in number.

                                                            iii.      clip_image025

n.       Then click Ok to go back to the Cache Transformation Editor.

o.       Now click on the Mappings in the left hand side

p.       You should now see the following:

q.       clip_image027

r.        NOTE: From above you can see because we created the Index Position of 1 for the Time, it has got the magnifying glass. Which represents that this can be used in the Lookup later.

s.        Click Ok.

t.         You can now run this data flow task to populate your Time Cache Transform.

u.       clip_image029

3.       Now if you are using SSIS 2012 or 2014, we will convert the Date and Time Cache Connections to Project connections.

a.        This is so that later we can select them from our other SSIS Packages.

b.       Right click on each Cache Connection Manager and select Convert to Project Connection

                                                               i.      clip_image031

c.        They will now look like the following in the Connection Managers

d.       clip_image033

 

Creating your TSQL Query so that it can be used for the Lookup Transformation

In the steps below we need to ensure that we create the correct data types in our TSQL Query so that when we do the lookup it will map correctly.

 

NOTE: If the data types within SSIS are not an exact match between your TSQL Query and the Lookup Transformation it will not allow you to create the mapping.

 

1.       Below is the TSQL Example from our dbo.FactInternetSales table.

SELECT TOP 1000 [ProductKey]
      ,[OrderDateKey]
      ,[DueDateKey]
      ,[ShipDateKey]
      ,[CustomerKey]
      ,[PromotionKey]
      ,[CurrencyKey]
      ,[SalesTerritoryKey]
      ,[SalesOrderNumber]
      ,[SalesOrderLineNumber]
      ,[RevisionNumber]
      ,[OrderQuantity]
      ,[UnitPrice]
      ,[ExtendedAmount]
      ,[UnitPriceDiscountPct]
      ,[DiscountAmount]
      ,[ProductStandardCost]
      ,[TotalProductCost]
      ,[SalesAmount]
      ,[TaxAmt]
      ,[Freight]
      ,[CarrierTrackingNumber]
      ,[CustomerPONumber]
      ,[OrderDate]
      ,[DueDate]
      ,[ShipDate]
      ,[DistinctCustomers]
      ,CONVERT(date, [OrderDate]) AS DateForCache
      ,CAST(CONVERT(varchar(5), CONVERT(time(0), [OrderDate])) + ':00' AS time(0)) AS TimeForCache
  FROM [AdventureWorksDW2012].[dbo].[FactInternetSales] WITH (NOLOCK)

2.       As you can see above, we have converted our OrderDate to the Date data type.

3.       We have also converted our OrderDate to only show the time, using the data type of Time(0).

a.        NOTE: If this is not done in this way, then when the time goes past 12pm it would be displayed as 01pm instead of 13:00.

4.       Now we can use the above query in our data flow task to Insert into our fact table.
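
To sanity-check the two derived columns, the conversions can be run on their own; in this snippet GETDATE() simply stands in for [OrderDate]:

```sql
-- Sketch: check the cache key conversions in isolation.
-- GETDATE() stands in for [OrderDate] from the fact table query above.
SELECT CONVERT(date, GETDATE()) AS DateForCache,
       CAST(CONVERT(varchar(5), CONVERT(time(0), GETDATE())) + ':00' AS time(0)) AS TimeForCache;
-- TimeForCache is truncated to the minute, in 24-hour format, matching the time table.
```
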

 

Using the Lookup Transformation for inserting data into our Fact table

Below we will now use our TSQL Query, as well as our Lookups to our Date and Time Cache Transform

 

1.       Go into SSIS, and go to your SSIS Package where you want to insert your data into the Fact table.

2.       Then drag in your data flow task.

a.        If needed rename your data flow task.

3.       Now drag in an ADO NET Source.

a.        NOTE: The reason for the ADO NET Source is so that it will get the correct data types from our query.

b.       I have found in the past that the OLE DB Source at times does not find the correct data types for the Lookup Transform.

4.       We then put in the query from step 1 in the section above.

a.        NOTE: We went into the Advanced Editor to ensure that the data types for the DateForCache and TimeForCache columns are correct.

b.       clip_image035

c.        clip_image037

5.       Now we need to create our first lookup for our Date Cache Transform.

a.        In the SSIS Toolbox drag in the Lookup under the Common section

                                                               i.      clip_image039

b.       We then renamed the Lookup to Date Lookup

c.        Then drag the Blue constraint to our Date Lookup.

d.       Double click on the Date Lookup to go into the Properties

e.       On the General section we configured it with the following:

                                                               i.      clip_image041

                                                              ii.      NOTE: The reason we chose the Cache Connection Manager is so that we can use the cache file we created earlier.

f.         Then click on the Connection.

g.        If you configured your Date and Time Cache Connections as Project Connections you will see them in the Cache Connection Manager

                                                               i.      Click on the drop down and select Date Cache Connection

                                                              ii.      clip_image043

h.       Then click on Columns, and we configured it with the following:

i.         clip_image045

j.         As you can see above we have used our Input column which was DateForCache, and then mapped it to our Lookup Column called FullDate which we configured earlier.

k.        We then selected the DateKey as our Output Alias

l.         Then click Ok.

6.       Next we configured our lookup for our Time Cache Transform.

a.        In the SSIS Toolbox drag in the Lookup under the Common section

                                                               i.      clip_image039[1]

b.       We then renamed the Lookup to Time Lookup

c.        Then drag the Blue constraint to our Time Lookup.

d.       When you do this it will pop up with the Input Output selection window

                                                               i.      For the Output select the Lookup Match Output

                                                              ii.      clip_image047

                                                            iii.      Click Ok.

e.       Double click on the Time Lookup to go into the Properties

f.         On the General section we configured it with the following:

                                                               i.      clip_image041[1]

                                                              ii.      NOTE: The reason we chose the Cache Connection Manager is so that we can use the cache file we created earlier.

g.        Then click on the Connection.

h.       If you configured your Date and Time Cache Connections as Project Connections you will see them in the Cache Connection Manager

                                                               i.      Click on the drop down and select Time Cache Connection

                                                              ii.      clip_image049

i.         Then click on Columns, and we configured it with the following:

j.         clip_image051

k.        As you can see above we have used our Input column which was TimeForCache, and then mapped it to our Lookup Column called Time which we configured earlier.

l.         We then selected the TimeKey as our Output Alias

m.      Then click Ok.

7.       Now finally drag in your ADO NET Destination.

a.        Drag the blue constraint to your ADO NET Destination.

b.       When you do this it will pop up with the Input Output selection window

                                                               i.      For the Output select the Lookup Match Output

                                                              ii.      clip_image052

                                                            iii.      Click Ok.

c.        Then go into our ADO NET Destination Properties and select your destination Fact table.

d.       And then ensure that your mappings are correct.

e.       If you scroll to the bottom you should see the mappings from your Lookups above to your Fact Table

f.         clip_image054

8.       Then finally run your SSIS Package and you should see the following:

SQL Server 2014 Observations

Today’s post is not going to be about explaining something but more along the lines of my experience on SQL Server 2014.

When I heard about all the new features in SQL Server 2014, the In Memory Tables, new cardinality estimator, new changes to how TempDB works, SSIS and Delayed Durability, I was excited to see how I could use these new features for my data warehouses.

I did a full backup, restore to a Virtual Machine (VM), test the upgrade process to SQL Server 2014 to ensure that when I completed the upgrade I would not have any unforeseen issues. (As well as test my backup files and my restore document to ensure that I could restore my system from a backup)

So the first thing that I did was to test the In Memory tables, and I was excited by using the In Memory table with no durability, this would enable my staging tables to work that much faster in getting data from the staging tables into my disk based Fact Tables. I tried a few variations in terms of what Index to use and different types of tables to use. As we are all aware due to this being the first release of In Memory tables, there are some limitations. I have now come to the conclusion for my current workloads that the new In Memory tables offer no real benefit for my staging tables in my data warehouse. I find that at times it is slower than the disk based tables. (I am aware that I could well be not using them correctly, but I did quite a lot of reading and testing to ensure that I was doing what was suggested).

Next I was hoping the combination of the cardinality estimator, changes to the TempDB and Delayed durability would increase the throughput of my queries, which would lead to faster load times from my staging tables into my Fact tables. I have monitored the time my SSIS Packages have taken to run, and unfortunately I cannot see any vast improvement in my times.

What I can say is that working in the new SSDT with Visual Studio 2013 does seem a lot quicker for development, and in my experience debugging starts a lot quicker than in the past. Yes, it did take some getting used to all the new images for all the control flow items, data flow items etc., but that is a small price to pay. I do prefer the Dark Theme.

So for me in my current environment I would say that all the new changes have not made any real significant impact on my data warehouse. And have not made it any quicker to load data and get it into my Fact table.

NOTE: I welcome anyone who has any suggestions or ideas to please let me know. I am more than willing to learn, as well as to accept that I potentially have not configured it correctly.

SQL Server Integration Services (SSIS) – Using Send Mail Task for data validation checks

What I wanted to do was to be notified if there were not enough rows of data, or if the financials were too low, indicating that there was a potential issue with the data coming in.

And if it was too low, to fire off an email. If it was correct, then not to send an email, so that we would only be notified when there is an issue.

So in order to explain this I will explain it with an example below, which will make it easier to understand.

 

Example:

·         We want to compare the count of SQL Server Analysis (SSAS) Partitions for our Internet Sales Cube.

·         If the count of partitions increases then send an email.

·         If the count of partitions remains the same then do nothing.

 

You can get a copy of this here: http://msftdbprodsamples.codeplex.com/releases/view/55330

·         I used the AdventureWorksDW2012 Data File and AdventureWorks Multidimensional Models SQL Server 2012

 

You can refer to my previous blog post if you want to see how we populate the SSAS Partition Details

                PUT IN HERE

 

NOTE: We are using SQL Server 2014 and SQL Server Data Tools with Visual Studio 2013

 

 

1.       Create a new SSIS Package.

2.       Next we create the following 2 variables, which will hold our Previous and Current Partition Counts

a.        clip_image002[6]

3.       Next we will load the data into our variables using an Execute SQL Task and a SQL Server TSQL Script.

a.        NOTE: I normally would put the TSQL script into a stored procedure, but for this example I am going to put it in the package so that you can see how it got populated.

b.       From the SSIS Toolbox drag in an Execute SQL Task, then configure it with the following:

                                                               i.      As you can see we gave it a name, changed the result set to Single Row and created our connection

                                                              ii.       clip_image004[6]

c.        Then click on the Ellipses next to SQLStatement and put in the following TSQL code.

DECLARE @PreviousCountOfSSASPartitions AS int
SET @PreviousCountOfSSASPartitions =
       (
         SELECT COUNT(1)
         FROM [AdventureWorksDW2012].[dbo].[Mart_TD_SSAS_PartitionDetails] WITH (NOLOCK)
         WHERE [Parent_MeasureGroupID] = 'Fact Internet Sales 1'
         AND CONVERT(date, CreatedTimeStamp) = '2013-07-09'
       )

DECLARE @CurrentCountOfSSASPartitions AS int
SET @CurrentCountOfSSASPartitions =
       (
         SELECT COUNT(1)
         FROM [AdventureWorksDW2012].[dbo].[Mart_TD_SSAS_PartitionDetails] WITH (NOLOCK)
         WHERE [Parent_MeasureGroupID] = 'Fact Internet Sales 1'
         AND CONVERT(date, CreatedTimeStamp) >= '2013-07-09'
       )

SELECT   @PreviousCountOfSSASPartitions AS PreviousCountOfSSASPartitions
        ,@CurrentCountOfSSASPartitions AS CurrentCountOfSSASPartitions

                                                               i.      Then click Ok.

d.       Now in the left hand pane click on Result Set and configure it so that we can populate our variables as shown below:

                                                               i.      clip_image006[6]

e.       Then click Ok.

4.       Next we are going to create a Send Mail Task so that when we get an error to send out the email.

a.        We dragged in a Send Mail Task and configured it to send via our Mail Server.

b.       Along with who to send it to, as well as the details in the email.

c.        NOTE: You can add expressions to build what you want to send dynamically.

                                                               i.      If you are going to put something in the actual email message, it is easiest to compile it in Notepad and then copy and paste it. This way it will keep your content as expected.

5.       Now we are going to use the precedence constraint in order to decide whether we must send the email or not.

a.        Drag your precedence constraint from your Populate variables for Count of Partitions to our send mail task

                                                               i.      clip_image008[6]

b.       Then right click on the Success Precedence Constraint and select Edit

c.        We then configured it with the following as shown below

d.       clip_image010[6]

e.       For the Expression we configured it saying if the Previous Partition Count is less than the Current Partition Count then continue and send the email.

@[User::PreviousCountOfSSASPartitions] <  @[User::CurrentCountOfSSASPartitions]

f.         Then click Ok, and Ok again.

g.        You will now see that we have put in an expression

                                                               i.      clip_image012[6]

6.       Now when we run it and the partition counts are different, it will send the email as shown below

a.        clip_image014[6]

7.       And then if we run it when the partition counts are the same, it will NOT go to the Send Mail Task

a.        clip_image016[4]