BI-NSIGHT – Power BI Desktop (Date Hierarchy) – Power BI (Timeline Slicer Visual, Pin Excel Range to Dashboard, Power BI User Groups, Weekly Service Update, Visual Studio Application Insights Content Pack) – Microsoft BI Reporting Road Map – SQL Server 2016 SSIS Updates

With the PASS Summit completed last week, there has been a lot of information about SQL Server 2016, and the future does indeed look bright for BI within the Microsoft BI stack. I personally think that in the future Microsoft will be leading in most of the BI areas.

I am not going to go into all the details, as this has been covered in a whole host of other blogs that I follow. And I am sure that a lot of the people reading my blog have already caught up on all the amazing news.

So here we go with all the updates from the past week.

Power BI Desktop – Date Hierarchy

The above screenshot was taken from Jen Underwood, and it shows how a future version of Power BI Desktop will be able to create the Date Hierarchy for you. I am sure it will arrive in a not too distant release.

Power BI – Timeline Slicer Visual

As promised at the PASS Summit last week, here is another great custom visual available in Power BI.

I can see this being used a lot. I have used it in Excel in the past, and it allows users to slice their data by Month to Date, Year to Date, Quarter to Date, etc.

You can find all the custom Visualizations here: Power BI Custom Visualizations

Power BI – Pin Excel Range to Dashboard

Another feature that will be coming to the Power BI Service is the ability to Pin an Excel Range into your Power BI dashboard.

I think that this will be really useful, because Excel does some things really well. It can often show you a lot more information in a way that is easily digested, instead of trying to replicate it in Power BI.

Power BI – User Groups

This is another great initiative; as the momentum grows with Power BI, this will be a great way to network with like-minded people.

As well as to learn from other people, who I have no doubt will have some amazing ideas and experience to share.

Here are the details if you are interested in signing up or seeing if there are people in your area: Power BI User Groups are here!

Power BI – Weekly Service Update

There were some interesting updates this week, one of which is that you now get guidance on what you want to do in Power BI.

Individuals can now also sign up for Power BI. I do think that this is a very clever move, as there are a lot of people who might use it at work and then want to use it for their personal projects. Along with this, the general public can start using the service, which can often attract a larger crowd than the number of people who would otherwise be exposed to Power BI, and which in turn could get Power BI into a company!

And finally, there is the duplication of an existing report, which can often help when you want a report that is very similar and do not want to re-create it all from scratch!

You can find out all the details here: Power BI Weekly Service Update

Power BI – Visual Studio Application Insights Content Pack

This week's content pack is for Visual Studio Application Insights. It can give you insight into the applications that you have created and show you where you potentially have issues.

You can find out how to use the content pack and more details here: Explore your Application Insights data with Power BI

Microsoft BI Reporting Road Map

As has been blogged quite extensively, this is the first time since I started my career in BI that there has actually been a roadmap for BI from Microsoft.

I have to say it is great that we now have this visibility, because it means we can plan for what is coming and incorporate some of the new changes into our existing and yet-to-be-delivered projects. This means we will be in a position to show the people in our business something that is new and fresh.

And the way that I see it, people like to see things change. Not everyone in the business, but at times even if the charts just change slightly, or there is something additional, it can mean greater adoption. It also shows that it is not something that was developed and never looked at again!

I do feel that they are focusing a lot on SQL Server Analysis Services Tabular, and for good reason: the product is playing catch-up. It is also the engine used in Power BI, which we all really love and are using more often. I can see that we are also starting to get the best of both worlds, and by that I mean we are getting a lot of the functionality from SQL Server Analysis Services Multidimensional, as well as from TSQL, which means that we can leverage the best of both.

You can find out all the information about the BI Reporting Roadmap here: Microsoft Business Intelligence – our reporting roadmap

SQL Server 2016 – Integration Services Update

The link below covers all the updates from Wolfgang Strasser (w|t) with regards to the great features that are coming to SSIS 2016.

I am looking forward to seeing how the Package Control Flow Templates work. The way I see it, this will mean that you can create a template once and then reuse it again and again. So, for example, if you create a Package Control Flow Template for a Slowly Changing Dimension Type 2, you can then use it in your framework for all your other developers.

You can read his blog post here: SQL Server 2016 Integration Services (SSIS) – Summary of SQL Pass Summit 2015 session

The future for me

I was reading through the email from Database Weekly and I saw an article from Paul Randal about Paul being your mentor. This really got me thinking about where my future lies within Business Intelligence (BI).

I really love what I do. I get great satisfaction and enjoyment when I can tell people a story about their data. Or show them insights into their data for the first time.

I find it easy to grasp new concepts and get it all working. I work entirely within the Microsoft BI toolset.

With the new Office 365 and Power BI offerings there is a whole new world to get my head around. It really excites me.

I have also started working for a BI consulting company, and it is equally exciting times here. They are looking to expand and really get into the Microsoft BI space. That's where I come into the picture.

I feel that I am in the right place at the right time, in terms of using my existing knowledge within the Microsoft space, as well as enabling the consultancy to grow from strength to strength going forward.

This is where I feel that being mentored by Paul Randal could get me to the next level, which is where I want to go and where I will eventually get to. Being mentored by someone as knowledgeable and experienced means I can get there a little quicker!

So where do I see myself in the future?

I see myself learning and going to that next level: enabling businesses to gain insights from their data faster than ever, as well as driving the BI consultancy to become the go-to consultancy for Microsoft BI.

Relocating to Australia – BI Job opportunities in Queensland

I thought I would let you guys know that I am about to relocate to Australia from South Africa. I have had an amazing time in South Africa, and learnt a whole lot whilst working at my past employer.

So this is a plug to anyone who has any leads or potential BI work in Queensland, Australia. I would really appreciate it if you would email me with anything you have.

I will be in Australia from 08 August 2014.

Below is a link to my CV, as well as my contact details.

CV-Gilbert Quevauvilliers-2014

Thanks

Gilbert

SSIS – Automating SQL Server Backups, copying files and checking that they were copied

I had an issue where I lost all my backups. So what I have now created is an automated SQL Server backup process, which backs up all the databases, verifies the backups and copies them to the required location. It then finally emails the list of copied backup files to ensure that they are all there.


Backing up the databases

1. Below is the script, which is created on the master database.

2. This backs up all databases except TempDB to the desired location.

3. NOTE: WE ARE ALWAYS USING THE SAME FILENAMES AND OVERWRITING THE CURRENT BAK FILE, BECAUSE WE ONLY NEED THE LATEST COPY.

a. THIS ALSO SAVES SPACE FOR THE BACKUP FILES.

4. Here is the example of the script to run to create it, with some notes afterwards.

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_BackupDatabases]    Script Date: 2013-06-06 01:58:41 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 June 2013
-- Description:
-- =============================================
CREATE PROCEDURE [dbo].[prc_BackupDatabases]
AS
BEGIN

       SET NOCOUNT ON;

       DECLARE UserDatabases_CTE_Cursor CURSOR
       FOR

       -- Selecting user database names.
       select name as DatabaseName
       from sys.sysdatabases
       where ([dbid]) <> 2

       OPEN UserDatabases_CTE_Cursor
       DECLARE @dbName varchar(100);
       DECLARE @backupPath varchar(100);
       DECLARE @backupQuery varchar(500);

       -- make sure that the below path exists
       set @backupPath = 'C:\SQLBackups\'

       FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
       WHILE (@@FETCH_STATUS <> -1)
       BEGIN
              -- Backup SQL statement
              set @backupQuery = 'backup database ' + @dbName + ' to disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'' WITH COMPRESSION, INIT'

              -- Print SQL statement
              print @backupQuery

              -- Execute backup script
              --Select (@backupQuery)
              EXEC (@backupQuery)

              -- Get next database
              FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
       END

       CLOSE UserDatabases_CTE_Cursor
       DEALLOCATE UserDatabases_CTE_Cursor

       SET NOCOUNT OFF;
END

GO

a. NOTE: You will have to modify the following in the above script for it to run in your environment:

i. @backupPath

1. Change this to where you want to back up your SQL BAK files.

ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.

1. This was because we were backing up multiple BAK files from multiple servers.

b. This is what it looks like in SSIS: [screenshot]
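For completeness, the task in the package above only needs to call the stored procedure. Here is a minimal sketch of that statement, assuming the procedure was created in master as per the script above:

EXEC [master].[dbo].[prc_BackupDatabases];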

 

Verify the SQL Server Backups

1.       The next step is to then verify that the backups are consistent and can be restored.

2.       Again we created this script on the Master database

3.       Here is the create script:

USE [master]
GO

/****** Object:  StoredProcedure [dbo].[prc_VerifyDatabases]    Script Date: 2013-06-06 02:09:05 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:      Gilbertq
-- Create date: 07 Jun 2013
-- Description:
-- =============================================
CREATE PROCEDURE [dbo].[prc_VerifyDatabases]
AS
BEGIN

       SET NOCOUNT ON;

       DECLARE UserDatabases_CTE_Cursor CURSOR
       FOR

       -- Selecting user database names.
       select name as DatabaseName
       from sys.sysdatabases
       where ([dbid]) <> 2

       OPEN UserDatabases_CTE_Cursor
       DECLARE @dbName varchar(100);
       DECLARE @backupPath varchar(100);
       DECLARE @backupQuery varchar(500);

       -- make sure that the below path exists
       set @backupPath = 'C:\SQLBackups\'

       FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
       WHILE (@@FETCH_STATUS <> -1)
       BEGIN
              -- Verify SQL statement
              set @backupQuery = 'Restore VERIFYONLY from disk = ''' + @backupPath + 'SERVERNAME-' + @dbName + '.bak'' '

              -- Print SQL statement
              print @backupQuery

              -- Execute verify script
              --Select (@backupQuery)
              EXEC (@backupQuery)

              -- Get next database
              FETCH NEXT FROM UserDatabases_CTE_Cursor INTO @dbName
       END

       CLOSE UserDatabases_CTE_Cursor
       DEALLOCATE UserDatabases_CTE_Cursor

       SET NOCOUNT OFF;
END

GO

a. NOTE: You will have to modify the following in the above script for it to run in your environment:

i. @backupPath

1. Change this to where you backed up your SQL BAK files.

ii. Within the @backupQuery we have also put the SERVERNAME into the backup file name.

1. This was because we were backing up multiple BAK files from multiple servers.

b. This is what it looked like in SSIS: [screenshot]

 

Copying files to backup location on the network

1.       Here we are then copying our backup files to a network location where they are then backed up.

2.       It is a simple File System Task, where we are copying all the files from one folder to another folder.

a.        The only thing to NOTE is that we are overwriting the files when we copy them over.

3.       This is what it looks like, simple to setup and create.       

image

 

 

Getting the file list and emailing it to the required users

1. The final part is where I want to verify that the backups were backed up and copied over to the network location.

2. The first part is using a PowerShell script to get the file listing into a CSV.

a. NOTE: I created a mapped drive to our backup location in order to make the script easier.

i. And to also ensure that there are no authentication issues.

3. All the files were saved in a folder called PowerShell.

4. This is the PowerShell script that was used to get the file listing and export it to CSV:

Get-ChildItem r:\*.*  -include SERVERNAME*.* | select name,length,LastWriteTime  | Export-Csv C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.csv

a. This was saved with the following filename:

i. Filelisting_SERVERNAME.ps1

5. Next, this is how we configured SSIS to run the PowerShell script:

a. Drag in an Execute Process Task and configure it with the following:

i. We gave it the following name:

1. PowerShell to get Directory Listing for SERVERNAME Files

ii. Then click on Process on the left-hand side.

iii. This is where you actually configure how the Execute Process Task will run.

1. Where it says Executable, you have to put in the location of your executable.

2. In our example this is the location of PowerShell:

C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe

3. Next are the Arguments that you want to pass to your executable.

4. In our example we put in the PowerShell script to call:

-ExecutionPolicy ByPass -command ". 'M:\SQLBackups\Powershell\Filelisting_SERVERNAME.ps1'"

b. You can leave all the other defaults, so that it looks like the following: [screenshot]

6. The next step is to then go through the process of importing the data from the CSV file into the table called:

Staging.tb_BackupFileListing

a. NOTE: In this table we store when the file was modified, as well as when the data was imported into the table.

7. We then import this data into the fact table, so that we have the history stored, in the table called:

a. TF_BackupFileListing_Converted

8. We then export just the current backup file listing data into an Excel spreadsheet, which is saved locally in the PowerShell folder.

9. The final step is where we then attach the exported spreadsheet and email it to the required people (a rough T-SQL sketch of the staging table and the email step follows after this list).

[screenshot]

10. This is what this package looks like: [screenshot]
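If you would rather see the staging table and the email step expressed in T-SQL, below is a minimal sketch of what they could look like. The column names, mail profile, recipients and attachment path are assumptions for illustration and are not taken from the original package; in the post itself the import, export and email are done with SSIS components.

-- Hypothetical structure for the staging table that receives the CSV file listing
-- (columns assumed from the PowerShell output: Name, Length, LastWriteTime)
-- Assumes a [Staging] schema already exists.
CREATE TABLE [Staging].[tb_BackupFileListing]
(
       [Name]          varchar(255) NOT NULL,  -- backup file name
       [Length]        bigint       NULL,      -- file size in bytes
       [LastWriteTime] datetime     NULL,      -- when the BAK file was last modified
       [DateImported]  datetime     NOT NULL
              CONSTRAINT [DF_tb_BackupFileListing_DateImported] DEFAULT (GETDATE())  -- when the row was loaded
);
GO

-- Emailing the exported spreadsheet with Database Mail
-- (requires Database Mail to be configured; profile, recipients and path are placeholders)
EXEC msdb.dbo.sp_send_dbmail
       @profile_name     = 'SQLBackups',
       @recipients       = 'dba.team@example.com',
       @subject          = 'SERVERNAME - Backup file listing',
       @body             = 'Attached is the latest list of backup files copied to the network location.',
       @file_attachments = 'C:\SQLBackups\Powershell\SERVERNAME-BackupFileList.xls';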

 

SSIS & TSQL – Geo-location from IP Address to Country

Below are the details: firstly, getting your geo-location information from a table which maps IP Address ranges to countries; and then taking the IP Address information from your source systems and quickly resolving it to a country, so that you can see where the people are coming from.

 

Geo-location information

There are a lot of geo-location providers from which you can download the IP Address ranges that map to countries.

NOTE: Because IP Address ranges change over time, it is a good idea to ensure that your IP Address Range Info table gets updated at least once a month.

 

· You can download it from the following location:

o http://geolite.maxmind.com/download/geoip/database/GeoIPCountryCSV.zip

 

You can use the following to select only valid IP Addresses:

o where [IP Address] not like '%[^.0-9]%'

 

Using SSIS, tables, indexes and stored procedures to get from IP address to country as quickly as possible.

Below are the additional steps to get the data from an IP Address mapped to a country.

 

Source Tables

Below is a list of the source tables which you will require and which will be used later; a rough sketch of what they could look like follows after the list.

 

1. IP Address Range Info Table

a. This is the table that has the IP Address ranges, which map to a country (normally a CountryID).

b. NOTE: You can create this table from the download above.

c. Once you have your table, one of the most important things is to create the index below:

CREATE CLUSTERED INDEX [IXC_Start_End_IP] ON [dbo].[tb_IPAddressRangeInfo]
(
       [StartIPAddress] ASC,
       [EndIPAddress] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100, DATA_COMPRESSION = PAGE)

GO

i. NOTE: In the above index we created a FILL FACTOR of 100% and we also compressed the index.

1. Also remember to update the index when you put new data into your IP Address Range Info table.

2. Country Table

a. This is the table which has the mapping from CountryID to Country Name.
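To make the later steps easier to follow, here is a minimal sketch of what these two tables could look like. The exact column names and data types are assumptions inferred from how the tables are used further down, not the original DDL.

-- Hypothetical structure for the IP Address Range Info table
-- (the start and end of each range are stored as IP Numbers, hence BIGINT)
CREATE TABLE [dbo].[tb_IPAddressRangeInfo]
(
       [StartIPAddress] bigint NOT NULL,
       [EndIPAddress]   bigint NOT NULL,
       [CountryId]      int    NOT NULL
);
GO

-- Hypothetical structure for the Country table (maps CountryID to a country name)
CREATE TABLE [dbo].[tb_Country]
(
       [ID]          int          NOT NULL PRIMARY KEY,
       [CountryName] varchar(100) NOT NULL
);
GO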

 

Converting from IP Address to IP Number

Next, what you will need to do is convert each IP Address to an IP Number.

NOTE: This is required because the IP Address Range Info table has the IP Ranges stored as IP Numbers.
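As a quick illustration of the conversion, each octet is weighted by a power of 256 (16777216, 65536, 256, 1). Here is a one-off sketch using a made-up address, which mirrors the UPDATE in step 2 below:

-- 10.0.0.1 used purely as an example address
SELECT (CAST(10 AS bigint) * 16777216)  -- first octet
     + (0 * 65536)                      -- second octet
     + (0 * 256)                        -- third octet
     + 1                                -- fourth octet
     AS IPNumber;                       -- returns 167772161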

 

1. What we did in order to make the process of getting from an IP Address to a country quicker is to first select only the distinct IP Addresses.

a. NOTE: Here it was just a simple select. Also note what we used to only get valid IP Addresses:

Select Distinct([IP Address])

from staging_tb_ClientSourceInformation with (nolock)
where [IP Address] not like '%[^.0-9]%'
Option (Fast 10000)

b. We then put this into a table called:

i. [dbo].[staging_tb_DistinctIPAddresses]

2. The next step is that we now update our column called IPNumber.

a. We do this by using the following (which is in a proc) to calculate the IP Number:

UPDATE
-- This will be our Staging Table
       dbo.[staging_tb_DistinctIPAddresses]
SET
       [IPNumber] = (CAST(dbo.fn_Parse([IP Address], '.', 1) AS BIGINT) * 16777216) + (dbo.fn_Parse([IP Address], '.', 2) * 65536) + (dbo.fn_Parse([IP Address], '.', 3) * 256) + dbo.fn_Parse([IP Address], '.', 4)
WHERE
-- This works to ONLY get the IPAddress not like '%[^.0-9]%'
       [IP Address] not like '%[^.0-9]%'  and
       [IPNumber] IS NULL                 AND
       [CountryStateSK] IS NULL

i. NOTE: We are using the Parse function in order to convert each octet into our IP Number.

b. At the same time we are also updating our destination table with the IP Number, so that later, when we join back to get the CountryID, we have a column to match on.

i. NOTE: The query is identical to the above, but we are just changing the table we are updating.

3. Now this is the part that used to take the longest: looking at the IP Address Range Info table, finding the CountryID, and then inserting this into a new table with just the IPNumber and CountryID.

a. The table we insert into is called:

i. [dbo].[Staging_tb_DistinctIPAddressesWithCountrySK]

b. NOTE: The reason for the insert into a table is that it is quicker than doing an update when the dataset gets large.

c. NOTE 1: This is also the section where we are using the new TSQL syntax to find the CountryID from the IP Address Range Info as quickly as possible.

d. Now here is the proc where we do this:

Select C.ID as CountryStateSK, IPNumber

FROM
       dbo.[staging_tb_DistinctIPAddresses] A

LEFT OUTER JOIN
       [dbo].[tb_Country] C ON
              C.ID = (select CountryId from [dbo].[tb_IPAddressRangeInfo]
                      where StartIPAddress = (select max(StartIPAddress) from [tb_IPAddressRangeInfo] where StartIPAddress <= IPNumber)
                      and EndIPAddress >= IPNumber)

where
       A.[IP Address] not like '%[^.0-9]%'   AND
       A.[IPNumber] IS NOT NULL

i. As you can see above, the correlated sub-select (highlighted in red in the original post) is where we are getting the CountryID from our IP Number range.

e. We take the output of this data and insert it into an additional staging table.

i. As explained above, this uses the new TSQL syntax to find the CountryID from the IP Address Range Info as quickly as possible.

4. The last part is where we now use our distinct IPNumbers and CountryIDs to join back to our source table based on the IPNumber.

a. And here we then update the source table's CountryStateSK with the relevant details, as shown below:

Update dbo.Staging_tb_ClientSourceInformation
Set CountryStateSK = I.CountryStateSK
from [dbo].[Staging_tb_DistinctIPAddressesWithCountrySK] as I with (nolock)
       Inner join dbo.Staging_tb_ClientSourceInformation as S with (nolock)
              on I.IPNumber = S.IPNumber

-- The reason that this is set to 1580 is because that is the Row we inserted where we have no data
Update dbo.staging_tb_ClientSourceInformation
Set CountryStateSK = 1580
from dbo.staging_tb_ClientSourceInformation with (nolock)
where CountryStateSK is null

i. As you can see above this is a simple update statement based on the IP Number.

 

Reference to the Parse Function

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
Create FUNCTION [dbo].[fn_Parse] (@String VARCHAR(500), @Delimiter VARCHAR(10), @Position INTEGER)
RETURNS VARCHAR(500)
AS

--DECLARE @String    VARCHAR(500)
--DECLARE @Delimiter VARCHAR(10)
--DECLARE @Position  INTEGER
--     SET @Delimiter = '|'
--     SET @Position = 3
--     SET @Counter = 1

BEGIN
       DECLARE @Counter      INTEGER
       DECLARE @Return_Value VARCHAR(500)

       SET @Counter = 1

       WHILE @Counter < @Position
       BEGIN

              SET @String = RIGHT(@String, LEN(@String) - CHARINDEX(@Delimiter, @String))

              SET @Counter = @Counter + 1

       END

       SET @Counter = CHARINDEX(@Delimiter, @String)

       SET @Return_Value =
              CASE
                     WHEN @Counter > 0 THEN LEFT(@String, @Counter - 1)
                     ELSE @String
              END

       RETURN (@Return_Value)

END
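A quick sanity check once the function has been created; the expected results are shown as comments and assume the cleaned-up version above:

SELECT dbo.fn_Parse('10.0.0.1', '.', 1) AS Octet1,  -- returns '10'
       dbo.fn_Parse('10.0.0.1', '.', 4) AS Octet4;  -- returns '1'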