Refreshing data in Power BI from SQL Server Read-Only or Secondary Replicas

I have seen a lot of questions in the Power BI Community asking how to connect to a SQL Server Read-Only or Secondary replica, and whilst this is not currently available in the Sql.Database M connector, it is available in the OLE DB connector.

NOTE: This could potentially be used for a whole host of other OLE DB providers

NOTE II: This only applies to Imported datasets at the time of writing this.

In my example I am going to be using the OLE DB Connector for SQL Server Native Client 11.0

I will explain below how to create the connection, as well as how to configure it in the Power BI On-Premise Data Gateway.

Connecting to the Data in Power BI using the OLE DB Connector

  • The first thing I do is connect to the data by clicking on Get Data and then finding the OLE DB Connector
    • I then click on Connect
  • The great thing about Power BI is that after I click on Connect, there is a button to Build the connection string, so I click on Build
  • This brings up the Data Link Properties Window
    • I click on SQL Server Native Client 11.0
    • I click on Next
  • I then put in my Server Name, select Use Windows NT Integrated Security and select my database
    • NOTE: What you put in here depends on how your Gateway authenticates to SQL Server.
    • NOTE: I always put in the FQDN for the SQL Server instance.
    • I then click on Test Connection to make sure that I can connect.
  • I then click on the All tab; where it says Application Intent, it will default to READWRITE
    • NOTE: This is the IMPORTANT part where I want to make sure that it will use the Read-Only replica or secondary replica
    • I clicked on Application Intent, then clicked on Edit Value and changed the value to ReadOnly
    • I then clicked on Ok to complete the changes
  • And then I clicked on Ok again.
  • I can now see my connection string built for me
    • NOTE: If I scroll across I can see and confirm that I have set the Application Intent to ReadOnly
    • NOTE: Make a copy of the connection string above. This will be used when creating the data source in the On-Premise Data Gateway.
  • Under Advanced Options there is an option to put in a SQL statement if you want to; otherwise you can simply click Ok (see the sketch of the resulting query after this list).
  • I was then prompted for the OLE DB Provider credentials.
    • As far as I understand this is so that the credentials are stored separately from the connection string and not stored in clear text.
    • I selected Windows and then “Use my current credentials”
    • I then clicked Connect.
  • I was then prompted with the Navigator which allowed me to select the table that I wanted
    • I then selected the table that I wanted and clicked Edit
    • NOTE: I always prefer to Edit the queries in the Power Query Editor before loading them into the data model.
  • I then added it to my data model
  • And uploaded it to the Power BI Service.
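
For reference, below is a minimal sketch of what the resulting Power Query (M) query can look like in the Advanced Editor, assuming a SQL statement was supplied under Advanced Options. The server, database, and table names are placeholders, and your connection string should be the one the Build dialog generated for you (the UI may instead generate an OleDb.DataSource call with navigation steps if you picked the table from the Navigator).

let
    // Connection string built by the Data Link Properties dialog (placeholder values).
    // The important part is that it contains Application Intent=ReadOnly.
    ConnectionString = "Provider=SQLNCLI11;Data Source=MySqlServer.mydomain.com;Initial Catalog=MyDatabase;Integrated Security=SSPI;Application Intent=ReadOnly",

    // Run a native query against the read-only/secondary replica via the OLE DB provider
    Source = OleDb.Query(ConnectionString, "SELECT * FROM dbo.MyTable")
in
    Source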

Creating the Data Source in the Gateway in the Power BI Service

NOTE: You will have to ensure that you have permissions to add data sources to the Gateway in the Power BI Service.

  • I went into the Power BI Service and clicked on Manage Gateways.
  • I found my associated Gateway, then clicked on the Ellipses and selected Add Data Source
  • I then configured it with the following:
    • Data Source Name
    • Data Source Type
    • Connection String (NOTE: You can get this from your PBIX file if you did not copy it in the earlier step; see the example connection string after this list)
    • Authentication Method: I chose Windows. Yours will likely match what you used to connect in Power BI Desktop.
    • I then put in the username and password
    • Finally, under Advanced settings I set the Privacy Level to None
    • I then clicked Add
  • I then got the confirmation that it was successfully created
  • I then went into my App Workspace where I had uploaded my PBIX, then into the settings and configured it to use the Gateway.
    • I then clicked Apply.
  • I tested the refresh and it successfully refreshed
  • My final test was to sit with the DBA and confirm that when my data was being refreshed it was reading from the read-only/secondary replica, which it was.
    • I always suggest double checking to make sure that it is working as expected, and to not assume that it is working.
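
For reference, the connection string entered for the gateway data source should be the same one copied from Power BI Desktop, along these lines (the server and database names below are placeholders):

Provider=SQLNCLI11;Data Source=MySqlServer.mydomain.com;Initial Catalog=MyDatabase;Integrated Security=SSPI;Application Intent=ReadOnly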

Conclusion

In this blog post I have shown how you can now connect and import data from a read-only or secondary SQL Server replica. This has really been great for some of my customers who want to ensure that the data refresh queries will not impact their data warehouses or production systems.

Please feel free to leave any comments or suggestions, I do enjoy getting them.

Additional Note

I needed to ensure that I had the latest versions of the OLE DB Providers installed for this to work as expected.

I would suggest that if you are going to use the native OLE DB provider, you install the latest version from here: Microsoft OLE DB Driver for SQL Server

Power BI RoundUp – (Nov 2018 Desktop release – AI Capabilities for Power BI)

Well, this month has sure been a busy month with a whole host of additional updates for Power BI. I personally am really looking forward to the next few months to better understand how all of this will evolve and assist customers in leveraging their data assets.

Power BI – Nov 2018 Desktop

This was yet another massive update to Power BI Desktop, with a lot of the most requested features being released. As I do every month, below are the features which I would like to highlight in this release. If you are interested in all the details please read the entire blog post from the Microsoft Power BI team.

Reporting

As shown in the picture above, there is now the option to have expand and collapse buttons on matrix row headers. You can also do this from the right-click context menu. It is a great feature when you do not want to expand all the rows, but rather a particular area.

Another new feature is the capability to copy visuals between PBIX files. This works well if the two PBIX files share the same dataset, in which case the visual will be copied across. If the dataset does not match, you will get a warning about this.

The new filtering experience is really a great new feature. Not only can you now let users easily change the filters on the data as they require, it is also customizable in terms of the look and feel. Along with that, you can also select which filters can be changed and which filters are locked.

As you can see above, when you are on a visual you can hover over the filter icon and it will show you what is filtering the visual. That is AWESOME!

As you can see above there are now additional Hotkeys for accessibility options, which is great to see that they are expanding on these capabilities.

The above now also supports keyboard navigation, screen readers and high contrast.

Analytics

There is a new feature where you can now use colour saturation on visuals with conditional formatting, in all 3 forms: colour by scale, colour by rules and colour by field.

It applies to the following visuals:

  • All variants of column and bar charts
  • Funnel chart
  • Bubble & filled maps
  • Shape map (preview)
  • Treemap
  • Scatter chart

They have expanded the Q&A features to allow you to ask related questions in the Q&A explorer. You do this by clicking on “Ask a related question”, which allows you to keep asking questions based on your previous question.

The blog post also gives a bit more detail in terms of what additional question types you can ask.

Modeling

When you open up Power BI Desktop Nov 2018, you will see the above button on the right hand side. This new modelling experience allows you to do the following:

  • Customize and save multiple diagram layouts
  • You can see the modelling options through the field properties pane and field list.
  • Capability to apply settings to multiple columns at once.
  • And finally, the option to create display folders, which I know is something I (and a lot of other people) have been waiting for.

Composite Models are now Generally available in the Power BI Service. Amazing how quickly that went into GA!

There are 3 new DAX functions which are below.

Daniil already has a great blog post explaining how to use this new IsInScope DAX function: New DAX function: ISINSCOPE

Custom Visuals

Below are the actual visualizations for the new custom visuals that have been released. If you are looking for more details, please read the blog post from the Power BI Team (link at the end of this post).





Data Connectivity

As shown above there is now a beta connector to Azure DevOps Server to import and report on your Azure DevOps data.

There have been improvements to the PDF connector to allow you to specify a start and end page between which to look for tables in the PDF file.

The Azure Consumption Insights connector now includes Budget and Reserved Instances.

All the blog post details can be found here: Power BI Desktop November 2018 Feature Summary

Power BI – AI Capabilities

There is a private preview of some new and exciting AI features that are coming to Power BI.

The new capabilities are looking to include the following:

  • Using Azure Cognitive Services which will allow for image recognition and text analytics directly in Power BI.
  • Key Driver Analysis to better understand what drives key business metrics
  • Automated Machine learning models directly in Power BI
  • And finally, the capability to be able to bring your own Machine Learning models into your Power BI datasets.

The future of AI and Power BI sure does look bright.

You can find all the details here: Announcing new AI Capabilities for Power BI to make AI Accessible for Everyone

Power Query – A function to remove spaces within Text values

UPDATE: 19 Nov 2018

With the recent announcement of dataflows in the Power BI Service, I see more people will be looking to better understand and leverage dataflows by using the M language which is available in Power BI Desktop, Power Apps and Microsoft Flow.

I had a great response to this blog post, and both Ted and Daniil had a much easier solution to remove spaces from the data in a column. I am not sure why this originally did not work for me, but I am always happy to learn from others, as well as find an easier way to achieve the same outcome.

The solution is simple: all that you need to do is right-click your column and click Replace Values. As you can see below, I am searching for the space and replacing it with nothing (no space).
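
If you prefer to see what the UI generates in M, the Replace Values step produces a call along these lines (the column name "Column1" is a placeholder for your own column, and Source is the previous step):

// Replace every space with nothing in the selected column
ReplacedValue = Table.ReplaceValue(Source, " ", "", Replacer.ReplaceText, {"Column1"})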


BELOW IS THE ORIGINAL BLOG POST

The Power Query function below will remove any spaces that I have in a text field.

I got the original Power Query function from Ken Puls' blog post Clean WhiteSpace in PowerQuery, in which he does a great job of removing leading, trailing or multiple spaces within text.

My requirement was to remove any spaces within the text too.

I created a Blank Query in the Power Query Editor and named it fx_ReplaceSpaces

Below is the code that I actually used.

(text as text, optional char_to_trim as text) =>
let
    char = if char_to_trim = null then " " else char_to_trim,
    nbsp = if char = " " then Character.FromNumber(160) else "",
    split = Text.SplitAny(text, char & nbsp),
    removeblanks = List.Select(split, each _ <> ""),
    result = Text.Combine(removeblanks, char),
    NonBlank = Text.Replace(result, " ", "")
in
    NonBlank

And here is what it looks like with some sample data

And this is the output, where I wanted all spaces removed
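
If you want to apply the function to every value in a column of another query, an invocation along these lines will do it (the step name, column name and type are placeholders, and it assumes the column has no null values):

// Invoke fx_ReplaceSpaces against each value in Column1
CleanedColumn = Table.TransformColumns(Source, {{"Column1", each fx_ReplaceSpaces(_), type text}})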

There might be a more elegant way to achieve this, so if anyone has got any suggestions please let me know, I will be happy to test and update this blog post.

Power BI RoundUp with SQL Pass updates – Dataflows – Paginated Reports (SSRS) – Premium Metrics App with Data Flows & Paginated Metrics – Paginated Reports in Mobile App – Mobile & Service Update for Oct – Bonus features coming to Power BI Desktop

As expected this week with SQL Pass, there has been a whole host of new capabilities and features in Power BI, so please see my roundup below with the relevant links.

And even though there are a lot of announcements here, we are still waiting for the update to Power BI Desktop!!

Power BI Dataflows

One of the biggest updates and releases to Power BI is dataflows. I personally think that this is a MASSIVE differentiator from other players in the Self-Service space.

The first thing I would like to highlight is that this is available to both Power BI Pro and Premium Customers. Below is a table of the key differences.

Dataflows allow you to prepare and store data purely inside the Power BI Service, with the data stored in Azure Data Lake Storage Gen2.

There are so many advantages to this:

  • Not only does it allow the business user to easily access data and use it in their data model as a data source (like SQL Server)
  • It also allows for the linking of entities (Premium feature)
  • It allows for Incremental Refreshing (Premium Feature)
  • Linking of entities which works like Excel (Premium Feature)

I would highly recommend that you read the blog post here: Introducing: Power BI data prep with dataflows

And here is the link to the Whitepaper: Understanding Dataflows in Power BI

Power BI Paginated Reports (SSRS)

There is now the capability to use SSRS paginated reports in the cloud. This is really great news and I am certain that there are a lot of customers who have been patiently waiting for SSRS to be available in the cloud.

It is now available in Power BI Premium. Out of the box there are already a lot of features, and some additional features will come later. No doubt it is a very different set of requirements to build something that will work in the cloud vs an On-Premise implementation.

Some of the potential new capabilities which will be coming in the future to Paginated reports are:

  • Support for scheduling and sending paginated reports using e-mail subscriptions and having attachments
  • Embedding in third-party applications
  • Authoring reports against Power BI datasets
  • The ability to seamlessly drill through from a Power BI report to a paginated report in the service

All the details can be found in the blog post: Public Preview of Paginated Reports in Power BI Premium Now Available

Power BI Premium Metrics App includes metrics for Dataflows & Paginated Reports

It is incredible to see the pace of innovation from Microsoft and the Power BI team. Even though both dataflows and paginated reports have just been released you can now already use the Power BI Premium metrics app to see metrics for dataflows and paginated reports.

You can now see the following metrics for Paginated reports

  • Total views of the report with average for row count
  • Total time spent on the reports between retrieval/processing/rendering
  • Split by hour, report name, and workspace name

The following is also available for data flows

  • Total refresh count
  • Refresh reliability
  • Average/max duration and average/max wait times of dataflow refreshes by dataflow name and workspace name

You can find more details on the Power BI Premium Metrics App here: Paginated report metrics and dataflow metrics now available in Premium Capacity Metrics app

Power BI Mobile App with support for Paginated Reports

If you update your Power BI Mobile App, you will now be able to see paginated reports directly within your Power BI mobile app.

Once again it is incredible to see all the new functionality released across the entire platform, not only to the Power BI Service, but to the mobile app also.

It is available in the Android, iOS and Windows Apps

More details can be found here: Power BI Paginated Reports also available in Power BI Mobile Apps (preview)

Power BI Mobile and Service updates for Oct 2018

A lot of the updates for the Power BI Mobile and Service app have already been covered in other blog posts.

Below is a list of the updates with the relevant links

DAX Measure – Getting Difference between 2 values in a table

I had a requirement where the customer wanted the difference between 2 values in a table.

The challenge here was that because the values are already part of the table, I had to find a way to get the unique value for each item. This would then allow me to calculate each value separately and then calculate the difference.

This is what the final output looked like, and below is the DAX measure that I created to get the difference.

As my good friend Matt always says, break it down into smaller parts to solve the puzzle. In this instance I knew that I needed to have two measures, where one would be filtered to the first year and the second one would be filtered to the last year.

Difference between Years = 
VAR AllYears =
    CALCULATE (
        CONCATENATEX (
            VALUES ( 'Table1'[Year] ),
            'Table1'[Year] ,
            ","
        )
    )
VAR FirstYear =
    RIGHT ( AllYears, 4 )
VAR LastYear =
    LEFT ( AllYears, 4 )
VAR FirstYearAmount =
    CALCULATE (
        [Sales],
        'Table1'[Year] =  FirstYear 
    )
VAR LastYearAmount =
    CALCULATE (
        [Sales],
        'Table1'[Year] = LastYear 
    )
RETURN    FirstYearAmount - LastYearAmount

Here is the explanation of my measure

  • Lines 2 – 9
    • I am getting a concatenated list of all the years that are selected separated by the comma
    • EG: For 2017 and 2018 it returns the two years separated by a comma, i.e. “2017,2018”.
  • Lines 10 – 11
    • This is where I am getting the first year from my concatenated list of years
  • Lines 12 – 13
    • This is where I am getting the last year from my concatenated list of years
  • Lines 14 – 18
    • This is where I am creating my measure which will be for the first-year amount
  • Lines 19 – 23
    • This is where I am creating my measure which will be for the last-year amount
  • Line 24
    • This allowed me now to subtract one measure from the other.
    • And return the final result

This will currently work; if multiple years are selected, it will always use the first and last year from the entire selection.

As always, I hope that you enjoyed this blog, and if you got another way to get the same result I would be interested in hearing about it in the comments below.

Power BI RoundUp – Webinar on Advanced Data Prep – Data Flows in Power BI – On-Premise Data Gateway Update – New Power BI Regions in Azure – General Availability of Custom & Certified Data Connectors

Here is my weekly Power BI related round up. I hope that you find this information relevant and that it keeps you up to date with everything that is happening with regards to Power BI.

Power BI – Webinar on Advanced Data Prep with Dataflows

https://community.powerbi.com/t5/Webinars-and-Video-Gallery/10-23-18-Webinar-Advanced-data-prep-with-Power-BI-data-flows/m-p/534100 

There is an upcoming Webinar on 13 November 2018, where Anton Fritz will introduce you to what Power BI Data Flows are and how they can be used.

The Webinar details can be found here: With Power BI dataflows for unified data and powerful insights

Blog – Dataflows in Power BI

If you are interested in Power BI Dataflows, Matthew Roche from the Power BI team has a series of 6 blog posts where he goes into a lot more details on data flows, what they are and how they can be used.

I have read all of them and I would recommend reading through them if you are interested in how you can leverage Power BI dataflows going forward, as well as hit the ground running.

Blog post details here: Dataflows in Power BI

Power BI – On-Premise Data Gateway Update for Nov

There is an update for the On-Premise Data Gateway for November, where they have updated the Gateway with the following:

  • Enhancements to the Vertica and OData connectors
  • Public preview for the SAP HANA & BW Single Sign On (Kerberos) Support
  • Improved Diagnostics
  • October 2018 Mashup Engine

You can find all the details here: On-premises data gateway October update is now available

Power BI – New Power BI Regions

It was announced this week that you can now get Power BI in 3 new Azure Regions which are:

  • Central India
  • Australia East
  • Central US (Iowa)

You can find more details here: New Power BI regions in India, Australia, and the United States

Power BI – General Availability of Custom & Certified Data Connectors

Custom & Certified data connectors are now generally available in the Power BI Service.

This will now allow you to not only connect to any data source, but also to refresh the data via the On-Premise Data Gateway.

You can find more details here: Announcing General Availability of Custom and Certified Connectors for Power BI

Running Power BI Reports or Dashboards in Fullscreen without the Buttons

I had seen some questions in the Power BI Community asking how to run reports or dashboards in a browser so that you cannot see the buttons on the bottom left hand side.

In my example I am going to use my existing dashboard, which I want to display without showing anything from Power BI, nor anything from the browser that I am using.

The first thing that I do is navigate to my Dashboard and then put the following at the end of the URL:

?chromeless=true

EG:
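
The URL will end up looking something like the following (the workspace and dashboard GUIDs are placeholders; use the URL of your own dashboard or report):

https://app.powerbi.com/groups/00000000-0000-0000-0000-000000000000/dashboards/11111111-1111-1111-1111-111111111111?chromeless=true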

This then resulted in what is shown below, without any of the Power BI items showing.

My final step to get rid of the browser details was to press F11.

As you can now see above it is just showing the Power BI Report.

And as long as this is on a big screen, it will not show any other buttons because there is no mouse movement or interaction.

Conclusion

I do hope that you found this blog post useful and if you have any other ways to achieve the same thing or got any comments please leave it in the section below.

Quick Tips for Aggregations & Composite Models in Power BI

I have been working with a customer on the implementation of Aggregations with Composite Models in Power BI and I have found that there are a few useful tips to assist you with getting it up and running.

Tip 1 – Always ensure that queries start off as DirectQuery

When starting to bring in the composite models the first thing to do is to ensure that all your queries start off as DirectQuery.

The biggest reason is that you can convert a table from DirectQuery to Import, but unfortunately you cannot do it the other way around (from Import to DirectQuery)

Another thing to make a note of is that if you go back into the DirectQuery table and click on Source, it will show that it is set to Import; make sure you select DirectQuery again.

Change it back to DirectQuery as shown below

Finally, you can confirm that your table storage mode is set to DirectQuery by right clicking on the table and then going into the Field Properties.

Then view the Storage mode drop-down and ensure that it is set to DirectQuery.

Tip 2 – Adding a Count of Rows in an Aggregated Table

I watched this awesome video How to use the Power BI Aggregations Feature by Adam Saxton (@GuyInaCube) and Christian Wade (@_christianWade), where they explain how to create Aggregations in Power BI. I highly recommend you watch it, as it gives some great insights on how to get it working.

One of the nuggets of pure GOLD was to add in a count of table rows in your aggregated table.

The reason is that it can then be used in other measures that might not be defined as an aggregation type in your aggregation table. The example that they use is where you are looking for an average, which in the internal engine is a SUM divided by the COUNT.

Tip 3 – Distinct Counts on an Aggregated Table

Another great tip from the video above is that if you want to be able to do distinct counts, you will need to specify a GroupBy for the column where your DISTINCTCOUNT measure will be applied.

If I recall correctly, Christian did mention that a distinct count is only efficient up to 3 – 5 million values. It also depends on the internal engine as to where it will source the distinct count from, be that the aggregated table or the DirectQuery table.

Tip 4 – Where possible use SQL Server Views or Similar in your DirectQuery Source

I have found that when connecting using DirectQuery directly to the table there is no option to remove or alter the columns that are returned for the entire table.

The reason is that there might be some columns that are not needed in the DirectQuery result set. Not only that, but the developer of the view can abstract a lot of the detail into the underlying view, which makes the DirectQuery return the data faster and only with the required information.

In some instances, depending on what steps are being applied, the query could also change from DirectQuery to Import mode.

Along with this, by using T-SQL queries there is also the possibility of using indexed views, which could assist DirectQuery performance.

Tip 5 – How to check if my Aggregated Table is being used

Currently DAX Studio does not support this, but once again it was shown in a preview build (in the video link above) that it will be coming to DAX Studio in the near future. Until then this is how you would check to see if your aggregated table is being used.

Open DAX Studio and connect to your current PBIX file. Once it is open on the bottom left it will have the Port Number as shown below.

Next open up SQL Server Management Studio; if you do not have it installed you can download the latest version here

Then select Tools and SQL Server Profiler

Then select the Server Type of Analysis Services.

Where it says Server Name, put in the details you got above from DAX Studio

Then click Connect

This will then bring up the Trace Properties window. Click on the Events Selection at the top

Then near the bottom right hand side click on Show all events

Then scroll down to the section where it says Query Processing and select the following “Aggregate Table Rewrite Query”

Then click Run.

Now as per the Microsoft documentation you can test and run some queries, and what you are looking for is where the “Matching Result” returns a “matchFound” against your Aggregated table. As in the example below it has found a match on the mapping table called “Sales Agg”

I have tested this out watching all the queries as they run, and I do not know how they do it, but it decides almost instantly which table to use.

Conclusion

I hope that you have found these Aggregation and Composite Model tips helpful.

If anyone has got any other tips, please share them in the comments below, and I will happily update my blog post.

Power BI RoundUp – (Desktop Update Oct 2018 – Aggregations for Petabytes Scale – Webinar Ask Anything on Power BI Premium – Power BI Report Server Timing)

I am sure that there were a few people who were eagerly waiting for the Power BI Desktop Update for October 2018. I sure was one of them and it was well worth the wait. Please read all the details below, as well as other updates that happened in the past week.

Power BI – Desktop Update Oct 2018

There has been yet another massive update to the Power BI Desktop release for October 2018, and as I do every month I will highlight what I feel is most relevant.

Reporting

There is now the ability to be able to search within a filter for items. This is especially useful when there are a lot of items in your filter.

Every month there are more accessibility features that are being built into Power BI. This to me shows that Microsoft and the Power BI team are serious about not only helping people with accessibility challenges to view reports, but also allow these same people to create incredible Power BI reports.

As I showed in the initial image there has been a big performance improvement for ArcGIS maps. They respond now almost instantly and zoom into the area that you selected, which makes the reports that much faster. And who does not like near instant response times?

Modelling

Another BIG improvement has been to the DAX editor. Now there are line numbers and indent lines, it shows where you are in your code, and if there is an error the bar will highlight it in red.

Along with this there are a whole host of Shortcut keys (which I still must learn and memorize) which makes the editing of DAX measures that much easier and quicker.

Here are the shortcut key combinations and what they do:

  • Alt + ↑ / ↓ – Move line up/down
  • Shift + Alt + ↓ / ↑ – Copy line up/down
  • Ctrl + Enter – Insert line below
  • Ctrl + Shift + Enter – Insert line above
  • Ctrl + Shift + \ – Jump to matching bracket
  • Ctrl + ] / [ – Indent/outdent line
  • Alt + Click – Insert cursor
  • Ctrl + I – Select current line
  • Ctrl + Shift + L – Select all occurrences of current selection
  • Ctrl + F2 – Select all occurrences of current word

Analytics

Even though I do not have a picture, I wish I did, because having the Aggregations and Composite Models in the Power BI Service is honestly a BIG BIG Deal. I have been looking into this and it is a total game changer for when you do not need or want to load ALL the data into a data model for those exception reporting cases (When one or very few users require more granular data).

Not only that but it makes the performance of the Power BI reports so much faster when visualising the data over a large dataset. I cannot wait to implement this.

They are building more smarts into Power BI Desktop, where it can now explain the increase for non-additive measures. This could provide some insights that before would have taken quite a while to figure out yourself. Not only that, but if the visual is what you want to show, you can then add it to your report.

Custom Visuals

I am not going to cover all the custom visuals, suffice to say that in the blog post there are a lot of additional custom visuals.

Data Connectivity

There are a whole host of Data Connectors that are now generally available below:

  • Web by Example
  • SAP BW Connector V2
  • SAP BW Message Connector
  • Vertica Connector
  • Dynamics NAV
  • Dynamics 365 Business Central

And there is a new connector for Dynamics 365 Business Central On-Premises

Data Preparation

Wow, this is what I would consider a 3rd massive update to Power BI Desktop in one month, where you can now profile your data in the Power Query Editor.

This is something that I know a lot of people have been asking for, and it does provide a quick way to understand what your data looks like, how many distinct values etc. It also ensures that your data is as you expect. Not only that but you can also easily shape your data based on the data profiling within the charts that are displayed.

There is now an option to use fuzzy matching when comparing columns in a merge, where you can set some additional options:

  • Set the threshold of how high or low you want to try and match
  • Ignoring case or spaces
  • Maximum number of matches to match on.
  • Transformation table: this is where you can supply a table of translations between what the business might call an item and what it is called in the data source, even where the values might not otherwise appear to be related.

One thing to note for the fuzzy matching is that you will need to install the October version of the On-Premise Data Gateway for it to refresh in the Power BI Service.

Other

There are now additional export data options where you can set how the data can be exported or even not at all.

One thing which Amanda did call out in her video is that the Tenant Settings will always be enforced. This means that if the tenant setting says no data can be exported, then no matter what you set in the above image, it will not give you the option in the Power BI Service.

If your organization has moved from TLS 1.0 to TLS 1.2 and the above registry keys have been updated, Power BI Desktop will respect those settings.

Here is this month's MASSIVE blog post: Power BI Desktop October 2018 Feature Summary

Power BI – Aggregations for Petabyte Scale

This is a great blog post by Christian Wade from the Power BI team where he discusses the high-level overview of how Aggregations can enable your organization to leverage their petabyte scale data for blazing fast reporting performance.

You can find the blog post details here: Aggregations for petabyte-scale BI available in the Power BI service

Power BI – Webinar Ask Anything on Power BI Premium

There will be a webinar on 18 Oct 2018 (PST), where Chuck will host Josh Caplan from the Microsoft Power BI team, and you will be able to ask him anything relating to Power BI Premium.

With the recent updates to Power BI Premium and the usage reporting that is now available I am certain that there will be a lot of interesting questions, as well as insights into Power BI Premium. This is one Webinar that I want to attend, and if I cannot attend I will watch it later.

Here are more details: Webinar 10/18 Group Program Manager, Josh Caplan, and the Power BI team host an Ask Anything around managing and monitoring Power BI Premium

Power BI – Power BI Report Server Timing

It is great to see that the Power BI Report Server team have now announced a dedicated release schedule for Power BI Report Server.

This will now be Jan, May and Sept each year.

I know that typically Power BI Report Server is installed in organizations who have a standard way of working and upgrading or updating systems. By having a dedicated release schedule these can be planned for accordingly.

All the details are here: Power BI Report Server release timing update

Power BI Gateway – How to remove a Gateway Instance from the Gateway Cluster

I recently was troubleshooting a performance issue for a customer where their LiveConnection was slow when connecting to an On-Premise SSAS Tabular server via the On-Premise Data Gateway.

One of the things that I had to do was to remove an existing On-Premise Data Gateway from a cluster, so that I could then move the Gateway onto another server.

NOTE: Even if you uninstall the On-Premise Data Gateway from a server it will still exist in the Cluster Configuration, so I had to ensure that I had it completely removed.

Whilst I was going through this I found it a bit of a challenge to complete the steps, which inspired me to write this blog post. I do hope that in the future this functionality could be part of the On-Premise Data Gateway itself.

I completed all the steps below using PowerShell, and whilst PowerShell might appear to be difficult, if you follow the steps below it should be straightforward (hopefully my blog instructions are easy enough to follow!)

Running PowerShell Scripts

The first thing that I did was to run the PowerShell scripts on the server where I had already installed the On-Premise Data Gateway. This is because the PowerShell scripts are part of the installation.

I opened PowerShell ISE and ran it as Administrator

Importing the module

In PowerShell I then used the code below to navigate to the Data Gateway’s folder

cd 'C:\Program Files\On-premises data gateway'

I then imported the module as shown below.

Import-Module .\OnPremisesDataGatewayHAMgmt.psm1

Logging into the Power BI Service

The next step I had to complete was to log into the Power BI Service, because this then enabled me to get the details which are part of my tenant.

Login-OnPremisesDataGateway -EmailAddress gilbert@mydomain.com

Getting all the Gateway Clusters

Next, I wanted to get a list of all the Gateway Clusters which are within my tenant.

NOTE: In a large organization there might be a lot of Gateway Clusters, so it might be better to export the data into a file if this is the case.

I then ran the following PowerShell script below to show me all the Gateway Clusters.

Get-OnPremisesDataGatewayClusters

And this was the output

Getting the Member Gateways inside a Specific Cluster

What I now recommend doing is to first find all the members of a specific Gateway cluster.

This ensures not only that I have the correct Gateway Cluster ID, but also that I am only getting the members of the gateway from which I want to remove a member.

As per my example above, I put the following into PowerShell

Get-OnPremisesDataClusterGateways -ClusterObjectId 9c9697b7-XXXX-XXXX-XXXX-b481de215945

The output was a list of the members that were part of my gateway as shown below.

I could then see above that I had 2 members in my Gateway Cluster.

The one thing that I did note is that the Primary Instance in the Gateway Cluster has isAnchorGateway = True

I wanted to remove the second member in the Gateway, so I made a note of the gatewayObjectId

Removing Member from Gateway Cluster

I then ran the following code in PowerShell below to remove the member from the Gateway Cluster

Remove-OnPremisesDataGateway -ClusterObjectId 9c9697b7-XXXX-XXXX-XXXX-b481de215945 -GatewayObjectID 3ed46c5d-XXXX-XXXX-XXXX-eb13b6c676d

Once the above was completed it came back to the PowerShell prompt.

Confirmation that I had removed the member from the Gateway Cluster

The final step I did was to confirm that I had removed the member from the Gateway Cluster.

I re-ran the PowerShell script which shows the members of the Gateway Cluster

Get-OnPremisesDataClusterGateways -ClusterObjectId 9c9697b7-XXXX-XXXX-XXXX-b481de215945

And now I got the output where only the primary member of the cluster remained

Conclusion

In the steps above I have shown how to remove a member from an instance of an On-Premise Data Gateway Cluster.

What I then did with my customer was to move the Gateway over to a new dedicated virtual machine, by installing the On-Premise Data Gateway software on the new server and then migrating the existing On-Premise Data Gateway to it.

I always enjoy getting comments and suggestions from my blog posts, so please feel free to comment to suggest something.