Converting Multiple Items into One Row in Power Query

I was working with some external data that contained a list of Australian postal codes. The challenge was that multiple suburbs can belong to a single postal code.

Below I am going to show you how I took the multiple suburb names and put them into a single row, giving me one postal code per row. That in turn allowed me to join this table to my data model using a relationship.

This is what the table looked like before the transformation 


This is what the table will look like once I have completed the steps below 


How I did it  

  • I first selected my column called postcode, and then in the Transform ribbon clicked on Group By.
  • I then configured the Group By as shown below.
    • What I am doing here is grouping by the postcode, and then putting the remaining rows into a table column by using the "All Rows" operation.
    • This resulted in a single row per postcode.
  • I then went into the Add Column ribbon and clicked on Custom Column.
  • I then put the following code into my Custom Column, named "Suburbs".

  • NOTE: Even though I cannot see the column called [locality] any more in the Available columns, I can still use it in my Custom Column.
  • This resulted in the following in my table
  • I then clicked on Expand on the Suburbs column and selected “Extract Values”
  • I then selected the delimiter to be a comma for the list of values
  • I could then see my new Suburbs column with the extracted values being comma delimited
  • I then removed my unwanted column called AllData and renamed the column headers to get my final result.
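The GUI generates M behind the scenes; below is a minimal, hedged sketch of what the generated steps can look like, using made-up sample data and assuming the column names postcode, locality and AllData from the steps above:

```m
let
    // Made-up sample rows standing in for the real postal code data
    Source = Table.FromRecords({
        [postcode = "2000", locality = "Sydney"],
        [postcode = "2000", locality = "The Rocks"],
        [postcode = "2007", locality = "Ultimo"]
    }),
    // Group By postcode, keeping "All Rows" in a table column
    Grouped = Table.Group(Source, {"postcode"}, {{"AllData", each _, type table}}),
    // Custom Column: pull the locality list out of the nested table
    AddedSuburbs = Table.AddColumn(Grouped, "Suburbs", each [AllData][locality]),
    // "Extract Values" with a comma delimiter
    Extracted = Table.TransformColumns(AddedSuburbs,
        {"Suburbs", each Text.Combine(_, ", "), type text}),
    // Remove the unwanted AllData column
    Removed = Table.RemoveColumns(Extracted, {"AllData"})
in
    Removed
```

This produces one row per postcode, with the suburbs comma-delimited in the Suburbs column.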

Conclusion 

I have shown once again how versatile Power Query is, and that very often things can be done via the GUI, which makes shaping your data intuitive and easy.

If there are any questions or comments, please leave them in the section below.

Thanks once again for reading and I hope that this will be useful to you at some point!  

Multiple conditions for a conditional column in Power Query

I am often working on datasets where there is more than one condition for a conditional column.

And whilst the GUI based Conditional column is really good, it currently does not have the capability for multiple conditions.

In this blog post below, I will demonstrate how to achieve this.

In my example below, I have a table that has got Bike Brands and Types.

I now want to create a rating based on both the Brand and Type.

I do this by creating a Custom Column

The way the multiple conditions work is based on the following pattern:

if [Column Name1] = "Condition" and [Column Name 2] = "Condition" then "Result"

else if [Column Name1] = "Condition2" and [Column Name 2] = "Condition2" then "Result2"

else if [Column Name1] = "Condition3" and [Column Name 2] = "Condition3" then "Result3"

else "Unknown Result"

This is shown with my working example below.
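Since the working example was originally a screenshot, here is a hedged sketch of what such a Custom Column can look like (the brand, type and rating values are made up):

```m
// Hypothetical Brand/Type values — adjust to your own data:
if [Brand] = "Giant" and [Type] = "Mountain" then "Excellent"
else if [Brand] = "Giant" and [Type] = "Road" then "Good"
else if [Brand] = "Trek" and [Type] = "Mountain" then "Average"
else "Unknown Result"
```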

Which results in my table now having a new column called Rating which has the multiple conditions for my conditional column.

Another thing to note is that I could also do a range between two values which is essentially multiple conditions for a conditional column.

What I first had to do was change the Amount column from Text to Whole Number so that my conditions would work, after which I used the following conditions.

Which resulted in getting the banding that I was after.
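The banding conditions were originally shown as an image; a hedged sketch of the range pattern looks like this (the band boundaries are assumptions):

```m
// Each band is a pair of conditions on the same column:
if [Amount] >= 0 and [Amount] <= 100 then "0 to 100"
else if [Amount] > 100 and [Amount] <= 500 then "101 to 500"
else if [Amount] > 500 then "Over 500"
else "Unknown"
```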

And finally, (the last one I promise!), is where I can use the flexibility within Power Query to convert the Amount value on the fly from a Text value to a Number value for my conditional column.

This will allow me to keep my column in my table as a text value.
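A sketch of this on-the-fly conversion uses Number.From inside the condition, so the Amount column itself stays text (the band boundary here is made up):

```m
// Number.From converts the text value just for the comparison;
// the stored column type is unchanged.
if Number.From([Amount]) > 500 then "Over 500" else "500 or Under"
```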

Conclusion

I hope that you have found this blog post useful and as always if you have any suggestions or comments please leave them in the section below.

Thanks for reading!

Power Query – A function to remove spaces within Text values

UPDATE: 19 Nov 2018

With the recent announcement of dataflows in the Power BI Service, I see more people will be looking to better understand and leverage dataflows by using the M language which is available in Power BI Desktop, Power Apps and Microsoft Flow.

I had a great response to this blog post, and both Ted and Daniil had a much easier solution to remove spaces from the data in a column. I am not sure why this did not originally work for me, but I am always happy to learn from others and to find an easier way to achieve the same outcome.

The solution is simple: all you need to do is right-click your column and click Replace Values. As you can see below, I am searching for a space and replacing it with nothing.
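In M, the generated Replace Values step is equivalent to the following (the step and column names here are placeholders):

```m
// Replaces every space with nothing in the chosen column.
// #"Changed Type" and "MyColumn" are hypothetical names — use your own.
= Table.ReplaceValue(#"Changed Type", " ", "", Replacer.ReplaceText, {"MyColumn"})
```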


BELOW IS THE ORIGINAL BLOG POST

The Power Query function below will remove any spaces that I have in a text field.

I got the original Power Query function from Ken Puls' blog post Clean WhiteSpace in PowerQuery, in which he does a great job of removing leading, trailing or multiple spaces within text.

My requirement was to remove any spaces within the text too.

I created a Blank Query in the Power Query Editor and named it fx_ReplaceSpaces

Below is the code that I actually used.
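The original code was embedded as an image; a minimal, hedged reconstruction of what such a function can look like is below (assuming it trims and then removes all remaining spaces):

```m
// fx_ReplaceSpaces — hedged sketch, not the author's original code:
// trims leading/trailing whitespace, then removes every space within the text.
(InputText as text) as text =>
    Text.Replace(Text.Trim(InputText), " ", "")
```

Invoking it on " some text " would return "sometext".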

And here is what it looks like with some sample data

And this is the output, where I wanted all spaces removed

There might be a more elegant way to achieve this, so if anyone has got any suggestions please let me know, I will be happy to test and update this blog post.

Power BI InfoGraphic Update – Sep 2018

It has been a few months since I updated my Power BI InfoGraphic, and with the speed at which everything changes it is always a challenge to keep it up to date.

The following items have been added with links to more details:

Please find below the latest version as at September 2018. I always also keep the latest version of the InfoGraphic here: Power BI InfoGraphic Overview

You are welcome to share this infographic with anyone, as long as the author is given credit for the creation.

If there is anything missing or needs to be updated, please leave it in the comments section below.

BI-RoundUp – Power BI (Sep Update for Power BI Desktop – Dashboard Comments – Community Contribution to PowerShell Cmdlets – Power BI PowerShell Cmdlets available in Azure Cloud Shell)

Welcome to another week of my Power BI Round Up. And as expected there is another massive Power BI Desktop Update.

Power BI – September Update for Power BI Desktop

As always, I am going to go over the new features and highlight what I think is really relevant in terms of what has been released.

Reporting

With the added support for categorical fields on the X-Axis this now makes the scatter charts more flexible.


As shown above there now is the capability to be able to copy a single value, which will copy the value in its unformatted form, which could be used to put into a search of another application.

Whilst when using the Copy selection feature, it will copy everything that you have highlighted in your table or matrix. And when pasting this, it will keep the column headers as well as the formatting options which you can then paste into Excel or another program. This is really a great piece of functionality which makes it easier when certain data needs to be copied into another system.

There are now some built-in report themes, which are great for people who want to move away from the default colours but do not want to create their own custom themes.

Report Page Tooltips are now generally available meaning that you will not have to enable it as a preview feature.

Along with that, as shown above, report page tooltips are now available for Cards.

There have also been additional accessibility improvements in the formatting pane, making it easier to navigate, along with more screen reader support.

Analytics

This is a really big, BIG one: the ability to have aggregated tables in your data model. What this means is that you can have a highly aggregated table of your data which might answer 60 – 70% of the queries. When those queries are run they will use the data in the aggregated table, which in turn results in exceptionally fast query response times. If a query does not hit the aggregated table, it will then go down to the source table, which could be imported or using DirectQuery mode. This means the entire data model can be very small but very fast.

There is a lot more to this, so I suggest reading up the documentation that is included in the blog post. But this is something that I have been waiting for and cannot wait to try it out and get it working.

Finally, there is now support for RLS with Q&A. Previously, if a dataset had Row Level Security (RLS) applied, you could not use Q&A. This has now been fixed and Q&A is available, which is awesome.

Data Connectivity

As shown above there is a new connector where you can connect to PDF files and it will attempt to extract table data from your PDF files.

There is now support for SAP BW Connector measure properties.

There is a new connector called dataflows, soon to be in limited preview, which will allow users to connect to an existing dataflow.

Data Preparation

As per my first image, and above, there is now Intellisense when going into the Advanced Editor in the Power Query Editor. This is so awesome. There have been so many times in the past when I have not known which M function to use. As well as having syntax errors, and a lot of that pain is now gone with having Intellisense.

Add column from examples now supports text padding, as shown above: you can pad the text with zeros where you want all values to have the same length.
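Under the covers this padding maps to the M function Text.PadStart, for example:

```m
// Pad "42" on the left with zeros until it is 5 characters long:
Text.PadStart("42", 5, "0")  // "00042"
```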

It is interesting to see that right at the end of the blog post they highlighted that they are working on being able to copy visuals between PBIX files. How cool is that??

You can find all the blog post details here: Power BI Desktop September 2018 Feature Summary

Power BI – Dashboard Comments

This was announced at the Microsoft Business Applications Summit and it is great to see that you now can add comments to Power BI Dashboards, as well as individual tiles on the dashboard.

It is also great that you can tag people in the comments, which will then send an email or a push notification to the Power BI Android and iOS mobile apps.

At the end of the blog post, they also indicated that this will be coming to the reports, so stay tuned.

You can find all the details here: Announcing Dashboard Comments in Power BI: Your new hub to discuss data and collaborate with others

Power BI – Community Contribution to the PowerShell Cmdlets

This is the first contribution from the Power BI community to the existing Power BI PowerShell management cmdlets, where Kenichiro Nakamura has added functionality to work with datasets, tables and columns.

You can find a lot more detail in the blog post: Celebrating our first community contribution to the Power BI management cmdlets

Power BI – PowerShell Cmdlets available in Azure Cloud Shell

What this means is that you can now use the Azure Cloud Shell to run the Power BI PowerShell cmdlets. It also means there is no need to install and configure the required PowerShell modules, because this is all maintained in the Azure Cloud Shell.

I would suggest looking into the storage options; depending on how much storage you need, if you have an existing Azure subscription this could be a very good option.

You can find more details on how the Azure Cloud Shell works, as well as the blog post details here: Power BI Management cmdlets in Azure Cloud Shell

What runs under the cover when I open Power BI Desktop

I am always interested in what happens under the covers when I open Power BI Desktop. So I did a little digging, inspired by Marco Russo's webinar and his presentation at the Queensland Power BI User Group on "My Power BI report is slow: what should I do?"

What happens when I click on Power BI Desktop.exe?

This is the list of associated processes that are opened when Power BI Desktop is started, which I will explain what they are below.

This is what it looks like for me when I am running Windows 10 and I have a look in the Task Manager under Processes

  • CefSharp.BrowserSubprocess
    • Power BI reports ultimately run in the Power BI Service, which is essentially a website, and all the visuals are rendered in a browser.
    • My understanding is that within Power BI Desktop this simulates how the report will run in the Power BI Service.
    • As per Marco's webinar, if this consumes a lot of memory or CPU it is potentially why your Power BI report is slow.
  • Console Windows Host
    • UPDATE (06 Sep 2018) – I got a reply from Amanda Cofsky from the Microsoft Power BI team, who said that the Console Windows Host is the "Analysis Services Engine Console Output", which is generally used by the Microsoft engineers for debugging purposes.
  • Microsoft Mashup Evaluation Container
    • This is the Power Query Engine.
    • This is responsible for processing all the steps in the Power Query Editor, which gets data from my sources, transforms it and then loads it into my data model.
    • When I look at a server where I have the On-Premise Data Gateway installed, I see a lot of instances of the Microsoft Mashup Evaluation Container running. This is because this is where my data gets loaded and transformed into tables before being sent to the Power BI Service.
  • PBIDesktop.exe
    • This is the executable which is the starting point and container for all the processes that are run within Power BI Desktop.
  • Microsoft SQL Server Analysis Services (msmdsrv.exe)
    • This is where all the magic happens: it is an analytical data engine which leverages in-memory technology to achieve incredible compression using the X-Velocity Engine, and blazing fast query response times by loading all the data into memory.
    • This is where all the data gets loaded from Power Query into the data model.
    • This process can have the highest memory usage.
    • If I have an expensive DAX measure which must get most of its data from the storage engine, I will see an increase in memory and CPU utilization while the DAX measure is evaluated. Which, once again as per Marco's webinar, is a great indicator as to why my Power BI report is slow.

I hope that this has given you some insight into what runs under the covers in Power BI Desktop, and that there are quite a few moving parts working together seamlessly to make the report creation and development experience fast when developing Power BI reports.

As always if there are any questions or you have more details and insights into the details above, please let me know and I will happily update the details in this blog post.

Thanks for reading!

Review of new Features coming to Power BI by Oct 2018 – From Business Applications Summit

I was fortunate to attend my first ever Business Applications Summit in Seattle.

I had the pleasure of sharing an Airbnb with Matt Allington, Phil Seamark and Miguel Escobar, it was great to spend time with these Power BI Legends. I also did meet a lot of people and got to chat with people in the Microsoft Power BI team, which was something I will always remember.

I also was fortunate to present 2 sessions at the Business Applications Summit (unfortunately Reza and Leila could not make it); both sessions went well, and the feedback was positive.

Ok, so enough about me, let me get onto ALL the new features that are planned to come to Power BI by October 2018.

One thing I can say is that I am SUPER excited about the newly proposed features; they are most definitely going to make Power BI the go-to BI tool going forward.

Along with this it is also growing up, and by that I mean more enterprise features are coming, which means that it will soon be suitable for anything from a smaller organisation to a Fortune 500 company.

Personally, I cannot wait to learn all the new features and start to implement them at customers. One caveat is I am certain that some features might take a bit longer to get into the service or could possibly change.

NOTE: This might be a bit of a longer post, so buckle up, here is the link to where I got a lot of the information: Overview of Business Intelligence October ’18 release

Other Features not mentioned in any of the notes

Below are some of the other features that I did not find in the release notes, but for which there were demos or pictures.

Print to PDF

As you can see above, coming to Power BI will be the ability to print to PDF which will look exactly like your Power BI report.

Display Folders – Multiple selection settings

As you can see above, there is the ability to set multiple measures or columns into a Display Folder.

Not only that but you will also be able to complete the settings over multiple columns.

And something not in this picture is the capability to see multiple data source views if you have hundreds of tables to make the data modelling experience easier.

Python Support

As you can see above there is Python Support coming to Power BI!

Personal Bookmarks

As shown above, there will be personal bookmarks coming to Power BI

Power BI Desktop

Below are all the Power BI Desktop upcoming features

Ad-Hoc Data Exploration

What this means is that a user who does not have edit access to a report will be able to look at a chart on a different axis, or change the chart type to something more meaningful to them.

It will be accessed via an option on the report to select "Explore from here".

Aggregations

This is a big change for Power BI for when the underlying data is a really large dataset stored in Spark or a SQL Server database. When connecting with DirectQuery you will be able to define aggregations, which will cache just the aggregated data in memory in your model.

This allows a dual mode: if a query can be answered by the aggregated cache, that will be used; if not, it will query the underlying DirectQuery source.

Composite Models (Available Now)

What Composite Models allow you to do is to have data in one Power BI Desktop file where you are getting data from both DirectQuery and imported data sources.

This is an amazing feature and I know something that a lot of people have been asking for.

With this you also now by default will have all relationships set to Many:Many.

As always, it is suggested to ensure that your DirectQuery source has been tuned and has the capacity to answer the queries from your users, so that the users get a super-fast reporting experience.

Currently the Composite Models do not support LiveConnection sources which only relates to SQL Server Analysis Services (SSAS) Multi-dimensional or Tabular.

Copy data from table and matrix visuals

Coming to Power BI Desktop and the Power BI Service, once implemented, will be the ability to copy data from a matrix or table into another application.

Custom Fonts

You will soon be able to use any fonts that you want in your Power BI reports. The only requirement is that the same font is installed on the viewer's computer in order for them to see it; if they do not have it installed, it will fall back to the default font.

Expand / Collapse in Matrix Visual

As shown below you can see the upcoming option to expand or collapse rows in a matrix visual. There also was the indication that they want to bring more pivot table features from Excel to Power BI.

Expression-based formatting

Expression-based formatting is where, by using DAX, you will be able to format almost anything in your Power BI report. The potential is to use expression-based formatting for the items below, and possibly more that I cannot think of:

  • Titles of Visuals
  • Line widths of visuals
  • KPIs based colours

From what I did see, there will be an fx button next to almost everything in the visual properties and elsewhere.

Q&A Telemetry Pipeline

This will allow access to the Q&A telemetry to see what the users are using Q&A for, which will allow you to further customize your Q&A Linguistic settings. The data will first be scrubbed for PII data.

Dashboard and Report wallpapers

Coming to both Power BI Dashboards and reports will be the ability to use wallpapers to cover the grey area behind your reports.

Show measures as clickable URLS

As you can see above, you will be able to create a clickable link with a measure, so that it can dynamically link from a Power BI report to any other application which you can access via a URL.

Theming over report level and visual container styles

There will be a theming update coming to both report-level and visual-container styles in Power BI. From what I understood, it will be similar to the theming that you can currently do in PowerPoint.

New Power Query Capabilities

There are a whole host of Power Query Updates as detailed below.

Intellisense support for the M formula language

Intellisense will be coming to the advanced editor in Power Query. This is something that I know a lot of people have been asking for. Not only that, but Power Query will also be coming to Microsoft Flow.

Smart Data Prep

There is smart data preparation coming to Power Query, with the following initial features below.

  • Data extraction from semi-structured sources like PDF files.
    • This is something a lot of people have been asking for. I have seen it in action and it is awesome: it can take data out of tables in a PDF and extract it into Power Query.
  • HTML pages.
    • A smarter experience to understand what details you want from an HTML Page.
  • Fuzzy-matching algorithms to rationalize and normalize data based on similarity patterns.
    • This is where it will try to match data based on your columns, to guess what a value should be when, say, the data is misspelled.
  • Data profiling capabilities.
    • As you can see from the above image, there will be data profiling which will enable you to have a look and see if the data is as expected.
    • An example is if you know that your Customer Number should only be 5 characters long, with the data profiling you will be able to see if it is meeting your criteria.

Power Query Community Website

As with Power BI, there will be a Power Query Website launching later this year.

Certified Custom Connectors

There will be certified custom connectors which will be available to be plugged into Power Query. As part of this process the custom connectors will be certified. Currently there are no additional details on how this process will be completed, but no doubt it will be explained as time goes on.

Power BI Service

Below are all the updates to the Power BI Service.

Power BI Home

Power BI Home is a new place for users to start their Power BI Journey with the following sections.

  • The top section will contain the user's most important dashboards and reports.
  • The second section will contain their favourite items, as well as most frequently accessed reports and dashboards.
  • Whilst the bottom section will include learning resources.
  • And on the top right-hand side will be a global search where you will be able to search for any item that you have access to in the Power BI Service.

Paginated / SQL Server Reporting Services Reports

As you can see above SSRS or Paginated reports will be coming to the Power BI Service. Not only that but it will also print them out pixel perfect.

Workspaces with Azure AD groups

As you can see above, App Workspaces will use Azure AD groups and will no longer be dependent on Office 365 groups. You will still be able to add permissions from Office 365 groups, but there will no longer be a dependency on Office 365.

Data Flows – (Formerly called CDS-A, then Data Pools, now Data Flows)

It started out being called CDS-A, then Data Pools, and now the final name according to the great people in the Microsoft Power BI Team is Data Flows. As previously described this is where you will be able to use Power Query Online to ingest data from anywhere and store them in Entities.

All the data will be stored in an Azure Data Lake Gen2 (with a Pro license getting 10GB per user, and Premium getting 100TB per P1 node), which gives you the additional capability of letting the data scientists in your organisation access the data directly from the Azure Data Lake. You will also be able to bring your own storage within your existing Azure investments.

I think that having Data Flows will enable organisations to have a single source of truth for their data assets, which can then be leveraged by the entire organisation.

What I do know from attending Miguel Llopis' session is that the same runtime that runs in Power BI Desktop will be running in Power Query Online in the Data Flows. That means you can use Power Query within Power BI Desktop to get the data into the shape that you want, then go into the Advanced Editor and copy and paste the query into Power Query Online.

Below is a rough overview of what it looks like

Data Flows Refresh

Not only will you be able to bring data in with data flows, you will also be able to refresh data with incremental refresh, which will be a Power BI Premium feature.

Admin APIs

Admin APIs have come to the Power BI Service, which will allow a tenant admin to discover all the artefacts in their Power BI tenant.

This is great for large implementations, because up until now you had to have access to an App Workspace in order to view its contents. Now, as an Admin, you will see everything, which is a standard admin capability.

Additional report URL parameter support

Additional report URL parameters will include filters for Date columns, the new operators <, >, <= and >=, and multiple field values.

Commenting in Dashboards and Reports

As shown above there will be the ability to comment on Dashboards and reports.

Not only that, but you can mention people using @, which will then send them a notification.

Along with this you can also add comments to a specific visual, which will give great context to comments

Dataset metadata translations

If you have defined translations in the dataset or Analysis Services model, users will see them in their locale.

Filters for Report Sharing

You will be able to share reports to users with the current filters and slicers in place for when they view the report.

Historical Capacity Metrics

If you have Power BI Premium capacity, there will be a historical view which will allow you to see what affects the performance of refreshes and queries, and which datasets consume the most memory, so you can make changes or plan accordingly.

Multi-Geo for Compliance

This is for where data must reside in a certain country while a company operates around the globe. It will ensure that the data can be located in any of the Azure data centres, even though the initial Power BI tenant might be located elsewhere.

Not only that but in a future release this can also be used for performance by having the Power BI Premium located closer to the users.

Azure Analysis Services / Analysis Services Tabular

Below are some of the new features coming to SSAS in Azure or On-Premise in Tabular

XMLA endpoint for third-party client connectivity and lifecycle management

By opening the XMLA endpoint for Power BI, any current tool that can connect to SSAS via XMLA will also work with Power BI Desktop, which means that some of the following tools would work immediately once the XMLA endpoint is opened.

  • SQL Server Management Studio
  • BISM
  • Excel

Not only that but you will also be able to use the TOM and TMSL in order to manage and modify configuration settings or items within your Power BI Model.

Application Lifecycle Management

As you can see above, there will be the capability to have full Application Lifecycle Management (ALM) in Power BI Desktop and Azure Analysis Services. This is a great step forward because it gives you the following capabilities:

  • Source Control
  • Deployment of specific items
  • Deployments to Dev, Test and Prod
  • Potential to share parts of the data model.

Analysis Services vNext

As shown above this was Christian Wade from the Analysis Services team showing the following potential features coming to Power BI and Analysis Services.

  • Calculation Groups
  • Many-Many relationships
  • Resource Governance
    • Is Available in MDX
    • Query Memory Limitations
    • Rowset Serialization Limits
  • Data Connectivity
    • New Power Query Enterprise Data sources for Spark, Amazon RedShift, IBM DB2, Google BigQuery and Azure KustoDB

On-Premise Data Gateway Updates


There have been a lot of advancements in the On-Premise Data Gateway, with the latest release including Support for Custom Data Connectors in the Enterprise version of the On-Premise Data Gateway.

Below is a list of features coming to the On-Premise Data Gateway

  • Gateway multi-geo support for Power BI Premium
    • With the release of multi Geo Support for Power BI, this will also be enabled in the On-Premise Data Gateway
  • Guarantee high availability of gateways via clustering
    • Better support and visibility for Gateways in a Cluster
  • Improved support for Single Sign On and SAML
  • Improved data source settings experience
    • Here there will be the ability to skip the testing of the connection.
    • Rename data sources.
    • Create multiple data sources with different credentials
  • Tenant level administration of on-premises data gateway
    • This will allow the ability for tenant administrators to manage All On-Premise Data Gateways via the API or GUI
  • Basic traffic load balancing in the on-premises data gateway
    • This will start off with a basic setting to split the traffic requests between Gateways.

Using the Power Query Function.InvokeAfter to determine how long to wait between API calls

I have been working on a dataset which I will hopefully reveal soon, but part of that was that I was getting rate limited when making an API call.

I found Chris Webb’s insightful blog post (Using Function.InvokeAfter() In Power Query) where he details how to use the Function.InvokeAfter. The one key piece that I personally found missing was how to use this with an existing function that I had created.

I then got another fantastic question from the Power BI Community where they were looking to do an IP Address Lookup. And there are a lot of sites who offer this, but they do limit the rate at which you can query the API (Which I think is perfectly understandable considering they are offering it for free!)

My blog post shows how I ensured that I did not exceed the rate limit for the API using the Function.InvokeAfter

NOTE: I am not going to cover how I converted the IP Address to a location, I have done this previously in (Power BI Query Editor – Getting IP Address Details from IP Address)

Using the Custom Function

I am starting off where I have already created the Custom Function in Power Query Editor. I also have got a table with IP Addresses.

  • I have used a sample file in which I made up the IP Addresses as shown below.
  • I then went into the Add Column in the Ribbon and clicked on Invoke Custom Function
  • This brings up the Invoke Custom Function window and I put in the following information as shown below.
    • As you can see from above, I gave my new column a name of Details
    • I then clicked the drop down and selected the function I created earlier called fx_GetIPAddressDetails
    • And then finally the crucial part is where I selected my IP Address column.
    • I then clicked Ok.
  • When you do this it returns a table as shown below.
  • Click on the Expand Table button on the top right hand side, which will then prompt you for which columns you want to select
  • I then selected Country
  • And below were my results.

Adding in the Function.InvokeAfter to limit the rate at which I query the API

I am now going to modify the step where I Invoke the Custom Function to limit how long it waits between API calls.

  • I created a new Parameter called "Interval (Secs)" as shown below.
  • I then went back to my query and clicked on the step called "Invoked Custom Function".
  • In the formula bar I then had the following:

= Table.AddColumn(#"Changed Type", "fx_GetIPAddressDetails", each fx_GetIPAddressDetails([IP Address]))

  • Next, I made the following changes to the above code using Function.InvokeAfter:

    = Table.AddColumn(#"Changed Type", "fx_GetIPAddressDetails", each Function.InvokeAfter(()=>fx_GetIPAddressDetails([IP Address]), #duration(0,0,0,#"Interval (Secs)")))

  • I added the Function.InvokeAfter(()=> before the call to my function fx_GetIPAddressDetails.
  • I then put in the #duration(0,0,0,#"Interval (Secs)")) at the end.
    • Within the #duration I used my Parameter called #"Interval (Secs)".
    • This gave me the flexibility to change the rate limit timing without having to go into the code.
  • Now when I refresh the data, it waits the number of seconds set in the parameter (2 in my example) between each API call.
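As a side note, a parameter created through Manage Parameters is just an ordinary query whose value carries metadata. The generated M behind a parameter like "Interval (Secs)" looks roughly like this (the exact meta fields are produced by the editor, so treat this as a sketch):

```
// The parameter query "Interval (Secs)" is simply a value tagged with parameter metadata
2 meta [IsParameterQuery = true, Type = "Number", IsParameterQueryRequired = true]
```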

I hope that this has been useful and an easier way to ensure that you can limit how quickly you call an API.

As always please leave any comments in the area below.

Here is a copy of the PBIX file that I used in the blog post above: FourMoo – Loading IP Addresses with an Interval.pbix

Power Query Pattern – Adding Spaces in Text within your data with Camel Case

In this week’s blog post, I share a Power Query pattern that I created to add spaces to CamelCase text within a column.

To get this to work for you, all you need to do is make one change to the code.

Below is what the data looked like

Then I created the following Power Query pattern.

  • To use this pattern, below are the changes that you will need to make for it to work in your Power Query Editor.
  • Line 1
    • This is my step name
  • Line 2
    • This is where I am using the Table.TransformColumns
    • NOTE: Even though the name suggests it only transforms columns, it also transforms the data within a column.
  • Line 3
    • This is referring to the previous step name, which is returning the table contents
  • Line 4
    • This is my column name where I want to add in the spaces. As with my image above the column was named “Operation”
    • NOTE: This is the only part of the pattern that needs to be changed.
  • Line 5
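A sketch of what the five lines described above might look like, assuming the previous step is named #"Changed Type" and the column is named "Operation" (both assumptions based on the surrounding text):

```
#"Added Spaces" =
    Table.TransformColumns(
        #"Changed Type",
        {{"Operation",
          each Text.Combine(Splitter.SplitTextByCharacterTransition({"a".."z"}, {"A".."Z"})(_), " "), type text}})
```

Splitter.SplitTextByCharacterTransition splits the text wherever a lowercase letter is followed by an uppercase one, and Text.Combine joins the pieces back together with a space.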

  • To get this to work, I opened my Power Query Editor window.
  • I then clicked on Advanced Editor in the Home ribbon.
  • I then added this step into my code as shown below.
  • I then clicked on Done.
  • I went back to my column, and I could now see the data with the spaces added between each capitalized word.

Conclusion

As you can see, I have demonstrated how powerful Power Query is (note the double use of Power!) at getting your data into the shape that you require.

If you have any suggestions or comments please let me know.

Power Query – Renaming Multiple Columns

I was working on a dataset where I wanted to change multiple column names in one step rather than changing them manually. Since there were over 30 columns, renaming them one by one would have been really time consuming.

Below I detail how to complete this in the Power Query Editor, which will replace all the column names for me in one step.

This once again shows how powerful the Power Query Editor can be for ETL and automation tasks!

  • This is what the Original Column names looked like
  • To get this working, I had to go into the Advanced Editor and manually add a step.
  • Here is the syntax that I added, with my step name:

    #"Rename Column Names" = Table.TransformColumnNames(#"Changed Type", (columnName as text) as text => Text.Replace(columnName, "Sales – ", ""))

  • As you can see above my previous step name was called #”Changed Type”
  • Then the only other change I had to make was to the Text.Replace: what I was searching for, and what I wanted to replace it with.
    • In this example I searched for "Sales – " and replaced it with "" (which is blank).
  • And this is what it looked like after manually adding in the above step.
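To see the whole thing in one self-contained query, here is a sketch with a made-up source table (the column names and values are hypothetical):

```
let
    // Hypothetical table with prefixed column names
    Source = #table(
        {"Sales – Amount", "Sales – Quantity"},
        {{100, 2}, {250, 5}}),
    // Strip the "Sales – " prefix from every column name in one step
    #"Rename Column Names" = Table.TransformColumnNames(
        Source,
        (columnName as text) as text => Text.Replace(columnName, "Sales – ", ""))
in
    // The columns are now named "Amount" and "Quantity"
    #"Rename Column Names"
```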

As I have shown above, this is a quick and easy way to rename multiple columns at once.

You can find the sample file here: Renaming Multiple Columns.pbix

As with every blog post if there are any comments or suggestions please leave them in the section below.