API Strategies with Power BI

API (Application Programming Interface). You have surely heard the term from your enterprise architects telling you that all data will be accessed through “the API” and you will no longer pull it straight from the source. They will build all the interfaces for you, and you won’t need that source connectivity anymore.


Enterprise architects are really smart (I used to be one), ha. For all of the wisdom we share as architects, the confession is that making data accessible to our business analysts was never our top priority… All of the design documentation we wrote started with “end user experience” and had a lot of content around “3-tier architecture”. Reporting and business intelligence usually ended up as the last few paragraphs, after a few dozen sections on how the application would work and interface with all the back-end systems.

I am writing this to give the perspective that, more often than not, data connectivity is not at the forefront of application design. I do believe that is starting to change as data becomes a financial currency for most organizations, but nonetheless, these “pesky” APIs still stand between you and the data you need to access.

So, you have been given a nice long URL that starts with “https://” to access the data you need. Now what?

In this article, I will show how to connect to your APIs through Power BI in a few different ways. I will start with the OData protocol, then quickly dive deep into APIs that are secured by the popular OAuth protocol, and end with how well Power BI traverses the nested JSON documents that get returned.

Connecting to OData Services

The API strategy best suited to business intelligence tools is the OData protocol. When your organization’s APIs are presented as an OData feed, there is an implied standard that Power BI can understand. OData is itself a REST-based standard built on top of HTTP, so a compliant feed describes its own entities and can be queried with simple URL conventions.

There is a public OData feed for the Northwind sample data set here:
http://services.odata.org/northwind/northwind.svc/

To connect to OData feeds, simply go to “Get Data” -> “OData Feed” and enter the URL.

Get Data ODATA

As this is a standard format for APIs, the benefit is that you are presented with a table list from the service, just as you would be if you were connecting directly to a database.

ODATA Table Navigator

 

Most APIs accept additional request parameters that can be added to retrieve more specific data. The URL below connects to the same service as above but returns only a single record:
http://services.odata.org/Northwind/Northwind.svc/Orders(10643)
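
OData also defines system query options such as $filter, $select, and $top that push filtering down to the service itself. Here is a minimal sketch in M (the filter values are just examples against the public Northwind feed):

let
 // $filter and $top are OData system query options evaluated on the server
 url = "http://services.odata.org/Northwind/Northwind.svc/Orders?$filter=ShipCountry eq 'Germany'&$top=10",
 Source = OData.Feed(url)
in
 Source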

ODATA to table

You have just successfully connected to your first API. From this point forward, OData behaves the same way as other data sources, and you can start shaping data in the “Edit Queries” window.

If you can influence your organization to adopt this standard for APIs, your life will be much easier when consuming them. However, the purpose of this blog site is to help you use technologies such as Power BI in an enterprise scenario, and it is most likely that your APIs are not OData compliant… So let’s get on with the FUN stuff.

Non-OData APIs

After you ask your architect whether the API you are trying to connect to is OData compliant and she tells you NO, the next alternative is to use the Web data source.

connect to web

When connecting to a web URL for your internal (or external) API, it is likely going to be secured and will have a prefix of “https” instead of “http”. In our OData example above, I did not connect to a secured service, but had I, the dialog below would have been presented.

connect to web 2

There are 5 different authentication methods that are quite easy to connect with. Internal APIs will often use Basic or Windows credentials. Someone will have to provide you a user/password combination for Basic authentication, or, if Windows credentials are used, verify that your own user/password combination has been granted access.

If you are connecting with one of these five authentication types, it is pretty straightforward, and you can skip ahead to the end of this post where I show how to traverse the JSON documents that will likely be returned as a result.

Connecting to OAuth Services

Many organizations are moving towards the OAuth authentication protocol. Notice in the screenshot above that this protocol is not natively supported in Power BI. OAuth adds an extra layer of complexity because it involves two services: one service is called with an API key and an API secret and returns a token, which is then sent with all subsequent requests to the resource URL.
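
Conceptually, the exchange for the “client credentials” flow used later in this post looks like this (a simplified sketch; exact header names and response fields vary by provider):

POST /oauth2/token HTTP/1.1
Host: api.twitter.com
Authorization: Basic <base64 of APIKey:APISecret>
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials

The response carries the token, which every subsequent request to the resource URL presents in its Authorization header:

{ "token_type": "bearer", "access_token": "<token>" }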

We will get into the weeds rather quickly in the rest of this blog post, so before the half of you that will stop reading at this point do so, please go to the Power BI Ideas site and VOTE for the OAuth protocol to be added as a first-class authentication provider. Here is the link: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/13195278-oauth

Also, the content below is available in a Power BI Template file here. Read on, but feel free to download the template to save a lot of the grunt work.

Setting up a Twitter App for our OAuth API Example

First of all, Twitter connectivity via Power Query has already been blogged about a few times. The most useful post I found was by Chris Koester here. The script I am using later in this article is nearly identical to what I learned from his post. I will talk through the necessary script, but also focus on how you can traverse the JSON data that gets returned from most modern APIs you will encounter.

Twitter uses OAuth2 for authentication. This will require you to set up a free developer account if you have not already.

Go to https://dev.twitter.com and sign in with your Twitter account. You will then need to create a Twitter application, which will give us the necessary API Key and API Secret to authenticate with the Twitter Search API.

In the new Twitter UI, the applications section is a little hard to find. At the bottom of https://dev.twitter.com are some links; on the right-hand side you will see “Manage Your Apps”.

Twitter UI

You will see a “Create New App” button that will take you through a dialog to set up your application.

Callback URL Issue with OAuth-Based APIs

In the application creation dialog, you will notice a Callback URL entry that we will leave blank for this example.

callback url issue

In a traditional OAuth flow that has a UI associated with it, this is the URL the flow returns user interaction to after the user agrees to allow the application to act on their behalf. This blog is not meant to go into that detail, but you can review this link for more information.

Because with OAuth we have to get an authentication token from the first service we call, there is an issue with some APIs: they only pass this token back to the Callback URL. Power BI cannot read a response returned to a callback URL, so if the API you are using requires that “authentication flow”, it can’t be used with Power BI (well, that is not entirely true, but to keep this blog from being 50 pages long, we will say it can’t). For instance, the LinkedIn API requires this flow, so it is not something we can easily connect to from Power BI.

As long as the token is returned in the response from the first authentication service call (as the Twitter API does), we will be able to integrate it with Power BI.

Connecting to the Twitter Search API

Once you have entered the data to create an application, you will land on your “App details” screen. Navigate to the “Keys and Access Tokens” tab and copy the API Key and Secret so they can be used later.

Note: These should be treated like passwords and not made public.

twitter access tokens

Back in Power BI, go back to the Web data source and type in an arbitrary web URL, as we will be replacing all of this with a script. I used http://twitter.com and was presented with the navigator dialog to select a table from that page. Click “Edit” to open the Query Editor.

connect to arbitrary table

This is a good time to take advantage of a new feature added in the April Power BI Desktop update: Parameters. For deeper insight on this feature, check out the Power BI blog entry.

manage parameters

Create 4 new parameters as follows:
API Key: <<your API Key>>
API Secret: <<your API Secret>>
Token URL: https://api.twitter.com/oauth2/token
Search URL: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100

Notice the Search URL has a couple of query string parameters: in my example we will be searching on the word “Analytics” and returning up to 100 tweets.

Once the parameters are entered, you will see them in the queries list with their values, along with the arbitrary table you pulled in from the Web data source.

query editor parameters

 

Advanced Editor

Now that we have our parameters in place, navigate back to the table query and click on “Advanced Editor”.

advanced editor

Once in the advanced editor, replace the script with the following:

/*
This M script gets a bearer token and performs a tweet search against the Twitter REST API
https://dev.twitter.com/oauth/application-only
 
Requires establishing a Twitter application in order to obtain a Consumer Key & Consumer Secret
https://apps.twitter.com/
 
IMPORTANT - The Consumer Key and Consumer Secret should be treated as passwords and not distributed
*/
 
let
 // Concatenates the API Key & API Secret and converts to base64 (BinaryEncoding.Base64 is the named form of 0)
 authKey = "Basic " & Binary.ToText(Text.ToBinary(#"API Key" & ":" & #"API Secret"), BinaryEncoding.Base64),
 // Uses the Token URL parameter rather than a hard-coded URL
 url = #"Token URL",
 // Uses the Twitter POST oauth2/token method to obtain a bearer token
 GetJson = Web.Contents(url,
     [
         Headers = [#"Authorization"=authKey,
                    #"Content-Type"="application/x-www-form-urlencoded;charset=UTF-8"],
         Content = Text.ToBinary("grant_type=client_credentials") 
     ]
 ),
 FormatAsJson = Json.Document(GetJson),
 // Gets token from the Json response
 AccessToken = FormatAsJson[access_token],
 AccessTokenHeader = "bearer " & AccessToken,
 // Uses the Twitter GET search/tweets method, passing the bearer token from the previous POST oauth2/token call
 // and the Search URL parameter rather than a hard-coded URL
 GetJsonQuery = Web.Contents(#"Search URL",
     [
         Headers = [#"Authorization"=AccessTokenHeader]
     ]
 ),
 FormatAsJsonQuery = Json.Document(GetJsonQuery)
in
 FormatAsJsonQuery

In the above code, there are two calls to the Web.Contents function. The first calls the token service to get a token, which is then used in the second service call to actually get results.

OAuth is a preferred authentication method because the token returned from the first service is only valid for a finite period of time. So if you were to manually generate this token and then just call the search service with it, the token would no longer be valid the next time you tried to refresh this data.

That is why the script above is so powerful: it handles the token generation so that you have a repeatable, automated process. Kudos to @cjkoester for putting this script together.

A couple of other things to note in the script. Our parameters are referenced with the #"…" syntax (for example, #"API Key"), which is how M refers to identifiers that contain spaces. Also, not all OAuth services will follow the exact signature above, but this can be used as an example, and ideally your development team can tell you what changes you need to make for the pattern to work.

I renamed the query to “TwitterSearch”. The result should now look similar to this.

query editor twitter search start

 

Working with the JSON response

JSON (JavaScript Object Notation) is the most popular format for transmitting data in modern APIs. XML is also still widely used, and if your APIs return XML, many of these same steps will likely apply.

I am really impressed with the JSON parser built into the query editor of Power BI. For instance, the JSON returned by the Twitter API we just connected to looks like this:

JSON via Fiddler

I can count 6 levels of nesting in the small portion of the JSON response I am showing above (via Fiddler). The Power BI Query Editor handles this with no issues.
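
Under the hood, each click in the query editor simply navigates records and lists in M. Here is a minimal sketch (the JSON literal is a simplified, hypothetical fragment shaped like the Twitter response):

let
 // Simplified, hypothetical fragment shaped like a Twitter search response
 raw = "{ ""statuses"": [ { ""text"": ""hello"", ""entities"": { ""hashtags"": [ { ""text"": ""Analytics"" } ] } } ] }",
 parsed = Json.Document(raw), // a record with a statuses field
 statuses = parsed[statuses], // a list of records
 firstTag = statuses{0}[entities][hashtags]{0}[text] // drill through nested records and lists
in
 firstTag // returns "Analytics"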

We will create two separate queries from this response, one for the statuses and the other for the hashtags.

Taking the TwitterSearch result, the two top-level entities are “statuses” and “search_metadata”. Click the “List” value for “statuses” and this will expand the record set for “statuses”.

query editor to table

Notice the “To Table” option at the top; by changing this list to a table, we will be able to expand the records into all their values. Click “OK” in the To Table dialog and you will see the magic expand button pop up in the column header.

query editor expand button

Click the button and then uncheck “Use original column name as prefix” to avoid “Column1” showing up as a prefix for every column created.

query editor choose columns 1

Click Ok.
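
For reference, the M that the UI generates behind these clicks looks roughly like this (the expanded field names are illustrative; the real response contains many more):

let
 // The "statuses" list from the TwitterSearch query
 statuses = TwitterSearch[statuses],
 // "To Table" wraps the list of records in a one-column table
 AsTable = Table.FromList(statuses, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
 // The magic expand button becomes a Table.ExpandRecordColumn step
 Expanded = Table.ExpandRecordColumn(AsTable, "Column1", {"created_at", "text", "user", "entities"})
in
 Expanded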

Congratulations! You now have the last 100 statuses for the search term “Analytics” that we used in the Search URL parameter above: https://api.twitter.com/1.1/search/tweets.json?q=Analytics&count=100

Let’s use this same result to create an additional query for hashtags. Simply right-click the “TwitterSearch” query and select “Duplicate”.

query editor duplicate

Rename the query “Hashtags”.

Find the “entities” column and click the magic expand button again.

query editor expand entities

In the column expansion dialog, simply click “OK” again to see all the results of the nested JSON document.

query editor expand hashtags

Click the magic expand button again on “hashtags”.

This will bring in the null values and the “List” links will become “Record” links.

Finally click the magic expand button one last time to be greeted by the hashtag records we wanted.

query editor hashtags expanded

Right-click “text.1” and rename it “hashtag”.
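
Behind the scenes, the equivalent generated M looks roughly like this (written self-contained for clarity, starting again from the TwitterSearch query; column names are illustrative):

let
 statuses = TwitterSearch[statuses],
 AsTable = Table.FromList(statuses, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
 WithEntities = Table.ExpandRecordColumn(AsTable, "Column1", {"text", "entities"}),
 // entities is a record holding a hashtags list; expand it one level at a time
 WithHashtagList = Table.ExpandRecordColumn(WithEntities, "entities", {"hashtags"}),
 HashtagRows = Table.ExpandListColumn(WithHashtagList, "hashtags"),
 // Finally pull the hashtag text out of each record, renaming it "hashtag"
 Hashtags = Table.ExpandRecordColumn(HashtagRows, "hashtags", {"text"}, {"hashtag"})
in
 Hashtags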

Now we can use the “Choose Columns” dialog to select only a few of the columns, reducing our query to just the information important to the hashtags.

query editor choose columns

From here you can “Close and Apply” your Query Editor window, and you will have two queries: one for the statuses of your Twitter search and the other for the hashtags that you can cross-filter against.

A simple viz may look like this:

viz

Power BI Template

Another new feature in the April update of Power BI Desktop is the Power BI Template file. I have saved all of the above work in a template file (created by clicking the Export option from the File menu) that I am sharing.

template screen shot

This retains all of the work that has been completed above but doesn’t retain any of the data. So when you open this template file, it will prompt you for the parameters that were discussed above.

Here is the template file link again: Angry Analytics Twitter API Example

Conclusion

In closing, this article was intended to show you multiple ways to connect to APIs, as it is likely your organization has set standards around data connectivity that no longer allow connecting directly to the source.

Be an advocate for a Data Culture. Putting data in the hands of more people as quickly as possible will result in the ability to achieve more. Challenge your development teams to make data accessible. Data is currency!

Power BI Content Workflow

I work with many enterprise-level organizations in the Midwestern United States. There is a lot of excitement around Power BI; however, I often run into challenges regarding features expected of a business intelligence solution that can meet the needs of an enterprise, such as governance, administration, and lifecycle management.

pendulum

In business intelligence, there is a pendulum that swings between self-service and governance, and it wreaks havoc on most organizations of reputable size. I can confidently say there is little doubt that, from an end-user feature set, Power BI competes well with any solution on the market. With a FREE offering and a PRO offering at only $9.99 per user per month, the price point is hard to beat. But what about total cost of ownership (TCO)? What about enterprise deployments of Power BI? I intend to use this blog to help answer these questions regarding Power BI, SSAS Tabular models, and an overall agile approach to business intelligence within the enterprise.

Let’s start with discussing lifecycle management and taking reports through development, testing, and production similar to how you would deploy a codebase for an application.

power bi cheapskate

First of all, we need to ensure that we are working with the PRO version of Power BI. Yes, if you want enterprise-level features, you are going to have to pay for them. Two PRO features are used below: Power BI Group Workspaces and Content Packs.

Group Workspaces in Power BI allow multiple users to “co-author” reports and share ownership of the Power BI content. You can easily create a group workspace from the left-hand navigation bar.

power bi groups

Once a group has been created, you can assign users either “Edit” or “View Only” access. This is an important distinction between the users who will be “co-authoring” and those who will be testing and doing quality assurance.

power bi groups add user

 

Once a group has been established with this distinction between content developers and testers, the diagram below can be followed to achieve a content management workflow.

power bi content mgmt workflow

This diagram is also provided in PDF form below:
Power BI Content Workflow

In this workflow, OneDrive for Business (ODFB) is used as a collaboration space for PBIX files. Alternatively, this could also be a source control environment or a local file share. A benefit of O365 deployments for Power BI is that when a group is created, an ODFB directory is also established for sharing.

The group workspace is used for collaboration between content authors and testers. Once development and testing iterations are complete, a Content Pack should be produced to distribute the production version of the dashboard, reports, and datasets to the rest of the organization. Content Packs are created by going to “Get Data” -> “My Organization” -> “Create Content Pack”.

power bi content pack create

IMPORTANT: It is critical in an enterprise scenario that personal workspaces are not used to publish content packs. Doing so creates a dependency on a single author having full control over the content, and scenarios will consistently pop up in which that author is not available when updates are needed. Enterprise deployment scenarios can’t be achieved when content is tied to a personal workspace. Establish a group workspace structure that makes sense for your organization, as pointed out in the prerequisite text on the workflow diagram.

Making Revisions

As revisions occur to content in the group workspace, a notification appears in the upper right reminding you that there is a content pack associated with the dashboard or report being modified, and that an update needs to be applied to the content pack to publish the changes.

power bi content pack changes

I have talked with some organizations that originally considered bypassing content packs and just using group workspaces, with “View Only” users getting the content distribution. This would not allow for the option to iterate on changes before they are published to the end users.

Once the iterations have been completed and the next revision is ready to publish, anyone with “Edit” privileges in the group can go to Settings (the gear icon in the upper right) and “View Content Packs”.

power bi content pack update

A warning icon is displayed next to the content pack name, indicating there are pending changes. Once you click “Edit”, you will have the option to “Update” the content pack, at which point the changes will be distributed to those who have access to it.

power bi content pack update button

Connection Strings and Separation of Duties (SoD)

The above workflow still leaves a couple of desired features without answers.

With group workspaces, we have separated the ability to develop and test report changes from the ability to publish them to the rest of the organization, but how do we transition from a DEV/TEST data source to a PROD data source? This has to be done within the Power BI Desktop file (as of the time this blog post was written). If a new calculation or set of tables is being tested, the development team and the testers will have to coordinate on when to change the connection strings from DEV/TEST to PROD sources and republish the PBIX file. One pragmatic workaround is to drive the data source through a query parameter, as sketched below.
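
Here is a minimal sketch of that approach in M. The Environment parameter, server names, and database name are all hypothetical; substitute your own:

let
 // Hypothetical "Environment" parameter created via Manage Parameters ("DEV", "TEST", or "PROD")
 server = if #"Environment" = "PROD" then "prod-sql.contoso.com"
          else if #"Environment" = "TEST" then "test-sql.contoso.com"
          else "dev-sql.contoso.com",
 // Placeholder server and database names
 Source = Sql.Database(server, "SalesDW")
in
 Source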

In my experience, on the reporting and analytics side of things, production connectivity is often required by the “power users” in each department who are doing the quality assurance, as they can only tell whether new report features pass the “sniff test” when they are connected to data that they know. DEV/TEST environments often grow stale and don’t hold as much value on the reporting/analytics side as they do for application development. Connection string management will be enhanced as Power BI matures, but even then, limited value will be achieved without production data to verify changes.

Lastly, a group workspace has only “Edit” and “View” roles and lacks a “Publish” role, so it may be a concern that a content author can push changes to content packs without separation of duties (which would require someone else to actually publish the change). If this is critical to your business, separation of duties can be achieved by adding another group workspace that only publishers have access to, but I would suggest this is often overkill. Because reporting cannot “alter” any underlying data, it does not usually need controls as strict as those protecting a transactional application from malicious code being deployed.

And remember, a significant portion of any enterprise organization is still running its business from Excel spreadsheets produced and managed by the “power users” in each department. They simply extract data from source systems and create Excel reports that often go all the way to the CFO or CEO. So as we come back to the initial comment in this blog regarding the pendulum of self-service vs. governance, remember that these concerns probably already exist in your organization today, with much less visibility than they would have under the proposed workflow above.

 

Feel free to leave any comments on this article or suggestions for future articles that touch on the use of Microsoft BI solutions such as Power BI for enterprise deployment.