Connecting PowerBI to Primavera Database, Part 1

I think one of the most asked questions when people talk about Primavera and PowerBI is how to connect to the database. The good news is that the connection itself is easy; the bad news is that extracting useful information is a bit of work.

Just to show how it works, I am using a temporary installation on my personal laptop, as obviously I don't have access to my production database.

I am using a Developer edition of SQL Server and an evaluation copy of EPPM; Oracle allows the evaluation of most of its software for the first 45 days, and you can download a copy from here. You need SSMS too.

For the purpose of this blog, we will query the “normal” Primavera tables, not the extended schema, which is a group of tables and views designed specifically for reporting; those extra tables are empty by default and you need to configure the publishing service (I will discuss it in a future blog). Please note I have already blogged about how to connect when using SQLite in the case of standalone P6 Professional.

Connect to SQL Server using SSMS

When you install Primavera, you get to define four user accounts:

  • sa: the database admin account (not the admin for the Primavera application).
  • privuser, pubuser: used to connect the Primavera application to the database.
  • pxrptuser: the user account for reporting.

We will use sa to connect to the database.

When you click on Connect, you get this:

The database itself has 320 tables; you can check that by running this SQL script

USE PMDB
GO
SELECT *
FROM sys.Tables
GO

Create a read-only user

Connecting using the admin account is just very bad practice, and I don't want to mess with the existing accounts, so instead we will create a read-only user account:

  1. Create a New Login
  • Create a password
  • Map the user to PMDB
  • Assign a new role

Instead of granting access to all 320 tables, we create a new role (read_only) and assign only the three most important tables in the database; you can add more tables later. We granted SELECT only, so no write access.

Connect PowerBI to SQL Server using the read-only user

And voilà, our tables are now visible in PowerBI.
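For reference, the Power Query step generated for such a connection looks roughly like the sketch below; the server name ("localhost") and the schema ("dbo") are placeholders for your own installation, and the read_only credentials are entered in PowerBI's credentials dialog rather than in the code.

let
    // connect to the SQL Server instance hosting the Primavera database
    Source = Sql.Database("localhost", "PMDB"),
    // navigate to one of the tables granted to the read_only role,
    // here TASK (the activities table)
    TASK = Source{[Schema = "dbo", Item = "TASK"]}[Data]
in
    TASK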

So the answer to how to connect to the Primavera database from PowerBI is: you need a user name, a password and the server name. The challenge is how to extract meaningful reports from those tables.

What's next

At this stage, you need to get familiar with the Primavera schema. Yes, it is 320 tables, but the basic ones are three, and usually for my reporting I use around 10. I wrote an introduction to the Primavera schema 6 years ago; I hope it is still relevant.
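As a rough sketch of how those basic tables fit together, the query below attaches the WBS name to every activity. I am assuming the three core tables are PROJECT, PROJWBS and TASK, and the column names (wbs_id, wbs_name, proj_id) are from memory of the P6 schema, so verify them against your own database.

let
    Source  = Sql.Database("localhost", "PMDB"),
    TASK    = Source{[Schema = "dbo", Item = "TASK"]}[Data],
    PROJWBS = Source{[Schema = "dbo", Item = "PROJWBS"]}[Data],
    // attach the WBS record to each activity (TASK.wbs_id -> PROJWBS.wbs_id)
    Merged   = Table.NestedJoin(TASK, {"wbs_id"}, PROJWBS, {"wbs_id"}, "WBS", JoinKind.LeftOuter),
    // keep only the WBS name; PROJECT can be joined the same way on proj_id
    Expanded = Table.ExpandTableColumn(Merged, "WBS", {"wbs_name"})
in
    Expanded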

Part 2 is published here

For security implications, please read this.

Create a PowerBI Resource Analysis Dashboard – by Darrin Kinney

Feel free to provide comments directly on my LinkedIn post that references this article, or contact me directly – Darrin Kinney

Analyzing resources in P6 is a common responsibility of all planners. However, ultimately the data typically stays in the realm of the planner and is never properly given to the specific project managers and project engineers who actually need to execute the work. Thus, there exists a huge opportunity in the digital world to extract data from P6 and present it to the masses. Planners are consummate professionals in this. We live for this. We take the resource assignment data from P6 and work our magic in Excel, creating a suite of reports and graphs (S-curves).

However, all the Excel work is customized. The ability to quickly drill into the data from the data table or the graph is just too difficult for management. Creating custom graphs takes time. Instead, we can create something quite amazing using PowerBI.

Using PowerBI we can create a simple, easy-to-use dashboard that provides nearly unlimited flexibility to display both schedule and resource information to any user on a project. The below is a walkthrough to create a simple view with a simple schedule. However, I have run this through a schedule with 5,000 activities and multiple resources, including both manhour resources and quantity resources. It is in situations where you have data overload that PowerBI shines. Get the data out from the planner and into the hands of the project management group.

Step 1 – Get resource assignment data

Using the resource assignment tab in P6, remove all the grouping and just display everything. You will want to ensure some key fields are available: Start Date, End Date, Resource Data, Resource ID and Type, Activity Name and ID, plus some WBS and grouping data (e.g. contract).

RA_2_P6_resources

Step 2 – Copy-Paste Resource Assignment Data into Excel

A simple copy-paste from P6 into Excel will suffice. It is really that easy. Sometimes the data from P6 may not contain the right descriptions, so this is an opportunity to use some VLOOKUPs (or INDEX(MATCH())). The data also needs to be presented as a table: select all and use Ctrl+T to convert it to a table.

RA_3_excel_table

Step 3 – Unpivot Data using Power Query

This step can be done directly inside PowerBI Desktop, or in the perhaps more flexible Excel environment. I prefer to do as much data handling as possible in Excel to reduce the complications once inside PowerBI; however, many options exist.

Inside Excel (ensure you have the Power Query add-in), select the columns with the dates and click “Unpivot Columns”. Then Close & Load.

After running the unpivot, you will see each date column turned into rows. This is an easier data format for use in databases, and the trick is universal: it applies not just to PowerBI, but any time you might be dealing with databases.
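For reference, the step that Power Query generates behind the Unpivot button looks roughly like the sketch below; the table and column names are placeholders for whatever fields you kept in step 1.

let
    // "ResourceAssignments" is an assumed name for the Excel table created in step 2
    Source = Excel.CurrentWorkbook(){[Name = "ResourceAssignments"]}[Content],
    // keep the descriptive columns and turn every date column into rows,
    // producing one attribute (the date) and one value (the units) per row
    Unpivoted = Table.UnpivotOtherColumns(
        Source,
        {"Activity ID", "Activity Name", "Resource ID", "WBS"},
        "Date", "Units")
in
    Unpivoted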

RA_5_powerq

Step 4 – Import Into PowerBI

PowerBI has very quick and easy import routines; this post is not meant to be a specific click-by-click guide. There are easy import routines you can find to import Excel files. Again, as indicated above, just make sure your data is converted into a table. You can directly create a data source using something similar to the below:

=Excel.Workbook(File.Contents("C:\Users\Name\Downloads\P6_resource_Assignments_HP.xlsx"), null, true)
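If you prefer to write the query by hand instead of going through the import wizard, the full query looks roughly like this; "Table1" is an assumed name for the Excel table created with Ctrl+T, so replace it with yours.

let
    // open the workbook exported from P6
    Source = Excel.Workbook(File.Contents("C:\Users\Name\Downloads\P6_resource_Assignments_HP.xlsx"), null, true),
    // navigate to the Excel table holding the unpivoted resource data
    ResourceData = Source{[Item = "Table1", Kind = "Table"]}[Data]
in
    ResourceData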

Step 5 – Create your PowerBI Elements: Gantt, Graph, and Slicers

GANTT

The PowerBI file uses a custom visual you can download for free from the marketplace

GANTT by MAQ Software

The parameters you will want to use are roughly as noted below

GRAPH

The PowerBI file uses an AREA CHART. This is an easy-to-use, built-in graph feature of PowerBI. As a potential extension, you can use line graphs to show a % Complete curve. However, for base usage, the area chart provides the easiest visual.

A nice trick in the overall dashboard is to align the date range of the GANTT with the GRAPH. I have found that some limitations to this exist, but in this example, it worked quite nicely.

 

SLICER

The real power of this visual is through the use of slicers. Slicers allow for immediate filtering based on a variety of selections. This dashboard uses a range of filters for Contract, Facility, SubFacility and ResourceID. Depending on the structure and data you extracted from the P6 resource assignments, your options here are unlimited.

 

Conclusion

This example just scratches the surface of what is possible. I found the above to be immediately useful to our project team in clearly visualizing the resources required over time. There is some manual effort to keep the source Excel updated after changes to P6 – this is not a live linked data source. There are numerous possible development extensions people can use to customize this.

The biggest critique of the dashboard is the lack of visibility into the baseline dates and plan curves. For this, we can only hope someone builds a more robust custom visual to accommodate it. In the end, happy planning.

RA_1_Final

Using QGIS to prepare Custom Maps in PowerBI

This blog post documents my first experience with mapping in PowerBI. Usually the reports we produce are time series and pivot tables, but in this case there is a geographic dimension to the data, so I thought it was a good opportunity to try the map chart in PowerBI. It turns out it is a bit more difficult than I thought.

So the data is thousands of piles, a lot of piles in a huge area. My first attempt was just to load the data into PowerBI and view it on a map; for some reason, PowerBI showed an empty map.

PowerBI expects the data to be in latitude and longitude format, but my data is using easting and northing, so I had to convert it. There are plenty of online converters, but they are not practical, as they don't support batch processing, and converting point by point is not an option.

After some googling, I found the excellent open source software QGIS, and it was very straightforward: the software automatically converts the coordinate reference system (I think the technical term is reprojection). My data is GDA94 / MGA zone 55 and the result should be in WGS 84.

Voilà, the data is ready for PowerBI:

map-1

That's a bit disappointing: PowerBI complains it cannot show all the points (PowerBI has a maximum of 3,500 points per chart). As a comparison, this is how Tableau shows the data:

tableau

Tableau doesn’t have a limitation on the number of points.

 

Alternative approach

 

As PowerBI cannot show all the points, one solution is to create a shapefile that groups the points into smaller areas; again, this was trivial to do in QGIS.

QGIS will group the points based on a field you provide.

Group

 

sub-array

And voilà

qgis

 

QGIS saves the layer as an ESRI shapefile, but PowerBI requires TopoJSON; I used the excellent tool mapshaper.org.

Just make sure you import all the files, not only the .shp.

export

Now the TopoJSON is ready to be loaded in PowerBI

topojson

 

Time to celebrate 🙂 Not really, there is a problem: I want to show a different colour based on the status. If a sub-array is completed, I want it to show yellow; if it is > 75%, I want another colour, and so on. The Shape Map in PowerBI does not offer this option.

Fortunately, Synoptic Panel has more options; it uses SVG as the map format, which I got from mapshaper.org.

filled map

 

Preparing the custom map is a one-time operation; the colours will change as construction progresses, and you just need to assign which measures you want to show.

measures

What’s Next

As long as your data model is properly built, you can do a lot of interesting stuff: you can filter by type of work (piling, tracker installation, PV modules), and you can click on one sub-array and see in detail what has been installed and what is missing.

 

 

Integrated Project Control system using PowerBI

One of the most popular discussions in planning forums is how to have an integrated project control system. Every practitioner has a different opinion on how it must be done, and of course you get a lot of marketing from people trying to sell their systems.

In this blog, we share a high-level description of a data warehouse built using PowerBI Desktop.

Data warehouses are not widespread in the construction industry because the reporting specifications are different from project to project and every client has different systems and tools; trying to have a standard system works only if you are the main contractor, but if you are a subcontractor you have to adopt the client's system.

Another reason is that it requires specialised IT skills; we are just business users, not programmers. We understand data very well, but don't necessarily have the skills and tools to manage it. The good news is, with the rise of self-service business intelligence, we have exactly that: powerful yet accessible data management tools (assuming you want to learn something other than Excel).

So, at a high level, this is how it works.

 Project Controls Data Warehouse

  • The data warehouse was built using PowerBI Desktop; I know it should be called a semantic model (for me, data model and data warehouse are fundamentally the same thing). Initially it used Excel Power Pivot, but that did not scale well with the increasing volume of data.
  • As the data is not always in the format we want, PowerQuery is very handy here, as it can transform virtually any source of data, for example looking up the subsystem using the tag field (see the sketch after this list); trying to do that with 8 million rows using Excel or Access is not feasible.
  • We maintain master tables to integrate all the different sources of data (tags, WBS, subsystems, etc.).
  • Every week, we get a new export from the source systems (Cobra, proprietary database systems, etc.); we load the new data and keep the historical records. It takes 15 minutes to refresh, which is quite impressive: Cobra alone is a folder of 60 Excel files, nearly 2 GB in size.
  • Usually you publish your reports to the PowerBI.com service for end users once you refresh your data model; in our case we can't use the cloud for privacy reasons, so instead we use Excel as a reporting tool that pulls the data from PowerBI Desktop. The advantage of this approach is that we have different reports for different users: skyline, Gantt chart, client reports (in their required format), management reports, etc.
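To illustrate the kind of lookup mentioned in the list above, here is a minimal Power Query sketch that attaches the subsystem to each record using the tag field; the query names (Progress, MasterTags) and column names (Tag No, Subsystem) are hypothetical stand-ins for whatever your own sources use.

let
    // "Progress" is the large fact query loaded from the source system,
    // "MasterTags" is the master table that maps every tag to a subsystem
    Source = Progress,
    Merged = Table.NestedJoin(Source, {"Tag No"}, MasterTags, {"Tag No"}, "Master", JoinKind.LeftOuter),
    // bring in only the column we need from the master table
    WithSubsystem = Table.ExpandTableColumn(Merged, "Master", {"Subsystem"})
in
    WithSubsystem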

Some thoughts.

  • As you can see, Primavera P6 is used only as a forecasting tool and to calculate the critical path; the earned value calculation is done in the data model. Personally, I think P6 should not be used as the centre of your project control system. I remember the first time I started learning Primavera P3 (a long time ago :), we kept asking the trainer how it is possible to track the spent hours at the activity level; the answer is: we don't. Actually, deciding at which level you track your spent hours is the most important decision to make when you start a new project.
  • The basic idea here is that in order to have an integrated project controls system, you should stop trying to have one. Data will always be in silos; don't try to change how other departments manage their specific data, it will not work and they will not listen to you anyway. So instead of trying to have one system to rule them all, just use the existing systems and build a data warehouse for reporting and integration with P6.

How to use Excel as a Front End to Power BI Desktop

You can download the data model, the Excel front ends and the data sets from this folder.

The techniques used in the spreadsheet for connecting to PowerBI Desktop are not officially supported by Microsoft (as of 24 May 2018, Microsoft doesn't mind people using this approach), but I think they are useful for understanding how the MS BI stack works, especially if you are coming from an Excel background and installing SQL Server Developer is either intimidating or not allowed.

I would really appreciate it if you voted for this idea to make this scenario officially supported by Microsoft.

As far as I am aware, there are two approaches to access PowerBI Desktop, or more correctly the SSAS instance launched by Power BI Desktop:

  • Connecting using a live connection, detailed here
  • Connecting using PowerQuery, detailed here

 

The live connection is very interesting, but it has the drawback that if you close PowerBI Desktop the pivot table stops working, so you cannot share the spreadsheet. One solution is to use cube formulas, as they are persistent; if you don't know what a cube formula is, then you are missing one of the most powerful features of Excel.

But what if you want to have a pivot chart, or a pivot table that you can keep using even if you lose the connection, or if you want to share the results with people who have no access to the data model? It turns out it is possible: welcome to the Excel pivot cache.

 

  • Invoke the function SSAS_QUERY (a sketch of what such a function can look like is shown after this list).
  • 1
  • The parameter is optional:
    • Either you write a DAX query to retrieve the fields you want; MDX is supported too. Personally, I find MDX more suitable for importing measures with different dimensions; I understand DAX supports this scenario too using CROSSJOIN, but I never managed to make it work correctly.
    • Or just click OK; then you can browse the SSAS cube and select any dimensions and measures you want, but mind that for a big cube, a query fetches the results faster.
  • Keep the PowerQuery query as a connection only.
  • Insert a pivot table, use an external data source, choose the connection, select the PowerQuery query, and voilà.
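For reference, here is a minimal sketch of what an SSAS_QUERY-style function can look like. It assumes the local SSAS instance started by Power BI Desktop is listening on port 51542; both the port and the database name (a GUID) change on every launch, so check the current ones with a tool such as DAX Studio.

(optional Query as text) =>
let
    // list the databases hosted by the local Power BI Desktop instance;
    // there is only one, with a randomly generated GUID as its name
    Databases = AnalysisServices.Databases("localhost:51542"),
    DbName    = Databases{0}[Name],
    // if a DAX or MDX query is supplied run it, otherwise return the
    // navigation table so the cube can be browsed interactively
    Result = if Query <> null
             then AnalysisServices.Database("localhost:51542", DbName, [Query = Query])
             else Databases{0}[Data]
in
    Result

You would then invoke it, for example, as SSAS_QUERY("EVALUATE 'Task'") where 'Task' is a hypothetical table name in your model, or with no parameter to browse the cube.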

2

 

 

Excel caches the data in the pivot; not only that, it is extremely compressed.

Notice that Power Pivot is not used at all here; Excel is acting as a visualisation layer for PowerBI Desktop, leveraging two well-known capabilities: cube formulas and the pivot cache.

Microsoft plans to release SQL Server v.next this year, and then we will be able to deploy a data model built with Power BI / Power Pivot / PowerQuery into a production system; that's what I call a natural growth path from self-service to corporate BI.

Edit Dec-17: I added a new file that uses only VBA, so PowerQuery is not required; it can be useful if you are still on Excel 2007 or you can't install PowerQuery.