Float Erosion Report between Two XER Files

I just added a new report to the Xer Reader that shows a float erosion report between two XER files. I am using the “Harbour Pointe Assisted Living Center” project from the default Primavera P6 installation as an example.

1- Download the pbix file here

2- Save the two XER files in the folder C:\XER


3- Refresh PowerBI Desktop


4- Review the report and export it to Excel for sharing/further analysis.


5- If you have PowerBI service in your company, publish the report there so other users can interact with it.

Let me know if you want to see other reports.

 

 

Rewriting Xer Reader using PowerBI

Four years ago, we had to customize the XER parser for a client; that work ended up as Xer Reader. It turned out the file was very popular, but unfortunately, as it was written in VBA, users got all kinds of different errors, and I was not able to provide proper support.

Now I am trying to rewrite the same Excel macro using PowerBI; the file is hosted on GitHub.

In order to open a pbix file, you need PowerBI Desktop, which is a free download.

I think it will be an interesting experience; some things are much easier in VBA and have no equivalent in PowerBI. I will try to document the progress.

In the meantime, here is my progress so far.

Stay tuned.

Using QGIS to prepare Custom Maps in PowerBI

This blog post documents my first experience with mapping in PowerBI. Usually the reports we produce are time series and pivot tables, but in this case there is a geographic dimension to the data, so I thought it was a good opportunity to try the map chart in PowerBI. It turned out to be a bit more difficult than I thought.

So the data is thousands of piles spread over a huge area. My first attempt was simply to load the data into PowerBI and view it on a map; for some reason, PowerBI showed an empty map.

PowerBI expects the data to be in latitude/longitude format, but my data uses easting and northing, so I had to convert it. There are plenty of online converters, but they are not practical, as they don’t support batch processing, and converting point by point is not an option.

After some googling, I found the excellent open source software QGIS. It was very straightforward: the software automatically converts the coordinate reference system (I think the technical term is reprojection). My data is in GDA94 / MGA zone 55 and the result should be in WGS 84.
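For what it’s worth, the same batch reprojection QGIS performs can also be scripted, for example with the pyproj library in Python. This is a minimal sketch, assuming pyproj is installed; the pile coordinates are made up for illustration (EPSG:28355 is GDA94 / MGA zone 55, EPSG:4326 is WGS 84):

```python
# Batch reprojection sketch using pyproj (pip install pyproj).
# EPSG:28355 = GDA94 / MGA zone 55, EPSG:4326 = WGS 84.
from pyproj import Transformer

# always_xy=True keeps axis order as (easting, northing) -> (lon, lat)
transformer = Transformer.from_crs("EPSG:28355", "EPSG:4326", always_xy=True)

# Hypothetical pile coordinates (easting, northing) in metres
piles = [(320000.0, 5810000.0), (321500.0, 5811200.0)]

# One call converts the whole batch -- no point-by-point conversion needed
lons, lats = transformer.transform(
    [e for e, n in piles], [n for e, n in piles]
)
for lon, lat in zip(lons, lats):
    print(f"lat={lat:.6f}, lon={lon:.6f}")
```

This is exactly the kind of batch operation the online converters couldn’t do.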

Voila, the data is ready for PowerBI.

[Image: map-1]

That’s a bit disappointing: PowerBI complains it cannot show all the points (PowerBI has a maximum of 3,500 points per chart). As a comparison, this is how Tableau shows the data:

[Image: tableau]

Tableau doesn’t have a limitation on the number of points.

 

Alternative approach

 

As PowerBI cannot show all the points, one solution is to create a shapefile that groups the points into smaller areas, and again, this was trivial to do in QGIS.

QGIS will group the points based on a field you provide.
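Conceptually, what QGIS does here is a dissolve: features sharing the same field value are merged into one. A rough stand-in for the idea in plain Python, grouping hypothetical pile points on a sub-array field and reducing each group to a bounding box (all names and coordinates invented for illustration):

```python
from collections import defaultdict

# Hypothetical piles: (sub_array_id, easting, northing)
piles = [
    ("A1", 320000.0, 5810000.0),
    ("A1", 320050.0, 5810040.0),
    ("A2", 321500.0, 5811200.0),
    ("A2", 321560.0, 5811260.0),
]

# Group the points on the chosen field, like QGIS dissolving on an attribute
groups = defaultdict(list)
for sub_array, e, n in piles:
    groups[sub_array].append((e, n))

# One bounding box per sub-array instead of thousands of individual points
bboxes = {
    k: (min(e for e, n in pts), min(n for e, n in pts),
        max(e for e, n in pts), max(n for e, n in pts))
    for k, pts in groups.items()
}
print(bboxes["A1"])
```

QGIS of course produces proper dissolved polygons rather than bounding boxes, but the grouping principle is the same.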

[Image: Group]

 

[Image: sub-array]

And voila:

[Image: qgis]

 

QGIS saves the layer as an ESRI shapefile, but PowerBI requires TopoJSON, so I used the excellent tool mapshaper.org to convert it.

Just make sure you import all the files, not only the .shp.

[Image: export]

Now the TopoJSON is ready to be loaded into PowerBI.

[Image: topojson]

 

Time to celebrate 🙂 Not really, there is a problem: I want to show a different colour based on status. If a sub-array is completed, I want it to show yellow; if it is > 75 %, I want another colour, and so on. The shape map in PowerBI does not offer this option.

Fortunately, Synoptic Panel has more options. It uses SVG as a map format, which I got from mapshaper.org.

[Image: filled map]

 

Preparing the custom map is a one-time operation; the colours will change as construction progresses, you just need to assign which measures you want to show.

[Image: measures]

What’s Next

As long as your data model is properly built, you can do a lot of interesting stuff: you can filter by type of works (piling, tracker installation, PV modules), or click on one sub-array and see in detail what has been installed and what’s missing.

 

 

Build a Dashboard using Data Studio

Google Data Studio is a new reporting service from Google. It is still in beta, has rudimentary capabilities compared to established players (PowerBI, Tableau and Qlik) and can be very slow, yet it has two killer features: it is free, and it has amazing sharing options.

Google’s business model for this product is very simple: the visualisation layer is free, and the preferred data source is BigQuery, which is a paid product, but other data sources are free to use.

In the last two months, I have been experimenting with the product. At first I never managed to make it produce any meaningful charts; that was my mistake, as I was thinking from a semantic model background, which is not valid here.

Data Studio does not support multiple datasets in the same chart; you can use multiple datasets in the same report, and you can filter multiple datasets using a common dimension. One solution is to create a master dataset that combines all the tables; it is the same concatenated table used in Qlik.

Now if you have a data warehouse, you could simply write a SQL query to combine all the dimensions you want to report on. I don’t, so I will use what I have: PowerBI Desktop! You can use RStudio too, or any data management solution you are familiar with.

So the approach is: build a data model in PowerBI Desktop, query the model using DAX and export the result to CSV, then load the CSV into Data Studio and share the report. Data Studio is only used as a reporting service.

1- I will use this data model.

[Image: Data_Studio_1]

2- And use this query to extract the dimensions and measures I want to report on:


EVALUATE
VAR Earned_Table =
    FILTER (
        ADDCOLUMNS (
            SUMMARIZECOLUMNS (
                TBPAYCODE[Commodity_ID],
                TBCommodity[Commodity],
                TBWBS[WBS_L3],
                TBDate[Project Cut Off],
                "Budget", tb_Measures[budget_hours],
                "Period", tb_Measures[Earned_Hours_Period],
                "Cumulative", tb_Measures[Earned_Hours_Cumulative]
            ),
            "TYPE", "Earned"
        ),
        MIN ( TBDate[date] ) <= [Cut off] && MIN ( TBDate[date] ) >= DATE ( 2012, 1, 1 )
    )
VAR Plan_Table =
    FILTER (
        ADDCOLUMNS (
            SUMMARIZECOLUMNS (
                TBPAYCODE[Commodity_ID],
                TBCommodity[Commodity],
                TBWBS[WBS_L3],
                TBDate[Project Cut Off],
                "Budget", tb_Measures[budget_hours],
                "Period", tb_Measures[planned_Hours],
                "Cumulative", tb_Measures[planned_Hours_Cum]
            ),
            "TYPE", "Planned"
        ),
        MIN ( TBDate[date] ) >= DATE ( 2012, 1, 1 )
    )
VAR Spent_Table =
    SELECTCOLUMNS (
        FILTER (
            SUMMARIZECOLUMNS (
                // Spent hours are not tracked per commodity:
                //TBPAYCODE[Commodity_ID],
                //TBCommodity[Commodity],
                TBWBS[WBS_L3],
                TBDate[Project Cut Off],
                "Budget", tb_Measures[budget_hours],
                "Period", tb_Measures[spent_Hours_Period],
                "Cumulative", tb_Measures[spent_Hours_Cum]
            ),
            MIN ( TBDate[date] ) <= [Cut off] && MIN ( TBDate[date] ) >= DATE ( 2012, 1, 1 )
        ),
        // Pad the missing commodity columns with BLANK() so the
        // column layout matches the other two tables positionally
        "Commodity_ID", BLANK (),
        "Commodity", BLANK (),
        "WBS_L3", [WBS_L3],
        "Project Cut Off", [Project Cut Off],
        "Budget", [Budget],
        "Period", [Period],
        "Cumulative", [Cumulative],
        "TYPE", "Spent"
    )
RETURN
    ADDCOLUMNS (
        UNION ( Earned_Table, Plan_Table, Spent_Table ),
        // Flag the rows whose period equals the current cut-off date
        "Data_Date", IF ( [Project Cut Off] = [Cut off], 1, 0 )
    )

Basically I am using UNION to concatenate the values for earned hours, planned hours and spent hours. I added a TYPE column to distinguish between the different tables, and in the case of spent, I assign BLANK() to Commodity_ID and Commodity, as the spent measures don’t have the same granularity.

I added a column “Data_Date” so I can filter to only the rows where the date equals the cut off.
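The same concatenation pattern in miniature, outside DAX (plain Python, with made-up rows; the TYPE column tags each source table and Data_Date flags the cut-off rows):

```python
from datetime import date

CUT_OFF = date(2017, 6, 30)  # hypothetical project cut-off

# Three source tables at different granularities, reduced to toy rows
earned  = [{"wbs": "L3-01", "period": date(2017, 6, 30), "hours": 120}]
planned = [{"wbs": "L3-01", "period": date(2017, 7, 31), "hours": 150}]
spent   = [{"wbs": "L3-01", "period": date(2017, 6, 30), "hours": 135}]

# Tag each source table, then concatenate into one master dataset (UNION)
master = []
for type_name, table in (("Earned", earned), ("Planned", planned), ("Spent", spent)):
    for row in table:
        master.append({**row, "TYPE": type_name,
                       "Data_Date": 1 if row["period"] == CUT_OFF else 0})

# Filtering on the flag keeps only the cut-off snapshot
at_cut_off = [r for r in master if r["Data_Date"] == 1]
print(len(master), len(at_cut_off))
```

One flat table with a type tag is all Data Studio needs, since it cannot join datasets itself.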

3- Load the CSV

[Image: Data_Studio_2]

If a field’s aggregation is None, then it is a dimension; otherwise it is a metric (measure). There are some functions for calculations like SUM and AVERAGE, but it is not DAX; it is really limited.

As an example, the measure Planned Hours = SUM(CASE WHEN REGEXP_MATCH(TYPE, '^Planned.*') THEN Period ELSE 0 END)
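The same conditional sum expressed in Python, to make the logic explicit (the rows are invented for illustration; I am assuming Data Studio’s REGEXP_MATCH behaves like an anchored regex match):

```python
import re

# Toy rows from the concatenated master dataset
rows = [
    {"TYPE": "Planned", "Period": 150},
    {"TYPE": "Earned",  "Period": 120},
    {"TYPE": "Planned", "Period": 30},
]

# SUM(CASE WHEN REGEXP_MATCH(TYPE, '^Planned.*') THEN Period ELSE 0 END)
planned_hours = sum(
    r["Period"] if re.match(r"^Planned.*", r["TYPE"]) else 0
    for r in rows
)
print(planned_hours)  # 180
```

In other words, the TYPE tag added during the UNION is what lets a single calculated field pick out one source table.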

4- Build the report

[Image: Data_Studio_3]

You can view the report here, and the datasets here. (Internet Explorer works now, although only Chrome, Firefox and Safari are officially supported.)

Personally, the most interesting aspect of Data Studio is the integration with Google Drive: the reports you build show up in your Google Drive and you can organize them in different folders, for example grouping reports by department, shop, etc.

[Image: Data_Studio_4]

I think it has potential. They got the sharing right; now they need to add some basic features (pivot tables, controlled cache, …) and more options in the calculated-field formulas (hint: returning null, for example, is not supported; OK, there is a workaround: just create Metric*Metric/Metric, and if it is 0 then it is an error and GDS returns null), and they will have data visualisation for the masses.