Share Data from a PowerBI Dataset using Dataflow

Using PowerBI Dataflow to share data between departments is becoming increasingly popular. With the rise of managed self-service offerings, it is not unusual to find, instead of a centralised data warehouse, every department managing its own data. I don't know enough to say anything useful about which model works better in the long term.

This blog is about one use case I encountered recently that I thought may be worth sharing. To be very clear, it is always good practice to share the data from the upstream data source, but in this case there was non-trivial business logic applied inside the PowerBI dataset.

Usually we have two options.

  • A user can just create a new thin report, assuming they have Build permission
  • With DirectQuery on a composite model, a user can mix a live connection to a dataset with data imported from other sources

Both approaches are fine, but in this case the data consumer from another department is interested in just one table (a couple of dimensions and some measures). Because there is non-trivial logic in the model, we can't get the data from the source system, and it is rather important not to provide Build permission.

In my humble opinion, giving access to a whole data model with a lot of tables, relationships etc. sometimes seems like overkill and potentially confusing. Instead, I am suggesting an easier approach: sharing a dataflow that runs a DAX query against the model and returns exactly the agreed schema, basically a flat table that will not change even if the structure of the model changes.

It is very straightforward and works with both Pro and Premium (Premium works better with large models, as the timeout duration is more generous, and anyway PPU is very cheap).

Build a visual that shows the required data

I will use a random dataset as an example; let's say the data consumer wants this data updated daily (the real-life example is substantially more complex).

Click on Performance Analyzer, refresh the visual, then copy the query.

Clean the Query in DAX Studio

Paste the query into DAX Studio. Don't get intimidated by the apparent complexity; the main query starts at VAR __DS0Core.
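For reference, here is a rough sketch of what the captured query typically looks like (the column and measure names are the ones from my example; the exact wrapper, the TOPN row limit and the ORDER BY vary by visual and PowerBI version). To clean it, you only need to keep the inner SUMMARIZECOLUMNS and evaluate it directly.

// DAX Query captured by Performance Analyzer (sketch)
DEFINE
  VAR __DS0Core =
    SUMMARIZECOLUMNS(
      'MstDate'[SETTLEMENTDATE],
      'REGION'[REGION],
      TREATAS({"QLD"}, 'REGION'[REGION]),
      TREATAS({"5 Minutes"}, 'unit'[description]),
      "average_Price", '00-Measures'[average_Price],
      "INITIALMW_Sum", '00-Measures'[INITIALMW_Sum]
    )

  VAR __DS0PrimaryWindowed =
    TOPN(501, __DS0Core, 'MstDate'[SETTLEMENTDATE], 1)

EVALUATE
  __DS0PrimaryWindowed

ORDER BY 'MstDate'[SETTLEMENTDATE]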

Now, in a simplified format:

// DAX Query

  EVALUATE 

    SUMMARIZECOLUMNS(
      'MstDate'[SETTLEMENTDATE],
      'REGION'[REGION],
      TREATAS({"QLD"}, 'REGION'[REGION]),
      TREATAS({"5 Minutes"}, 'unit'[description]),
      "average_Price", '00-Measures'[average_Price],
      "INITIALMW_Sum", '00-Measures'[INITIALMW_Sum]
    )

Alternatively, you can build the query straight in DAX Studio using the Query Builder.

Connect to the PowerBI service endpoint using PowerBI Desktop

You can read the documentation here; basically, you connect to SQL Server Analysis Services, which is the engine behind PowerBI.

Note: I am using my personal tenant.

Copy PowerQuery code to Dataflow

Dataflow doesn't have a connector to Analysis Services, at least in the UI, but copying the query just works (notice the doubled quotes around QLD).

let
    Query1 = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/aemo_data",
        "aemo_data",
        [
            Query = "
            // DAX Query
            EVALUATE
            SUMMARIZECOLUMNS(
                'MstDate'[SETTLEMENTDATE],
                'REGION'[REGION],
                TREATAS({""QLD""}, 'REGION'[REGION]),
                TREATAS({""5 Minutes""}, 'unit'[description]),
                ""average_Price"", '00-Measures'[average_Price],
                ""INITIALMW_Sum"", '00-Measures'[INITIALMW_Sum]
            )
            ",
            Implementation = "2.0"
        ]
    )
in
    Query1

I am not suggesting this is best practice. In an ideal world, everything is in a nice cloud DWH, business logic and all, and the BI tool is just a front end that runs aggregations and shows nice charts. But real life is not perfect, and it is useful to have this option when you need it.

Edit: the original title of this blog was "Using Dataflow as a Poor Man's Data Mesh", but I had to be realistic: as of this writing, a dataflow can only be consumed by a PowerBI client; not even Excel is supported. I know about the Azure Storage option, but with the current paranoia about security it is extremely hard to get that turned on in a lot of companies. Hopefully the Dataflow product team opens up a little bit to other clients too.

Filtering by Dynamic Distance in PowerBI

TL;DR: this blog shows some tricks using DAX and Icon Map, patterns like dynamic filtering by distance and showing a buffer when selecting a point. The publish-to-web report and pbix are here.

Before you get too excited: currently, to make it work you have to block the zoom functionality.

The formula for calculating distance in PowerBI is very well known; I think it was first mentioned here. I had already used it to build this report a couple of years ago.

What changed is Icon Map: with the 3.0 release nearly everything can be customized, which makes some new scenarios possible.

You need only one table with coordinates.
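For context, the measures below assume a table named reference with city, latitude and longitude columns. A minimal sketch as a DAX calculated table (the sample rows are purely illustrative; in the real report the table comes from a source file):

reference =
DATATABLE (
    "city", STRING,
    "latitude", DOUBLE,
    "longitude", DOUBLE,
    { { "Brisbane", -27.47, 153.03 }, { "Sydney", -33.87, 151.21 } }
)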

Draw a Line Between Two Points and Show the Distance

1- First build an Icon Map visual using just points.

2- Create the second layer by adding a new map visual.

This map will show the line based on the points selected.

First we need a dummy table with one row, as sketched below.
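A minimal sketch of such a table as a calculated table (the table and column names are arbitrary; any one-row table will do):

line_dummy = ROW ( "id", 1 )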

Build a measure that captures the selection:

linestring =
-- build a WKT LINESTRING ("lng lat,lng lat") from the selected points
VAR Geo = "LINESTRING(" & CONCATENATEX ( reference, [longitude] & " " & [latitude], "," ) & ")"
RETURN
    IF ( ISFILTERED ( reference[city] ), Geo, BLANK () )

Add the linestring measure here.

And voila, now you have two maps: the points selected in the first map are shown as a line in the second map. CONCATENATEX does not respect the order of selection, so currently this is useful only for two points (the M parameter respects the selection order, but that will be another blog).

3- Merge the two maps together

I guess this is the trick of the blog, if you want this behavior:

Stack the two layers on top of each other: make the point layer background transparent, remove the background, lock the map, and keep both layers with the same dimensions, layer layout and zoom value.

You have to remove the zoom, otherwise the two layers will not stay synchronized. Technically speaking we need only one map, but unfortunately a DAX table cannot filter itself; feel free to vote on this idea and complain on LinkedIn and Twitter 🙂

Buffer Selection

When you select a point, the map will draw a circle and filter only the points inside the circle; it is a very powerful GIS pattern.

Same approach, but now we create three layers.

For the selection layer we use this option (the points not selected will be hidden; I love Icon Map).

The Circle Layer

We use this option.

To get the radius of the circle we use this measure:

diam_M =
-- selected distance (km) converted to metres for the map circle, only when a city is selected
IF ( ISFILTERED ( reference[city] ), SELECTEDVALUE ( Dis_Parameter[Dis] ) * 1000 )

The parameter is used to control the size, and the measure works only when a point is selected.
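For reference, Dis_Parameter is just a disconnected what-if style table with a single Dis column; a minimal sketch of how it could be created (the 1 to 100 km range is an assumption, adjust it to your data):

Dis_Parameter =
SELECTCOLUMNS ( GENERATESERIES ( 1, 100, 1 ), "Dis", [Value] )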

The calculation layer is where we show the filtered points. We will use a bit of DAX: create a new disconnected table, and make sure there is no relationship between the two tables.
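A minimal sketch of how such a disconnected table could be built, assuming it only needs the city names from the reference table (do not create a relationship to it):

Target =
SELECTCOLUMNS ( reference, "Target City", reference[city] )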

To reduce data duplication we need only the ID (the city) from the first table. I am using measures to get the coordinates (LOOKUPVALUE is probably not the most performant option):

Target_Lat = LOOKUPVALUE(reference[latitude],reference[city],min(Target[Target City]))
Target_Lng = LOOKUPVALUE(reference[longitude],reference[city],min(Target[Target City]))

The measure to calculate the distance is:

Target Distance =
-- haversine distance in km, see https://radacad.com/dynamic-distances-in-power-bi
VAR Lat1 = MIN ( reference[latitude] )
VAR Lng1 = MIN ( reference[longitude] )
VAR Lat2 = [Target_Lat]
VAR Lng2 = [Target_Lng]
---- Algorithm here -----
VAR P = DIVIDE ( PI (), 180 )
VAR A =
    0.5 - COS ( ( Lat2 - Lat1 ) * P ) / 2
        + COS ( Lat1 * P ) * COS ( Lat2 * P ) * ( 1 - COS ( ( Lng2 - Lng1 ) * P ) ) / 2
VAR Final = 12742 * ASIN ( SQRT ( A ) )
RETURN
    IF ( ISFILTERED ( reference[city] ), Final )

And then another measure to filter the result:

Filter_by_Distance =
-- 1 when the point is within the selected distance (km) of the selected city, blank otherwise
IF (
    ISFILTERED ( reference[city] ),
    IF ( [Target Distance] <= SELECTEDVALUE ( Dis_Parameter[Dis] ), 1, BLANK () ),
    BLANK ()
)

Then add the measure as a filter to the visual.

Icon Map is a fantastic tool, and there are plenty of possibilities; hopefully the author keeps adding more options and customisation.

Building a Metrics Store using PowerBI

I was listening to this excellent podcast about Minerva: basically, Airbnb went and built a metrics store, a central tool that holds all measures and dimensions; the idea is to have one source of truth.

The podcast is very insightful and worth listening to, but I was very surprised that none of the participants mentioned how existing BI vendors have tried to solve the same problem. They talked briefly about LookML, but as far as I know it is not yet available to third-party tools.

The metrics store is not new; it is just another name for a semantic model, and Business Objects had one 30 years ago.

So just for fun, and I am not sure if it is even practical, I wanted to test whether PowerBI can act as a pure metrics store for third-party tools.

I am using BigQuery as a data source: one fact table of 80 million records and five dimension tables, all in DirectQuery mode. PowerBI is not holding any data; all it does is receive queries from the viz tool, in this case Excel and Tableau, translate them to SQL and serve the results back.

I publish the model to the PowerBI service. Here is the catch: it has to be a Premium workspace (Premium Per User works too).

Now, using Tableau, I connect to SSAS; I get the server address (the XMLA endpoint, of the form powerbi://api.powerbi.com/v1.0/myorg/<workspace name>) from the workspace settings. Notice that PowerBI and SSAS use the same engine.

And voila, Tableau indeed works just fine.

Now trying Excel.

And here are the SQL queries generated by PowerBI.

Notice that Tableau/Excel talk to the PBI semantic model using MDX, which you can see here in DAX Studio, where I can capture all the queries coming into PBI.

The performance is not great but tolerable, I guess; MDX queries render in 6 to 10 seconds. But it did work, which is rather amazing by itself 🙂

The metrics store is an interesting idea on paper, but I am not sure about it in practice. It is hard to justify paying for multiple tools: if you build your semantic model in one tool, why pay for another BI tool? The other challenge is the self-service scenario, where the user needs to be able to add extra measures and dimensions.

And for new vendors, it would be helpful to explain what kind of problems they are trying to solve that the existing tools couldn't; just creating a fancy new name is not good enough.

First Look at BigQuery BI Engine with PowerBI

Google made BigQuery BI Engine available in a public preview (you need to enroll first here). For the last two years it was available only for Google Data Studio, and I had used it extensively for this project, so I was really curious how it would work with PowerBI.

I don't think I know enough to even try to produce a benchmark. Instead, I am interested in only one question: how much value can I get using the lowest tier of BI Engine, and can PowerBI work smoothly enough to make DirectQuery a realistic option?

The BigQuery team was nice enough to offer a 100 GB reservation free of charge for the preview period. Just to keep it realistic, I kept the reservation at 1 GB, at a cost of $30 per month. I built a couple of reports in PowerBI and tried to observe how BI Engine behaves and what the query statistics show. The report is located here; the data is in DirectQuery mode and the query statistics update every hour.

How BI Engine Works

It is extremely easy to set up: just select how much memory you want to reserve per project, and that's all; you pay per GB reserved per hour.

Keep in mind that the project used for the reservation can query any other project as long as it is in the same region; in PowerBI you can define which project to use for the query.

After you wait a couple of minutes for BI Engine to start, this is more or less how it works:

1- A query is received by BigQuery; based on the columns used in the query, BI Engine loads only those columns and partitions into memory. The first query will always be slower, as it has to scan the data from BigQuery storage and compress it in memory, in my case usually 1-2 seconds.

2- A second query arrives; the data is already in memory, so it is very fast, 100 ms to 500 ms.

3- The same query as in 2 arrives; BigQuery just hits the cache. That's the sweet spot, less than 100 ms.

4- A new query arrives that targets a different table. That's the interesting part: based on the size of the scanned columns and the available reservation, BI Engine either evicts the old table from memory or decides there is not enough reservation and falls back to the default BigQuery engine, where you pay for how much data is scanned.

5- A query arrives that contains a feature not supported by BI Engine; it falls back to the default engine.

6- Data is appended to the base table or the table changes; BI Engine invalidates the cache and loads the delta into memory, or loads everything again if the table was truncated.

Obviously it is much more complex behind the scenes, but I find it fascinating that BI Engine decides in a fraction of a second what's the best way to serve the query (cache, memory or storage).

Personally I am very interested in case 4; obviously, if I reserve memory I want to minimize scanned storage as much as possible. Here are the results for the last 10 days, and I think that's a great result: my big table is 6 GB, 50 million rows, and the other tables are smaller. The dev team said they are working on further improving how the BI Engine algorithm deals with smaller tables; so far I am happy with it. (It is fixed now; the memory consumption is extremely low.)

I appreciate that users on flat-rate pricing would not care that much about data scanned, but for users on usage-based pricing it is a very important factor.

Query Performance

Again, the results are based on my usage; the only way to know is to try it yourself. Nearly 72% of queries render in less than 100 ms. I think it is fair to say we are looking at a different kind of data warehouse architecture.

PowerBI Performance

Query performance is only a portion of the whole story; you need to add network latency and PowerBI overhead (DAX calculation, viz rendering, etc.). My data is in the Tokyo region and the PowerBI service is located in Melbourne, a bit of a distance I would say, and using publish-to-web adds extra latency.

The good thing is that using DirectQuery on a 51-million-row fact table with five dimensions is an achievement in itself, and I feel I could use that in production. At the same time, on another report it seems I am hitting a bug in the ODBC driver, and the performance is not good.

To be totally honest, the PowerBI driver for BigQuery seems far from optimized; it appears to use the Simba ODBC driver, while other BI tools use the native API and are substantially faster. But I have reason to believe the PowerBI team will invest in better integration (PowerBI parameters in SQL queries are coming, for example).

I think this is an extremely interesting new development, especially if you have big fact tables or data that changes very frequently. DirectQuery mode has a lot of advantages: it is very simple to set up, the data is always fresh and there is no data movement. And BI Engine is fast, extremely fast, and cheap!!! I am using a state-of-the-art data warehouse for $30 per month!!!! Now it is up to the PowerBI team to take advantage of that.