First look at Apache Superset from a PowerBI user's perspective

Apache Superset is a very well known open source BI tool. I have known about it for the last couple of years, but was not thrilled by the idea of running a Docker image on my laptop. Recently, though, a new company called Preset, created by the original author of the software, started offering a cloud-hosted service: basically, they maintain the service and the consumer just uses the software.

The pricing is very competitive: a free Starter plan with up to 5 users, then a Pro plan at $20/month/user (the same as PowerBI PPU), and an Enterprise plan with custom pricing. Kudos to the vendor for providing a free limited plan for users to learn and test.

The selling point of Preset is very simple: it is open source software, so there is zero risk of vendor lock-in. If you are not happy with their service, you can always move your assets to another provider, or host it yourself on-premise or in any cloud. This argument is very attractive and makes sense for a lot of customers, and companies like Red Hat have had good success with this business model.

User interface

The user interface is simple and predictable, which is a good thing; if you have used PowerBI or Data Studio you will feel at home.

Generally speaking, there are two types of BI tools:

  • Integrated: the BI tool provides ETL, storage, modeling and the viz layer. Examples: PowerBI, Tableau, Qlik, Data Studio
  • Light: the BI tool's scope is only modeling and the viz layer; it needs an external database to host the data and run the SQL queries. Examples: Looker, Bipp, etc. (I reference only the tools I have personally used)

I guess the big assumption here is that you already have a mature data infrastructure that can be queried from a single database. Superset, like Looker, cannot query two databases in the same dataset; the sweet spot for Superset is when all your data is hosted in a cloud data warehouse.

Connecting to the Database

As usual I used BigQuery as my data source. The connection was pretty trivial to set up, although I could not figure out how to activate CSV upload.

Building a Dataset

A Dataset is the data model; it is the same concept as in PowerBI, with dimensions and metrics (measures), but it does not support relationships: you have to manually build a query that joins multiple tables, or leverage a view in the database. The metrics and dimensions use SQL as the calculation language.
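To make that concrete, a metric in a Superset dataset is just a SQL aggregate expression that gets pushed down to the source database, and a calculated column is a plain SQL expression. A minimal sketch (the column names here are hypothetical, not from my model):

```sql
-- Hypothetical metric, entered in the dataset's metric "SQL expression" box:
SUM(actualhours)

-- Hypothetical calculated column used as a dimension (BigQuery syntax):
FORMAT_DATE('%Y-%m', date)
```

Superset wraps these expressions into the SELECT and GROUP BY of the query it generates, so they must be valid in the dialect of the connected database.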

One confusing aspect of the user interface: if you want to build a Dataset based on one table, you just click Data, then Datasets, then add a Dataset. Simple. But if you want to build it based on a SQL query, you need to click on SQL Lab, then SQL Editor, write the query, and then save it as a dataset.

Example: Dataset based on a SQL Query and a Parameter

I have a simple multi-fact semantic model in PowerBI that I usually use as a test case: 3 facts with different grains and 3 dimensions.

I added this SQL query to generate the Dataset; a parameter is used to pass a date as a filter. Yes, I am biased: I have used PowerBI for so long that writing a SQL query to return a dataset seems strange.

{% set param_Date = filter_values('day')[0] %}
SELECT category, commodity, actualhours, budgethours, forecast
FROM (
  SELECT category, commodity, SUM(actualhours) AS actualhours
  FROM `testing-bi-engine.starschema.actual`
  WHERE date <= date(TIMESTAMP_MILLIS({{cache_key_wrapper(param_Date)}}))
  GROUP BY category, commodity
) t2
FULL JOIN (  -- join type assumed
  SELECT category, commodity, SUM(budgethours) AS budgethours
  FROM `testing-bi-engine.starschema.budget`
  GROUP BY category, commodity
) t3 USING(category, commodity)
FULL JOIN (  -- join type assumed
  SELECT category, commodity, SUM(forecast) AS forecast
  FROM `testing-bi-engine.starschema.baseline`
  WHERE date <= date(TIMESTAMP_MILLIS({{cache_key_wrapper(param_Date)}}))
  GROUP BY category, commodity
) t4 USING(category, commodity)

The metrics

The Columns

I am using BigQuery BI Engine; one fact is 43 million rows and another is 1 million. Although Preset Cloud is hosted in AWS, even with the network transfer the experience is very smooth. As of this writing, Publish to web is not supported yet, so all I can show is this GIF.

Chart library

Apache Superset has a very decent chart library. The first one I checked is the pivot table, and it is not bad at all. My understanding is that cross-filtering will be supported only on ECharts visuals.

Take Away

Superset is a mature open source data consumption layer, with enterprise support provided by Preset Cloud. If you have a mature data infrastructure and you know SQL very well, then it is worth a shot; I suspect it will be very popular with tech-savvy companies and startups.

Edit: I added a new post about Superset SQL Metrics.

Calculate route distance, Tableau vs PowerBI

My previous blog post turned out to be very popular, but unfortunately it is still not good enough. The visual interaction was just a hack, combining two maps together, with a big compromise: the zoom does not work, and that's very annoying (Icon Map is amazing, but it still has to operate inside the PowerBI framework).

The second issue: PowerBI is very chatty, and instead of sending 1 SQL query per selection, it sends two. So selecting two points, start and finish, will generate 4 SQL queries. Usually I don't care much, but in this case it is a relatively heavy calculation, and 1 extra query can change the experience from "this is a nice demo" to "I can do that at work".

This behavior is hard to fix because it is basically how PowerBI was designed in the first place. The assumption was: you create a visual, and its values change based on external filter selections or cross-filtering from other visuals. But there is a class of user interactions where the visual itself is the filter; a typical scenario is updating a visual's title based on the rows selected.

Instead of theoretical talk, let's rebuild the same visual in Tableau using parameter actions.

Define a Parameter

Define SQL Query

WITH initial_parameter AS (
  SELECT element
  FROM UNNEST(split(<Parameters.TagsList>, ', ')) AS element ),
mynetwork AS (
  SELECT ARRAY_AGG(geometry) roads
  FROM `test-187010.gis_free.brisbane_Road_Network_cluster` ),
calculation AS (
  SELECT `libjs4us.routing.geojson_path_finder`(roads, st_geogfromtext(a.element), st_geogfromtext(b.element)) AS tt
  FROM mynetwork, initial_parameter a, initial_parameter b ),
result AS (
  SELECT 1 AS dummy, tt.weight AS len, tt.path AS GEO
  FROM calculation ),
points_extract AS (
  SELECT ST_PointN(geo, idx) mypoint, len, dummy, idx
  FROM result,
  UNNEST(GENERATE_ARRAY(1, ST_NUMPOINTS(geo))) idx )
SELECT len, st_y(mypoint) AS lat, st_x(mypoint) AS lng, CAST(idx AS STRING) idx, 1 AS type FROM points_extract
UNION ALL
SELECT NULL AS len, st_y(center_location) AS lat, st_x(center_location) AS lng, st_astext(center_location) AS idx, 2 AS type FROM `test-187010.gis_free.Brisbane_Bus_Stop`

Define Parameter Action

In Tableau you can define exactly what happens when you select a point: either it filters, or it updates a parameter. In this case we want Tableau to append the selected value to the previous selection, since calculating a distance needs a start and a finish point. PowerBI will do that by default using Icon Map; so far so good, both products have the same behavior.

Auto Update Behavior

Unlike PowerBI, Tableau auto-updates the map when the data changes, and because I am using only 1 visual, the zoom works and the experience is more pleasant. Tableau sends only 1 SQL query per selection.

See here another example: when you click on a suburb, the map auto-updates and changes colors relative to the selected suburb. Very simple, yet very powerful.

And please, I have zero interest in the useless debate about which is the better BI tool; I just want PowerBI to implement a similar behavior, or give us an alternative, maybe by making a visual accept multiple datasets.

As a user, all I can do is show my displeasure in public and hope other users will vote.

Building a Metrics Store using PowerBI

I was listening to this excellent podcast about Minerva: basically, Airbnb went and built a Metrics Store, a central tool that holds all measures and dimensions; the idea is to have one source of truth.

The podcast is very insightful and worth listening to, but I was very surprised that none of the participants mentioned how existing BI vendors have tried to solve the same problem. They talked briefly about LookML, but as far as I know it is not yet available to third-party tools.

The Metrics Store is not new; it is just another name for a semantic model. Business Objects had one 30 years ago.

So just for fun, and I am not sure if it is even practical, I wanted to test whether PowerBI can act as a pure metrics store for third-party tools.

I am using BigQuery as the data source: 1 fact table with 80 million records and 5 dimension tables, all using Direct Query mode. PowerBI is not holding any data; all it does is receive queries from the viz tool (in this case Excel and Tableau), translate them to SQL, and serve the results back.

I published the model to the PowerBI service. Here is the catch: it has to be a Premium workspace (Premium per user works too).

Now, using Tableau, I connect to SSAS; I get the server address from the workspace settings. Notice that PowerBI and SSAS use the same engine.
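For reference, the server address you copy from the workspace settings is the XMLA endpoint; it follows this shape (the workspace name below is a placeholder, not my actual workspace):

```
powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace
```

In Tableau you paste that address into the Microsoft Analysis Services connector and pick the published dataset as the database.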

And voila, Tableau indeed works just fine.

Now trying Excel.

And here are the SQL queries generated by PowerBI.

Notice that Tableau/Excel talk to the PBI semantic model using MDX, which you can see here using DAX Studio, where I can capture all the queries coming into PBI.

The performance is not great but tolerable, I guess; MDX queries render in 6 to 10 seconds. But it did work, which is rather amazing by itself 🙂

The Metrics Store is an interesting idea on paper, but I am not sure about it in practice: it is hard to justify paying for multiple tools. If you build your semantic model in one tool, why pay for another BI tool? The other challenge is the self-service scenario: the user needs to be able to add extra measures and dimensions.

And for new vendors, it would be helpful to explain what kind of problems they are trying to solve that the existing tools couldn't; just creating a new fancy name is not good enough.

First impression of Looker from a PowerBI user

Looker was always a bit of a mystery to me, as there was no trial version to test, and they have high-profile customers. Luckily, Google recently started providing a trial version without asking too many questions; I registered 2 months ago, and recently got an email with a trial offer.

This is just an impression; I used it only a couple of times over the weekend, with no training.

You get an instance URL with the name of your company; I used a dummy name.

The user interface is rather streamlined, nothing unusual, which is a good thing. As expected, it is hosted in Google Cloud, but you can't see anything from the console (Looker is available on AWS, Azure and GCP).

Looker by default has two modes, development and production, and everything you build is linked to a Git repository.

Let's say you want to build a new dashboard: the first thing to do is create a new LookML project. Setting up GitHub was straightforward; I made the repo public.

You are probably asking where the data preparation is, the likes of PowerQuery or Tableau Prep: there is none. Looker connects only to databases; it assumes you already have a data warehouse. Looker provides just the semantic model (written in LookML code), the viz, and an API which you can call from any client.

There is no Import mode; all connections are live. There is a cache, though, that saves query results, which you can adjust per project or turn off by setting the cache duration to 0 seconds.

There are aggregate tables, which you can use to speed up queries, but again, you need to store them in the source database.
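As a sketch, such an aggregate is just an ordinary summary table materialized back in the warehouse, which the model can then point at instead of the raw fact table (the project, dataset and column names here are hypothetical):

```sql
-- Hypothetical daily roll-up stored in the source database (BigQuery),
-- pre-aggregating a large fact table to speed up dashboard queries
CREATE OR REPLACE TABLE `my-project.analytics.daily_sales_agg` AS
SELECT
  order_date,
  category,
  SUM(amount) AS total_amount,  -- pre-computed measure
  COUNT(*)    AS order_count
FROM `my-project.analytics.sales`
GROUP BY order_date, category;
```

The key point is that the aggregation work and the storage both stay in the source system; the BI layer only routes queries to it.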

Looker uses an internal database (I think MySQL), but it only stores internal metadata (user names, SQL queries, etc.); all calculations depend on the source system.

To my surprise, at least for simple semantic models, LookML is not too hard. I connected to a dataset in BigQuery and Looker imported the table schema; all I had to do was define the relationships between tables and add measures using SQL.

Funnily enough, the most confusing part is that Looker does not use standard technical terms:

  • View means Table
  • Explore is a semantic Model
  • Model is a group of Explores and Views
  • Look is a chart

At any time you can open the diagram view.

One interesting aspect of Looker modeling compared to PowerBI: you can have multiple models that join the same tables in different ways! At the same time, I could not easily reproduce a typical multi-fact-table model (an example from PowerBI; I am not saying it is impossible, I just did not know how to do it).

Once the model is complete, you can start exploration using "Explore". I guess this is the core concept of Looker: every time you add a metric and a dimension, Looker automatically generates the correct SQL, and because the models are defined enterprise-wide, the metric and dimension definitions are consistent regardless of the user.

Notice that the first thing you get is the data in tabular form; then you define the visual. There is a saying that PowerBI is 80% data and 20% viz; I would say Looker is 99% data and 1% viz :). OK, I am exaggerating a bit, but don't expect Tableau-level customization.

When you are happy with your Explore, you can save it as a Look, which is basically an individual chart that you can pin to a dashboard, like this one.

Another very nice feature is roles. In PowerBI you can customize what a user can do: build new content, share, etc. But in Looker it is very granular; you can customize any option, you can even control who can and can't download data!

Sharing is very easy: you can share with external users with different email addresses, even users with Gmail and Hotmail accounts.

Tracking performance is simple: you can see the history of all SQL queries and identify any issues,

and get a nice overview of your instance's overall performance.

Regarding performance, I think Looker will perform better than PowerBI in Direct Query mode, simply because there are fewer moving parts. In PowerBI, the viz generates a DAX query against the model, which calls the engine, which calls PowerQuery, which converts it to SQL; Looker just generates SQL queries straight from the data model, that's all.

You can't really talk about Looker without talking about pricing; here is a screenshot from the Google Cloud pricing page.


Yes, there is no typo: $5,000/month gets you 12 users, and you pay $30/month for each extra user!

Apparently Looker is very popular in high-scale embedded scenarios, but that requires some JavaScript skills, which is out of my scope.

Take Away

As far as I can see, the target market of Looker is large enterprises with mature data infrastructure and deep pockets. I can see the attraction of a code-based solution with version control and centralized semantic models; no wonder BI professionals with a software engineering background love it.

Apparently, in the short term, a lot of data is moving to the cloud, and cloud data warehouses are getting really fast and cheap (see BigQuery BI Engine). The concept of a live connection to a centralized semantic model is very attractive, and according to some industry commentators it is the next big thing in business intelligence.

Personally, I am not sold. I think every business user should be able to build their own analysis without waiting on a central BI team (not all reports are equal: some are strategic and require governance, others less so). On the bright side, LookML is not as hard as I thought, at least for simple models. But with the current pricing, it is fair to say Looker is not a threat to PowerBI, at least for smaller installations.

Edit: I got feedback that a self-service user can add custom measures and dimensions that did not exist in the original model (see the documentation), but still, if you want to use new data, it has to be imported into the source database first.

Edit: Google shut down my trial instance after three weeks; unfortunately I can't write any more about Looker. If someone from Google is reading this: you're not helping yourself.
