## Analyzing GIS data using BigQuery and PowerBI

TL;DR: world data here, pbix file (Publish to web has a 1 GB limit, so only points are used)

Australia report with polygons, pbix file

Australia report using the Data Studio Google Maps visual

Edit (14 April 2020): I updated the report to load all features tagged amenity in the world. I am using this formula to dynamically calculate the distance between two points.
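The exact formula used in the report isn't reproduced here; as a sketch, a great-circle distance between two (longitude, latitude) points can be computed with the haversine formula (an approximation that assumes a spherical Earth):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance in km between two (lon, lat) points,
    assuming a spherical Earth of radius 6371 km."""
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    dlon, dlat = lon2 - lon1, lat2 - lat1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Distance from the Brisbane reference point to an arbitrary second point
print(haversine_km(153.020749, -27.467539, 151.2093, -33.8688))
```

This mirrors what `ST_Distance` does server-side in BigQuery (which uses a more precise geodesic calculation), so in practice you would let BigQuery compute it rather than the report.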

Due to the COVID-19 pandemic, Google has made some public datasets free to query; one of them is OpenStreetMap. I thought it was an excellent opportunity to play with the BigQuery GIS functions.

Using the existing documentation, I came up with this query, which returns all the geometries with an amenity tag within a radius of 100 km from an arbitrary point (for some reason I chose the Microsoft office building in Brisbane as a reference):

```sql
WITH params AS (
  SELECT ST_GeogPoint(153.020749, -27.467539) AS center,
         100000 AS maxdist_m
)
SELECT
  ar.key,
  ar.value,
  feature_type,
  osm_id,
  osm_way_id,
  geometry,
  ST_Centroid(geometry) AS center_location,
  ST_Distance(ST_Centroid(geometry), params.center) / 1000 AS distance
FROM
  `bigquery-public-data.geo_openstreetmap.planet_features`,
  params,
  UNNEST(all_tags) AS ar
WHERE 'amenity' IN (SELECT key FROM UNNEST(all_tags))
  AND ST_DWithin(ST_Centroid(geometry), params.center, params.maxdist_m)
```

The query returns:

## WARNING

The query processed 245 GB in 16 seconds! And it cost $0, at least until 14 Sept 2020; after that it will incur charges ($5 per TB scanned).
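To put that in perspective, at the on-demand price of $5 per TB scanned, a single run of this query would cost roughly:

```python
scanned_gb = 245
price_per_tb = 5.0  # USD per TB scanned, BigQuery on-demand pricing at the time

cost = scanned_gb / 1024 * price_per_tb
print(f"${cost:.2f}")  # → $1.20
```

Cheap for one run, but worth remembering if a report refreshes the query many times a day.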

You can explore the result using the built-in BigQuery Geo Viz, but you can’t share the data.

PowerBI does not support custom queries when connecting to BigQuery, so I had to save the query results in a view; after that, the connection to PowerBI is straightforward.
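Saving the query as a view is a one-liner in standard SQL; a sketch, where the dataset and view name (`work.amenities_brisbane`) are made up for illustration:

```sql
CREATE OR REPLACE VIEW `work.amenities_brisbane` AS
WITH params AS (
  SELECT ST_GeogPoint(153.020749, -27.467539) AS center, 100000 AS maxdist_m
)
SELECT
  ar.key, ar.value, feature_type, osm_id, osm_way_id, geometry,
  ST_Centroid(geometry) AS center_location,
  ST_Distance(ST_Centroid(geometry), params.center) / 1000 AS distance
FROM `bigquery-public-data.geo_openstreetmap.planet_features`, params,
  UNNEST(all_tags) AS ar
WHERE 'amenity' IN (SELECT key FROM UNNEST(all_tags))
  AND ST_DWithin(ST_Centroid(geometry), params.center, params.maxdist_m)
```

Note that a view re-runs (and re-bills) the underlying query on every refresh; materializing the results into a table instead would avoid scanning the full 245 GB each time.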

The query results are returned as key/value pairs.

Using a PowerQuery pivot, it is trivial to denormalize the table (I could not find how to do that in SQL); in any case, the result is much easier to analyze.
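For comparison, the same key/value denormalization is a one-line pivot in pandas (using a hypothetical miniature of the OSM tag table):

```python
import pandas as pd

# A miniature stand-in for the key/value result of the OSM query
df = pd.DataFrame({
    "osm_id": [1, 1, 2, 2],
    "key": ["amenity", "name", "amenity", "name"],
    "value": ["cafe", "Bean There", "bank", "Big Bank"],
})

# One row per osm_id, one column per tag key
wide = df.pivot(index="osm_id", columns="key", values="value").reset_index()
print(wide)
```

PowerQuery’s "Pivot Column" transformation (with "Don’t Aggregate") does the equivalent on the BigQuery view.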

By the way, be careful: PowerBI supports a maximum of 32,766 characters per text value. There is an easy workaround, though: split the column into 32,766-character chunks and then concatenate them back in a calculated column. Yes, it will increase the memory size, but it works.
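The workaround amounts to chunking and re-joining the long text; in Python terms (just a sketch, since the real fix is done with PowerQuery column splits and a DAX concatenation):

```python
def split_text(text, size=32766):
    """Split a long string into chunks no longer than `size` characters."""
    return [text[i:i + size] for i in range(0, len(text), size)]

long_geometry = "x" * 100000            # stand-in for a long WKT/GeoJSON string
chunks = split_text(long_geometry)

assert all(len(c) <= 32766 for c in chunks)   # each chunk fits PowerBI's limit
assert "".join(chunks) == long_geometry       # concatenating restores the original
```

Each chunk fits under the limit, and concatenating them in a calculated column reconstructs the original geometry string.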

And here are the final results using the beta version of Icon Map, for example filtering all the data less than 4 km away. If you want a print-quality map, you can always use the R visual; see an example here.

The custom visual is still in beta: polygons and multipolygons render perfectly, points work but with a visual discrepancy, and I don’t think linestrings are supported at all.

Icon Map is a very versatile visual; I hope the author will release an official update that fixes the rendering bugs and adds an option for a color per category.

BigQuery GIS is very powerful and easy to use, and the documentation is excellent; I only wish they would release a smaller public GIS dataset to play with.

## How to Export data from PowerQuery to BigQuery

Today I was playing with a report in PowerBI and got the idea of exporting data to BigQuery from PowerQuery. Let me tell you something: it is very easy and works rather well. PowerQuery is an amazing technology (and it is free).

In PowerBI you can export from R or Python visuals, but there is a limitation of 150K rows; if you use PowerQuery instead, there is no such limitation (I tried with a table of 23 million records and it worked).

Here is the code using Python, but you could use R instead:

```python
import os
import pandas as pd
from google.cloud import bigquery

# PowerQuery passes the current table to the Python step as a
# pandas DataFrame named "dataset"
dataset['SETTLEMENTDATE'] = pd.to_datetime(dataset['SETTLEMENTDATE'])
dataset['INITIALMW'] = pd.to_numeric(dataset['INITIALMW'])

# Authenticate with a service-account key file
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "C:/BigQuery/test-990c2f64d86d.json"
client = bigquery.Client()

dataset_ref = client.dataset('work')
table_ref = dataset_ref.table('test')

job_config = bigquery.LoadJobConfig()
# Overwrite the destination table on every run
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
job_config.schema = [
    bigquery.SchemaField("SETTLEMENTDATE", "TIMESTAMP"),
    bigquery.SchemaField("DUID", "STRING"),
    bigquery.SchemaField("INITIALMW", "FLOAT"),
    bigquery.SchemaField("UNIT", "STRING"),
]

job = client.load_table_from_dataframe(dataset, table_ref, job_config=job_config)
job.result()  # Waits for the table load to complete.
```

Interestingly, after the Python step we get a table back in PowerQuery; simply expand it.

Here is the total row count of the table in PowerBI:

And the results in BigQuery:

OK, a PowerQuery flow can execute multiple times (a piece of black-magic knowledge that only a handful of people know), but in this case it does not matter: the BigQuery job truncates the table on every run, so there is no risk of data duplication.

You may ask why do this when there are plenty of data preparation tools that natively support BigQuery. In my experience, most of my data sources are Excel files, and PowerQuery is just very powerful and versatile, especially when you deal with “dirty” formats.