Update 26-Oct-2024: with DuckDB 1.1.2 you no longer need to mount a lakehouse to the notebook, and there is support for reading a OneLake Lakehouse from outside Fabric. Currently it is read only; for writing you need delta-rs.
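For the outside-of-Fabric case, here is a minimal sketch; the workspace, lakehouse, schema and table names are placeholders, and the commented-out part assumes DuckDB's azure extension authenticating through your Azure credential chain:

```python
def onelake_table_url(workspace: str, lakehouse: str, schema: str, table: str) -> str:
    # Build the abfss URL that OneLake exposes for a Delta table.
    # All names passed in are placeholders, not real workspaces.
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{schema}/{table}"
    )

# Reading from outside Fabric (read only), roughly:
# import duckdb
# duckdb.sql("CREATE SECRET onelake (TYPE AZURE, PROVIDER CREDENTIAL_CHAIN, ACCOUNT_NAME 'onelake')")
# duckdb.sql(f"SELECT * FROM delta_scan('{onelake_table_url('myws', 'mylh', 'dbo', 'sales')}')").df()
```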
Here is a very simple Python script showing how you can attach a Lakehouse to DuckDB in a Fabric notebook (you can use the same logic for Polars, Daft, etc.).
It is read only and creates views based on your existing Delta tables. It assumes you are using schemas, but you can edit it for simpler use cases; if you have a lot of tables, it may be more practical to attach just one specific schema.
import duckdb
from glob import glob

def attach_lakehouse(base_path):
    # Each Delta table lives under <base_path>/<schema>/<table>/
    list_tables = glob(f"{base_path}*/*/", recursive=True)
    sql_schema = set()
    sql_statements = set()
    for table_path in list_tables:
        parts = table_path.strip("/").split("/")
        schema = parts[-2]
        table = parts[-1]
        sql_schema.add(f"CREATE SCHEMA IF NOT EXISTS {schema};")
        sql_statements.add(
            f"CREATE OR REPLACE VIEW {schema}.{table} AS SELECT * FROM delta_scan('{table_path}');"
        )
    # Create all schemas first, then the views on top of them
    duckdb.sql(" ".join(sql_schema))
    duckdb.sql(" ".join(sql_statements))
    display(duckdb.sql("SHOW ALL TABLES").df())

attach_lakehouse('/lakehouse/default/Tables/')
And here is an example:

Now you can read and join any tables, even from different schemas.

Note that Delta support in DuckDB is not very performant at this stage compared to pure Parquet, but I suspect we will see a massive improvement in the next version, 1.1.