I believe I have found the solution. It is possible to parameterize %%configure like this:
%%configure
{
    "defaultLakehouse": {
        "name": { "parameterName": "defaultLakehouseName", "defaultValue": "FourthLakehouse" },
        "id": { "parameterName": "defaultLakehouseId", "defaultValue": "773faa37-826f-4f9b-830f-e2a7a23e3903" }
    }
}
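As a hedged sketch of how this might be driven: the parameterized cell above can be fed values from a caller notebook, for example via notebookutils.notebook.run. The notebook name "MaintenanceNotebook" and the timeout are placeholders; the parameter keys match the parameterName values in the %%configure cell.

```python
# Hedged sketch: parameters for the %%configure cell above.
# The keys must match the "parameterName" values declared there.
params = {
    "defaultLakehouseName": "FourthLakehouse",
    "defaultLakehouseId": "773faa37-826f-4f9b-830f-e2a7a23e3903",
}

# In a Fabric notebook runtime (not runnable outside it), something like:
# notebookutils.notebook.run("MaintenanceNotebook", 600, params)
```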
I am using it to programmatically drop unnecessary tables in various lakehouses, and I also plan to use it to apply OPTIMIZE, VACUUM, and V-Order operations to tables. Hope it helps!
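A minimal sketch of that maintenance idea, assuming standard Delta Lake SQL syntax (Fabric's OPTIMIZE supports a VORDER clause): build the statements per table, then execute them with spark.sql(...) inside the notebook. The table name here is illustrative.

```python
# Hedged sketch: generate Delta maintenance statements for one table.
def maintenance_statements(table: str) -> list[str]:
    return [
        f"OPTIMIZE {table} VORDER",          # compact small files and apply V-Order
        f"VACUUM {table} RETAIN 168 HOURS",  # remove unreferenced files older than 7 days
    ]

stmts = maintenance_statements("my_table")
# In the notebook: for s in stmts: spark.sql(s)
```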
Great blog!
I'm wondering, is it possible to switch the default Lakehouse multiple times within a Notebook run?
If I understand correctly, changing the default Lakehouse via the %%configure approach only works properly if it is done in the first cell of the Notebook. I guess that means we can only set the default Lakehouse once per Notebook run.
Some users want to be able to switch default Lakehouse programmatically multiple times within a Notebook run, in order to use Spark SQL on multiple Lakehouses within the same Notebook run.
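One approach sometimes used instead of switching the default (this is a hedged sketch, assuming the lakehouses are attached to the notebook and accessible from the same workspace) is to qualify table names with the lakehouse name directly in Spark SQL, so no default switch is needed. The lakehouse and table names below are placeholders.

```python
# Hedged sketch: reference tables across lakehouses by qualifying the
# table name with the lakehouse name, rather than changing the default.
def qualified(lakehouse: str, table: str) -> str:
    return f"{lakehouse}.{table}"

query = f"SELECT * FROM {qualified('FirstLakehouse', 'sales')}"
# In the notebook: df = spark.sql(query)
```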
Michal
Hi Sandeep, thanks a lot. I have read both blogs. I am interested in your approach when promoting a notebook to another environment (feature branch -> dev -> test -> prod). I don't like assigning lakehouses through the UI, because they then get hardcoded in the notebook metadata. :D I use %%configure to set the default lakehouse, but how do I attach another one? I am not sure about mounting it, because you cannot change the mount when moving to another stage.
Thanks