Rosette Kas · time-travel-in-microsoft-fabric.hashnode.dev · Dec 21, 2024
Historical Data Storage: Querying Historical Data using Time Travel
1. Time Travel Syntax: use the @ symbol followed by a timestamp to specify the desired time point in your queries.
2. Example Query: SELECT * FROM SalesData WHERE OrderDate @ '2024-01-01' to query data from Janu...
Tag: Data Retention
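The teaser above describes an @-timestamp shorthand; the T-SQL form Microsoft documents for Fabric Warehouse time travel is an OPTION clause with FOR TIMESTAMP AS OF. A minimal sketch that builds such a query string (table name and timestamp are illustrative):

```python
def time_travel_query(table: str, utc_timestamp: str) -> str:
    """Build a Fabric Warehouse time-travel query. Fabric's documented
    syntax uses an OPTION clause; the timestamp is interpreted as UTC,
    e.g. '2024-01-01T00:00:00'."""
    return f"SELECT * FROM {table} OPTION (FOR TIMESTAMP AS OF '{utc_timestamp}');"

query = time_travel_query("dbo.SalesData", "2024-01-01T00:00:00")
```

The resulting string would be run against the warehouse's SQL endpoint like any other T-SQL statement.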
Sachin Nandanwar · www.azureguru.net · Dec 18, 2024
Demystifying roles and access in Microsoft Fabric lake house
In Microsoft Fabric, a SQL analytics endpoint is automatically generated for a Lakehouse. With the SQL analytics endpoint, a user can view the lakehouse data using T-SQL. When it comes to lakehouse roles and access, the lakehouse security model is not comple...
29 reads · Tag: microsoftfabric
Sandeep Pawar · fabric.guru · Dec 7, 2024
Delta Lake Tables For Optimal Direct Lake Performance In Fabric Python Notebook
In my last blog, I showed how to use Polars in a Fabric Python notebook to read, write and transform data. What I did not cover was how to write a Delta Lake table in a Python notebook that's optimal for Direct Lake performance. All engines in Fabric ...
2 likes · 1.6K reads · Tag: microsoftfabric
Peer Grønnerup · peerinsights.hashnode.dev · Oct 28, 2024
Automating Fabric: Kickstart your Fabric Data Platform setup
Setting up and managing workspaces in Microsoft Fabric can be a time-consuming task, especially when you need multiple workspaces for various stages of the data lifecycle across different environments. This blog post demonstrates how to streamline yo...
477 reads · Tag: microsoftfabric
Sandeep Pawar · fabric.guru · Sep 19, 2024
Programmatically Removing & Updating Default Lakehouse Of A Fabric Notebook
I have written about default lakehouse of a Fabric notebook before here and here. However, unless you used the notebook API, there was no easy/quick way of removing all/selective lakehouses or updating the default lakehouse of a notebook. But thanks ...
1.5K reads · Tag: notebookutils
Sandeep Pawar · fabric.guru · Sep 4, 2024
Querying SQL Endpoint of Fabric Lakehouse/Warehouse In A Notebook With T-SQL
I am not sharing anything new. The spark data warehouse connector has been available for a couple months now. It had some bugs, but it seems to be stable now. This connector allows you to query the lakehouse or warehouse endpoint in the Fabric notebo...
2.3K reads · Tag: sql endpoint
Sandeep Pawar · fabric.guru · Aug 26, 2024
Quick Tip: Using NotebookUtils To Get Fabric Lakehouse Properties
Notebookutils in Fabric now has getWithProperties() method to get the properties of a lakehouse. Earlier the get() method only returned abfs path and not the SQL endpoint. By default it returns the JSON:
//sample output
{'id': '<>', 'type': 'Lake...
1.4K reads · Tag: microsoftfabric
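Since getWithProperties() returns a plain dict, pulling out the SQL endpoint is just key access. A sketch under assumptions: the top-level id/type keys match the post's sample output, but the nested property names (abfsPath, sqlEndpointProperties, connectionString) are illustrative placeholders for whatever the real payload contains:

```python
# Hypothetical sample mirroring the shape of
# notebookutils.lakehouse.getWithProperties() -- nested key names are assumed.
sample = {
    "id": "00000000-0000-0000-0000-000000000000",
    "type": "Lakehouse",
    "displayName": "Sales",
    "properties": {
        "abfsPath": "abfss://ws@onelake.dfs.fabric.microsoft.com/Sales.Lakehouse",
        "sqlEndpointProperties": {
            "connectionString": "xyz.datawarehouse.fabric.microsoft.com",
            "provisioningStatus": "Success",
        },
    },
}

def sql_endpoint(props: dict) -> str:
    """Extract the SQL endpoint host from a lakehouse-properties payload."""
    return props["properties"]["sqlEndpointProperties"]["connectionString"]
```

With the endpoint host in hand, you can feed it straight into a pyodbc or SQLAlchemy connection string.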
Sandeep Pawar · fabric.guru · Jul 30, 2024
Using fsspec To Define OneLake Filesystem in Fabric
I mentioned on X the other day that, like other filesystem backends such as S3 and GCS, you can use fsspec to define the OneLake filesystem too. In this blog, I will explain how to define it and why it's important to know about it. https://x.com/Pawa...
2 likes · 785 reads · Tag: daft
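OneLake presents itself as a single ADLS-Gen2-style account named onelake, with the workspace as the container, so paths follow a predictable abfss:// pattern. A sketch: the path builder below is pure string formatting, while the commented fsspec call shows how the post's idea would look with adlfs installed (not verified here, and credentials are omitted):

```python
def onelake_path(workspace: str, lakehouse: str, relative: str = "") -> str:
    """Build an abfss:// URI for a OneLake lakehouse: the workspace acts as
    the container on the single 'onelake' account."""
    base = f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/{lakehouse}.Lakehouse"
    return f"{base}/{relative}" if relative else base

# Sketch of the fsspec definition the post describes (requires adlfs + auth):
# import fsspec
# fs = fsspec.filesystem(
#     "abfss",
#     account_name="onelake",
#     account_host="onelake.dfs.fabric.microsoft.com",
# )
# fs.ls(onelake_path("MyWorkspace", "Sales", "Files"))
```

Knowing the URI shape is what lets engines like Daft or Polars, which accept fsspec-style paths, read OneLake directly.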
Sandeep Pawar · fabric.guru · Jul 24, 2024
How to Attach a Default Lakehouse to a Notebook in Fabric
I wrote a blog post a while ago on mounting a lakehouse (or generally speaking a storage location) to all nodes in a Fabric spark notebook. This allows you to use the File API file path from the mounted lakehouse. Mounting a lakehouse using mssparkut...
1 like · 2.9K reads · Tag: microsoftfabric
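One documented way to attach a default lakehouse is the %%configure magic, run at the start of the Spark session before any other code. A config sketch (the name and GUIDs are placeholders to fill in for your workspace):

```
%%configure
{
    "defaultLakehouse": {
        "name": "Sales",
        "id": "<lakehouse-guid>",
        "workspaceId": "<workspace-guid>"
    }
}
```

Once set, relative Files/ and Tables/ paths in the notebook resolve against that lakehouse.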
Sandeep Pawar · fabric.guru · Jul 19, 2024
Calculating The Size of The Dataflow Gen2 Staging Lakehouses
My friend Alex Powers (PM, Fabric CAT) wrote a blog post about cleaning the staging lakehouses generated by Dataflow Gen2. Before reading this blog, go ahead and read his blog first on the mechanics of it and the whys. Note that these are system gene...
659 reads · Tag: mssparkutils