Ingest data into Delta Lake using Auto Loader in Databricks
Anyone who works with data has, at some point, faced this requirement: identifying new files that land in a data source directory and processing only those files.
One way of doing this is with Spark Structured Streaming ...
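To make the requirement concrete, here is a minimal plain-Python sketch of the manual approach that Auto Loader replaces: keep a record of files already seen, and on each pass process only the ones that are new. The function name `find_new_files` and the in-memory `processed` set are hypothetical illustrations, not part of any Spark or Databricks API.

```python
import os

def find_new_files(directory: str, processed: set) -> list:
    """Return files in `directory` not yet in `processed`, and record them.

    This hand-rolled bookkeeping is exactly what Auto Loader automates
    (and persists durably in a checkpoint, rather than in memory).
    """
    current = set(os.listdir(directory))
    new_files = sorted(current - processed)
    processed.update(new_files)  # remember them so the next pass skips them
    return new_files
```

In Databricks, Auto Loader does this incrementally and at scale: you read the source directory as a stream with `format("cloudFiles")`, and the already-discovered files are tracked in the stream's checkpoint, so you never maintain this state yourself.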