Gabriel Okemwa · okemwag.hashnode.dev · Aug 12, 2024
Prefect for Workflow Orchestration
🚀 Revolutionizing Data Workflows with Prefect: The Future of Automation. In today's data-driven world, efficient workflow management is crucial for businesses to stay competitive. Enter Prefect, a cutting-edge workflow automation tool that's transfor...
Tag: Prefect
Manuel Schmidbauer · deltaload.hashnode.dev · May 25, 2024
Efficient Shopify Data Transfer with dlthub
Goal of the Project: The goal of this project is to create a streamlined pipeline that loads data from the Shopify REST API into a Google Cloud Storage bucket. The pipeline should be managed by Prefect, enabling automated and incremental data loads in...
Tag: DLT
Manuel Schmidbauer · deltaload.hashnode.dev · May 23, 2024
Step-by-Step DLT Prefect Cloud Push Deployment in Google Cloud
Create a new Cloud Push Worker in Google Cloud. Read the tutorial on creating a Cloud Push Worker in Google Cloud in the Prefect Docs, and ensure that you install Docker as well. Create a virtual environment for the project: python3.10 -m venv vm_d...
Tag: DLT
Deepson Shrestha · blog.deepsonshrestha.com.np · May 23, 2023
Another Valorant Stats Tracking Dashboard?
Yes and no. Despite the plethora of high-quality stats trackers for Valorant, you might be wondering why I decided to build one from scratch (excluding the API, of course). Well, the answer is pretty simple. As a data practitioner and a netizen,...
Tag: Valorant
Ajala Marvellous · madeofajala.hashnode.dev · Jun 19, 2022 · Featured
Building a memory-efficient ETL to convert JSON to CSV in Python with Prefect
Have you ever built a simple ETL (Extract-Transform-Load) pipeline in Python to convert a JSON file to CSV? Easy peasy, right? But have you ever tried to transform a large dataset (large here meaning a dataset whose size is larger ...
Tag: Python