Gabriela Caldas for Byte-sized Journeys (byte-sizedjourneys.hashnode.dev) · Nov 21, 2024
Designing Data Pipelines for Success: Best Practices for Scalability and Data Quality
In today’s world, businesses need accurate and readily available insights to stay competitive. Data engineering plays a crucial role in creating the infrastructure that makes this possible. From building efficient data pipelines to ensuring data qual...
Tags: data-engineering

Paras (ai-powered-personal-assistants.hashnode.dev) · Nov 18, 2024
The Future of ETL Workflows: Automated and AI-Powered Approaches in 2025
Extract, Transform, and Load (ETL) workflows have long been the backbone of data integration, enabling organizations to move, clean, and prepare data for analysis. However, with the exponential growth of data and the rise of artificial intelligence (...
Tags: Artificial Intelligence

Nhlahla Sibiya (dataandinsight.hashnode.dev) · Nov 17, 2024
Building a Well-Organized Grocery Store: A Metaphor for Your Data Warehouse
In today’s data-driven world, businesses rely on well-structured data warehouses to make informed decisions. But building a data warehouse can seem daunting, so let’s simplify things with a familiar metaphor: a well-organized grocery store. By compar...
Tags: Data Science

Arpit Tyagi (dataminds.hashnode.dev) · Nov 12, 2024
Copy Data from the Data Lake to the SQL Database, Deleting Any Existing Data Before Each Load
Step 1: Check the existing data in the employee table. Step 2: Manually change the data in the CSV file in the data lake; we will verify at the end whether this change shows up in the table. Step 3: Settings in the “Sink” opti...
Tags: Azure Data Factory

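The delete-before-load pattern this post walks through can be sketched outside ADF as well. A minimal Python sketch, using sqlite3 as a stand-in for the SQL Database and an in-memory CSV as a stand-in for the Data Lake file (the employee table comes from the post; the column names and sample rows are assumptions):

```python
import csv
import io
import sqlite3

# Stand-ins (assumptions): sqlite3 plays the SQL Database,
# this CSV string plays the file sitting in the Data Lake.
csv_data = "id,name\n1,Alice\n2,Bob\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT)")
conn.execute("INSERT INTO employee VALUES (99, 'stale row')")  # pre-existing data

# The pattern from the post: delete all existing rows first,
# then load the fresh file contents.
conn.execute("DELETE FROM employee")
rows = list(csv.DictReader(io.StringIO(csv_data)))
conn.executemany("INSERT INTO employee VALUES (:id, :name)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM employee").fetchone()[0])  # 2
```

In ADF the same effect comes from a pre-copy script (or the "Truncate table" sink option) on the Copy activity, so each run replaces rather than appends.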
Dorian Sabitov (sfappsinfo.hashnode.dev) · Nov 12, 2024
Optimizing Salesforce Data Integration: Tools and Best Practices
Introduction: Why Do You Need Data Integration? For businesses that rely on Salesforce, data integration plays a key role in ensuring that information is accurate, accessible, and actionable. Many organizations use Salesforce to manage customer dat...
Tags: blog

Shobhit Sharma for SystemDesign.blog (systemdesign.blog) · Nov 7, 2024
Running ELK Stack in a Minikube Kubernetes Environment: The Ultimate Guide
Overview of the ELK Stack (Elasticsearch, Logstash, Kibana): The ELK Stack is a popular set of tools used for searching, analyzing, and visualizing log data in real time. It is widely used in logging, monitoring, and observability use cases. Elasticsearc...
Tags: ELK, elasticsearch

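To give a flavor of what deploying the "E" of the stack on Minikube involves, here is a minimal single-node Elasticsearch Deployment manifest. This is an illustrative sketch only: the resource names, image tag, and disabled security are assumptions for a local dev cluster, not taken from the guide (which may well use Helm charts or the ECK operator instead).

```yaml
# Minimal single-node Elasticsearch Deployment for a local Minikube cluster.
# Illustrative sketch; names and settings are assumptions, not from the guide.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: elasticsearch
spec:
  replicas: 1
  selector:
    matchLabels:
      app: elasticsearch
  template:
    metadata:
      labels:
        app: elasticsearch
    spec:
      containers:
        - name: elasticsearch
          image: docker.elastic.co/elasticsearch/elasticsearch:8.15.0
          env:
            - name: discovery.type
              value: single-node          # no cluster formation on a laptop
            - name: xpack.security.enabled
              value: "false"              # dev-only convenience, never in prod
          ports:
            - containerPort: 9200         # REST API port Kibana/Logstash talk to
```

Logstash and Kibana would be deployed alongside it and pointed at port 9200 of this pod's Service.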
Arpit Tyagi (dataminds.hashnode.dev) · Oct 24, 2024
How to Transfer All SQL Database Tables to Azure Data Lake in One Go
Step 1: Create an instance of Azure Data Factory. Step 2: Set up Linked Services. Step 3: Create a dataset for both the source (SQL Database) and the destination (Azure Data Lake). Step 4: Build the data pipeline with the help of the datasets: First I will us...
Tags: Azure Data Factory, data-engineering

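The "all tables in one go" idea is: enumerate the tables, then loop a copy over each one. A minimal Python sketch of that loop, using sqlite3 and a local folder as stand-ins for the SQL Database and the Data Lake (table names and sample data are assumptions):

```python
import csv
import os
import sqlite3
import tempfile

# Stand-ins (assumptions): sqlite3 plays the SQL Database,
# a temporary folder plays the Data Lake container.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE departments (id INTEGER, title TEXT)")
conn.execute("INSERT INTO employees VALUES (1, 'Alice')")
conn.execute("INSERT INTO departments VALUES (10, 'Engineering')")
lake_dir = tempfile.mkdtemp()

# Enumerate every table (in ADF, a Lookup activity querying
# INFORMATION_SCHEMA.TABLES plays this role), then copy each
# one inside a ForEach-style loop.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    cur = conn.execute(f"SELECT * FROM {table}")
    with open(os.path.join(lake_dir, f"{table}.csv"), "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([c[0] for c in cur.description])  # header row
        writer.writerows(cur)                             # data rows

print(sorted(os.listdir(lake_dir)))  # ['departments.csv', 'employees.csv']
```

In ADF itself, the Lookup + ForEach + parameterized-dataset combination achieves the same loop without any hand-written code.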
Arpit Tyagi (dataminds.hashnode.dev) · Oct 23, 2024
How to Copy SQL Database Tables Post-Join into Azure Data Lake via ADF
Step 1: Create an instance of Azure Data Factory. Step 2: Set up Linked Services. Step 3: Create a dataset for both the source (SQL Database) and the destination (Azure Data Lake). Step 4: Build the data pipeline with the help of the datasets. Step 5: Test ...
Tags: Azure Data Factory, Azure

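Copying "post-join" just means the copy's source is a join query rather than a single table. A minimal Python sketch of that idea, with sqlite3 standing in for the SQL Database and a CSV string standing in for the landed Data Lake file (table and column names are assumptions):

```python
import csv
import io
import sqlite3

# Stand-ins (assumptions): sqlite3 plays the SQL Database; the CSV buffer
# plays the file that would land in the Data Lake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (id INTEGER, name TEXT, dept_id INTEGER);
    CREATE TABLE departments (id INTEGER, title TEXT);
    INSERT INTO employees VALUES (1, 'Alice', 10), (2, 'Bob', 20);
    INSERT INTO departments VALUES (10, 'Engineering'), (20, 'Sales');
""")

# The source of the copy is a join query, mirroring an ADF Copy activity
# configured with "Query" as its source instead of a table.
cur = conn.execute("""
    SELECT e.id, e.name, d.title
    FROM employees e JOIN departments d ON e.dept_id = d.id
""")
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow([c[0] for c in cur.description])  # header from the query
writer.writerows(cur)                             # joined rows
print(buf.getvalue())
```

The join runs in the database, so only the already-combined rows travel to the lake.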
Arpit Tyagi (dataminds.hashnode.dev) · Oct 23, 2024
Can We Copy a SQL Table to Pipe-Separated Files? Let's See!
Step 1: Create an instance of Azure Data Factory. Step 2: Set up Linked Services. Step 3: Create a dataset for both the source (SQL Database) and the destination (Azure Data Lake). This is the most important step because we will select the pipe as a delim...
Tags: Azure Data Factory, Azure

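The only thing that changes versus a standard CSV export is the delimiter character. A minimal Python sketch of a pipe-separated write (the sample rows are assumptions); in ADF this corresponds to setting the column delimiter to `|` on the delimited-text dataset:

```python
import csv
import io

rows = [("id", "name"), (1, "Alice"), (2, "Bob")]  # sample data (assumption)

buf = io.StringIO()
# delimiter="|" is the whole trick: same writer, pipe instead of comma.
writer = csv.writer(buf, delimiter="|")
writer.writerows(rows)

print(buf.getvalue())  # id|name / 1|Alice / 2|Bob, one record per line
```

Pipe delimiters are handy when field values themselves may contain commas, which would otherwise need quoting.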
Arpit Tyagi (dataminds.hashnode.dev) · Oct 23, 2024
Step-by-Step Guide to Exporting a SQL Database Table to Azure Data Lake as JSON with Azure Data Factory
Step 1: Create an instance of Azure Data Factory. Step 2: Set up Linked Services, continuing to follow the instructions shown in the images. Step 3: Create a dataset for both the SQL database and the Data Lake JSON file format. Step 4: Build the...
Tags: Azure Data Factory, Data Science
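Conceptually, a JSON export turns each table row into a JSON object. A minimal Python sketch of that mapping, again using sqlite3 as a stand-in for the SQL Database (the employee table name echoes the earlier posts; columns and rows are assumptions):

```python
import json
import sqlite3

# Stand-in (assumption): sqlite3 plays the SQL Database; the JSON string
# plays the file an ADF Copy activity would write to a JSON sink dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob")])

cur = conn.execute("SELECT * FROM employee")
cols = [c[0] for c in cur.description]
# Each row becomes one JSON object keyed by column name.
records = [dict(zip(cols, row)) for row in cur]
payload = json.dumps(records, indent=2)
print(payload)
```

ADF's JSON sink can emit either one array like this or one object per line (JSON Lines), chosen via the dataset's file-pattern setting.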