Stefen-Taime / modern-data-pipeline
Creating a modern data pipeline using a combination of Terraform, AWS Lambda and S3, Snowflake, DBT, Mage AI, and Dash.
☆14 · Updated 2 years ago
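As a rough illustration of the Lambda-to-Snowflake step such a pipeline typically includes (this is a minimal sketch, not the repository's actual code; the bucket, stage, table, warehouse, and environment-variable names below are placeholders):

```python
# Minimal sketch of an S3-triggered AWS Lambda that copies a newly landed
# file into Snowflake. All names (stage, table, warehouse, env vars) are
# hypothetical placeholders, not taken from the repository.
import os
import snowflake.connector  # pip install snowflake-connector-python


def handler(event, context):
    # S3 put-event notification: pick out the bucket and object key.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        # Assumes an external stage named RAW_STAGE already points at the bucket.
        conn.cursor().execute(
            f"COPY INTO raw_events FROM @RAW_STAGE/{key} FILE_FORMAT = (TYPE = JSON)"
        )
    finally:
        conn.close()
    return {"status": "loaded", "object": f"s3://{bucket}/{key}"}
```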
Alternatives and similar repositories for modern-data-pipeline
Users interested in modern-data-pipeline are comparing it to the repositories listed below.
- build dw with dbt ☆46 · Updated 8 months ago
- ☆41 · Updated 11 months ago
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 10 months ago
- The goal of this project is to offer an AWS EMR template using Spot Fleet and On-Demand Instances that you can use quickly. Just focus on… ☆27 · Updated 3 years ago
- A modern ELT demo using airbyte, dbt, snowflake and dagster ☆28 · Updated 2 years ago
- A collection of data engineering projects: data modeling, ETL pipelines, data lakes, infrastructure configuration on AWS, data warehousin… ☆15 · Updated 4 years ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆35 · Updated last year
- ☆10 · Updated 3 years ago
- Full stack data engineering tools and infrastructure set-up ☆53 · Updated 4 years ago
- Code for "Advanced data transformations in SQL" free live workshop ☆82 · Updated last month
- ☆18 · Updated 10 months ago
- Simplified ETL process in Hadoop using Apache Spark. Has a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- This repository will help you to learn Databricks concepts with the help of examples. It will include all the important topics which… ☆99 · Updated 10 months ago
- Developed an ETL pipeline for a Data Lake that extracts data from S3, processes the data using Spark, and loads the data back into S3 as … ☆16 · Updated 5 years ago
- Repository for Data Engineering Interview Series ☆32 · Updated 8 months ago
- In this project, we set up end-to-end data engineering using Apache Spark, Azure Databricks, and Data Build Tool (DBT), using Azure as our … ☆32 · Updated last year
- End-to-End ELT data pipeline with Postgres, Airbyte, dbt, Dagster, Snowflake and Metabase ☆11 · Updated last year
- End-to-end data engineering project ☆56 · Updated 2 years ago
- Cloned by the `dbt init` task ☆60 · Updated last year
- This is a real-life, high-throughput streaming ELT data pipeline for ecommerce ☆13 · Updated 2 years ago
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for Batch processing; K… ☆66 · Updated last week
- PySpark Cheatsheet ☆36 · Updated 2 years ago
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- This repo contains "Databricks Certified Data Engineer Professional" Questions and related docs. ☆81 · Updated 10 months ago
- This project focuses on building a robust data pipeline using Apache Airflow to automate the ingestion of weather data from the OpenWeath… ☆21 · Updated 2 years ago
- Code for blog at: https://www.startdataengineering.com/post/docker-for-de/ ☆38 · Updated last year
- Building a Data Lakehouse with open-source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the Lakehouse, visualize a… ☆30 · Updated last year
- Simple ETL pipeline using Python ☆26 · Updated 2 years ago
- An End-to-End ETL data pipeline that leverages pyspark parallel processing to process about 25 million rows of data coming from a SaaS ap… ☆25 · Updated 2 years ago
- Dockerizing an Apache Spark Standalone Cluster ☆43 · Updated 2 years ago