thanhENC / e2e-data-platform
End-to-end data platform: a PoC data platform project built on the modern data stack (Spark, Airflow, dbt, Trino, Lightdash, Hive Metastore, MinIO, Postgres)
☆43 · Updated 11 months ago
Alternatives and similar repositories for e2e-data-platform
Users interested in e2e-data-platform are comparing it to the repositories listed below.
- Local Environment to Practice Data Engineering ☆143 · Updated 8 months ago
- Code snippets for the Data Engineering Design Patterns book ☆182 · Updated 5 months ago
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 2 months ago
- In this repository we store all materials for dlt workshops, courses, etc. ☆225 · Updated last week
- Build a data warehouse with dbt ☆47 · Updated 10 months ago
- Code for dbt tutorial ☆161 · Updated last week
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆40 · Updated last year
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆78 · Updated 2 years ago
- Code for the "Efficient Data Processing in Spark" course ☆338 · Updated 3 months ago
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing ☆276 · Updated last year
- ☆156 · Updated 3 weeks ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆149 · Updated last year
- Sample project to demonstrate data engineering best practices ☆196 · Updated last year
- Sample data lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino, and a Hive Metastore. Can be used for local testin… ☆74 · Updated 2 years ago
- End-to-end data platform leveraging the modern data stack ☆51 · Updated last year
- Code for the blog post at https://www.startdataengineering.com/post/docker-for-de/ ☆39 · Updated last year
- End-to-end data engineering project ☆57 · Updated 2 years ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆64 · Updated 4 months ago
- ☆209 · Updated 7 months ago
- Simple stream processing pipeline ☆108 · Updated last year
- End-to-end data pipeline that ingests, processes, and stores data. It uses Apache Airflow to schedule scripts that fetch data from an API… ☆19 · Updated last year
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated last week
- ☆120 · Updated last month
- Notebooks to learn the Databricks Lakehouse Platform ☆35 · Updated 3 weeks ago
- Code for the "Efficient Data Processing in SQL" book ☆59 · Updated last year
- A demonstration of an ELT (Extract, Load, Transform) pipeline ☆30 · Updated last year
- Code for the "Advanced data transformations in SQL" free live workshop ☆84 · Updated 4 months ago
- A curated list of awesome public dbt projects ☆148 · Updated last year
- A tutorial for the Great Expectations library ☆71 · Updated 4 years ago
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated last month