ongxuanhong / de03-trino-dbt-spark-everything-everywhere-all-at-once
☆16 · Updated last year
Alternatives and similar repositories for de03-trino-dbt-spark-everything-everywhere-all-at-once
Users interested in de03-trino-dbt-spark-everything-everywhere-all-at-once are comparing it to the libraries listed below.
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆35 · Updated last year
- ☆14 · Updated 2 years ago
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆58 · Updated last year
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆72 · Updated last year
- Code for dbt tutorial ☆156 · Updated 3 weeks ago
- End-to-end data platform leveraging the modern data stack ☆49 · Updated last year
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆62 · Updated last month
- Open Data Stack Projects: Examples of End-to-End Data Engineering Projects ☆84 · Updated 2 years ago
- Code snippets for the Data Engineering Design Patterns book ☆119 · Updated 3 months ago
- End-to-end data engineering project ☆56 · Updated 2 years ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆77 · Updated 2 years ago
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 10 months ago
- ☆15 · Updated 2 years ago
- In this project, we set up an end-to-end data engineering pipeline using Apache Spark, Azure Databricks, Data Build Tool (DBT), with Azure as our … ☆32 · Updated last year
- Simple stream processing pipeline ☆102 · Updated last year
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆142 · Updated 11 months ago
- Data lake, data warehouse on GCP ☆56 · Updated 3 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 9 months ago
- ☆18 · Updated 10 months ago
- End-to-end data platform: a PoC data platform project utilizing the modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆41 · Updated 8 months ago
- Creation of a data lakehouse and an ELT pipeline to enable the efficient analysis and use of data ☆46 · Updated last year
- Build a data warehouse with dbt ☆46 · Updated 8 months ago
- Building a data lakehouse with open-source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the lakehouse; visualize a… ☆30 · Updated last year
- Code for the blog post at https://www.startdataengineering.com/post/docker-for-de/ ☆38 · Updated last year
- ☆18 · Updated last year
- A custom end-to-end analytics platform for customer churn ☆12 · Updated last month
- A portable datamart and business intelligence suite built with Docker, Airflow, dbt, PostgreSQL and Superset ☆43 · Updated 7 months ago
- Full-stack data engineering tools and infrastructure setup ☆53 · Updated 4 years ago
- Building a data pipeline with an open-source stack ☆55 · Updated 11 months ago