harrydevforlife / building-lakehouse
Building a data lakehouse with open-source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the lakehouse, plus visualization and a recommendation app.
☆20 · Updated 10 months ago
Alternatives and similar repositories for building-lakehouse:
Users interested in building-lakehouse are comparing it to the repositories listed below.
- build dw with dbt ☆36 · Updated 3 months ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆31 · Updated 11 months ago
- ☆15 · Updated last year
- ☆17 · Updated 6 months ago
- A custom end-to-end analytics platform for customer churn ☆10 · Updated 3 weeks ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆64 · Updated 4 months ago
- Sample code to collect Apache Iceberg metrics for table monitoring ☆24 · Updated 6 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino, and a Hive Metastore. Can be used for local testin… ☆60 · Updated last year
- A portable Datamart and Business Intelligence suite built with Docker, Airflow, dbt, PostgreSQL, and Superset ☆38 · Updated 3 months ago
- dbt (data build tool) projects targeting AWS analytics services (Redshift, Glue, EMR, Athena) and open table formats ☆29 · Updated last year
- Building a Data Pipeline with an Open Source Stack ☆45 · Updated 7 months ago
- Code snippets for the Data Engineering Design Patterns book ☆69 · Updated 2 weeks ago
- Code for my "Efficient Data Processing in SQL" book ☆56 · Updated 6 months ago
- Delta Lake, ETL, Spark, Airflow ☆46 · Updated 2 years ago
- ☆15 · Updated last year
- Quick Guides from Dremio on several topics ☆67 · Updated last month
- Open-source stack lakehouse ☆25 · Updated 11 months ago
- 📡 Real-time data pipeline with Kafka, Flink, Iceberg, Trino, MinIO, and Superset. Ideal for learning data systems. ☆36 · Updated last month
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆53 · Updated 4 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆63 · Updated last year
- Library to convert dbt manifest metadata to Airflow tasks ☆48 · Updated 11 months ago
- Cost-Efficient Data Pipelines with DuckDB ☆49 · Updated 6 months ago
- A write-audit-publish implementation on a data lake without the JVM ☆46 · Updated 6 months ago
- A demonstration of an ELT (Extract, Load, Transform) pipeline ☆29 · Updated last year
- Apache Hive Metastore as a standalone server in Docker ☆68 · Updated 5 months ago
- Cloned by the `dbt init` task ☆60 · Updated 9 months ago
- Creating a modern data pipeline using a combination of Terraform, AWS Lambda and S3, Snowflake, dbt, Mage AI, and Dash ☆14 · Updated last year
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, dbt, Airflow, Kafka, Debezium CDC) ☆50 · Updated last year
- A simple Data Engineering solution for testing or education purposes. You only need to know SQL and Python to understand this project. Da… ☆25 · Updated 2 years ago