harrydevforlife / building-lakehouse
Building a data lakehouse with open-source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the lakehouse, with visualization and a recommendation app.
☆34 · Updated this week
Alternatives and similar repositories for building-lakehouse
Users interested in building-lakehouse are comparing it to the libraries listed below.
- build dw with dbt ☆49 · Updated last year
- Repo for everything open table formats (Iceberg, Hudi, Delta Lake) and the overall Lakehouse architecture ☆125 · Updated last month
- Cost Efficient Data Pipelines with DuckDB ☆60 · Updated 7 months ago
- Installer for DataKitchen's Open Source Data Observability Products. Data breaks. Servers break. Your toolchain breaks. Ensure your team … ☆129 · Updated last month
- Open source stack lakehouse ☆25 · Updated last year
- Quick Guides from Dremio on Several topics ☆79 · Updated last month
- Delta-Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- Data Product Portal created by Dataminded ☆196 · Updated last week
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆44 · Updated last year
- Code snippets for Data Engineering Design Patterns book ☆294 · Updated 9 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆75 · Updated 2 years ago
- New generation opensource data stack ☆76 · Updated 3 years ago
- Code for dbt tutorial ☆165 · Updated 3 months ago
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆257 · Updated last week
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆277 · Updated 2 months ago
- End-to-end data platform: A PoC Data Platform project utilizing modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆47 · Updated last year
- End-to-end data platform leveraging the Modern data stack ☆52 · Updated last year
- Delta Lake examples ☆235 · Updated last year
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated last year
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 5 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within codespaces. ☆79 · Updated 2 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆79 · Updated this week
- A portable Datamart and Business Intelligence suite built with Docker, Airflow, dbt, duckdb and Superset ☆46 · Updated last week
- A Docker Compose template that builds an interactive development environment for PySpark with Jupyter Lab, MinIO as object storage, Hive M… ☆47 · Updated last year
- ☆80 · Updated last year
- A demonstration of an ELT (Extract, Load, Transform) pipeline ☆31 · Updated last year
- A write-audit-publish implementation on a data lake without the JVM ☆45 · Updated last year
- A portable Datamart and Business Intelligence suite built with Docker, sqlmesh + dbtcore, DuckDB and Superset ☆55 · Updated 2 months ago
- Cloned by the `dbt init` task ☆62 · Updated last year
- 📡 Real-time data pipeline with Kafka, Flink, Iceberg, Trino, MinIO, and Superset. Ideal for learning data systems. ☆58 · Updated 11 months ago