dominikhei / Local-Data-LakeHouse
A sample data lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino, and a Hive Metastore. Can be used for local testing.
☆72 · Updated last year
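For orientation, the sketch below shows one way to talk to such a stack from Python once the containers are up: Trino serves SQL over Iceberg tables whose metadata lives in the Hive Metastore and whose data files sit in MinIO. It uses the `trino` client package; the host, port, user, and the `iceberg`/`default` catalog and schema names are assumptions for illustration and may differ from the repository's actual configuration.

```python
# Minimal sketch of querying the lakehouse from Python via Trino.
# Host, port, user, and the catalog/schema names are assumptions,
# not taken from the repository's actual docker-compose setup.
from trino.dbapi import connect

conn = connect(
    host="localhost",   # assumed Trino coordinator exposed on the host
    port=8080,          # assumed default Trino HTTP port
    user="admin",       # Trino accepts any user name when auth is disabled
    catalog="iceberg",  # assumed Iceberg catalog backed by the Hive Metastore
    schema="default",
)
cur = conn.cursor()

# Create an Iceberg table; data files are written to MinIO via the S3 API.
cur.execute("""
    CREATE TABLE IF NOT EXISTS events (
        event_id BIGINT,
        event_type VARCHAR,
        event_ts TIMESTAMP(6)
    ) WITH (format = 'PARQUET')
""")

cur.execute("INSERT INTO events VALUES (1, 'page_view', TIMESTAMP '2024-01-01 00:00:00')")
cur.execute("SELECT event_id, event_type FROM events")
print(cur.fetchall())
```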
Alternatives and similar repositories for Local-Data-LakeHouse
Users interested in Local-Data-LakeHouse are comparing it to the repositories listed below.
- Code for dbt tutorial ☆156 · Updated 2 weeks ago
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆117 · Updated 2 months ago
- Quick Guides from Dremio on Several topics ☆71 · Updated 3 weeks ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 9 months ago
- Quickstart for any service ☆154 · Updated this week
- Simple stream processing pipeline ☆102 · Updated last year
- Delta Lake examples ☆225 · Updated 8 months ago
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆35 · Updated last year
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆142 · Updated 11 months ago
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆232 · Updated 4 months ago
- End to end data engineering project ☆56 · Updated 2 years ago
- New generation opensource data stack ☆68 · Updated 3 years ago
- A demonstration of an ELT (Extract, Load, Transform) pipeline ☆29 · Updated last year
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated 9 months ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆58 · Updated last year
- This repository serves as a comprehensive guide to effective data modeling and robust data quality assurance using popular open-source to… ☆30 · Updated last year
- build dw with dbt ☆46 · Updated 8 months ago
- A Python Library to support running data quality rules while the spark job is running⚡ ☆188 · Updated last week
- Dagster University courses ☆88 · Updated 2 weeks ago
- Delta Lake Documentation ☆49 · Updated last year
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆254 · Updated 4 months ago
- Apache Hive Metastore as a Standalone server in Docker ☆78 · Updated 10 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆199 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆62 · Updated last month
- A template repository to create a data project with IAC, CI/CD, Data migrations, & testing ☆265 · Updated 11 months ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆116 · Updated 2 years ago
- ☆80 · Updated 8 months ago
- Repo for everything open table formats (Iceberg, Hudi, Delta Lake) and the overall Lakehouse architecture ☆83 · Updated this week
- Delta Lake helper methods in PySpark ☆326 · Updated 9 months ago