dominikhei / Local-Data-LakeHouse
Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testing.
☆63 · Updated last year
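To show how a stack like this is typically exercised once the containers are up, here is a minimal sketch using the trino Python client. It is not code from this repository: the host `localhost`, port `8080`, user `admin`, and the catalog/schema names `iceberg.demo` are assumptions about the local setup and may need to be adjusted to match the actual docker-compose configuration.

```python
# Minimal sketch, not the repository's own code. Assumes Trino is reachable
# on localhost:8080 and that an Iceberg catalog named "iceberg" is wired to
# the Hive Metastore and MinIO; adjust names/ports to match your compose file.
import trino

conn = trino.dbapi.connect(
    host="localhost",
    port=8080,
    user="admin",
    catalog="iceberg",
    schema="default",
)
cur = conn.cursor()

# Create a schema and an Iceberg table stored as Parquet on MinIO.
# fetchall() is called after each statement so it runs to completion.
cur.execute("CREATE SCHEMA IF NOT EXISTS iceberg.demo")
cur.fetchall()
cur.execute(
    "CREATE TABLE IF NOT EXISTS iceberg.demo.events ("
    "event_id BIGINT, payload VARCHAR"
    ") WITH (format = 'PARQUET')"
)
cur.fetchall()

# Insert one row and read it back to verify the stack end to end.
cur.execute("INSERT INTO iceberg.demo.events VALUES (1, 'hello lakehouse')")
cur.fetchall()
cur.execute("SELECT event_id, payload FROM iceberg.demo.events")
print(cur.fetchall())
```

The client is installed with `pip install trino`; fetching each statement's results before submitting the next one ensures the DDL and insert have finished executing on the Trino coordinator.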
Alternatives and similar repositories for Local-Data-LakeHouse:
Users interested in Local-Data-LakeHouse are comparing it to the libraries listed below
- Code for dbt tutorial ☆153 · Updated 9 months ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆132 · Updated 8 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆64 · Updated 6 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆193 · Updated last month
- Quick Guides from Dremio on Several topics ☆69 · Updated 2 months ago
- Scalefree's dbt package for a Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt including… ☆151 · Updated this week
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆18 · Updated 6 months ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆59 · Updated 5 months ago
- Showcase of advanced use cases relating to CI in dbt ☆74 · Updated last week
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆32 · Updated last year
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆222 · Updated last month
- Delta Lake examples ☆218 · Updated 5 months ago
- A curated list of awesome public DBT projects ☆118 · Updated last year
- 🥪🦘 An open source sandbox project exploring dbt workflows via a fictional sandwich shop's data. ☆153 · Updated last month
- Code snippets for Data Engineering Design Patterns book ☆74 · Updated last month
- Code for "Efficient Data Processing in Spark" Course ☆287 · Updated 5 months ago
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆238 · Updated last month
- End to end data engineering project ☆53 · Updated 2 years ago
- A Microsoft Power BI Custom Connector allowing you to import Trino data into Power BI. ☆67 · Updated 2 months ago
- Dagster University courses ☆69 · Updated this week
- A repository of sample code to accompany our blog post on Airflow and dbt. ☆170 · Updated last year
- Apache Hive Metastore as a Standalone server in Docker ☆68 · Updated 7 months ago
- Trino dbt demo project to mix and load BigQuery data with and in a local PostgreSQL database ☆72 · Updated 3 years ago
- ☆74 · Updated 5 months ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆116 · Updated 2 years ago
- Example repository showing how to build a data platform with Prefect, dbt and Snowflake ☆99 · Updated 2 years ago
- Sample project to demonstrate data engineering best practices ☆181 · Updated last year
- A lightweight Python-based tool for extracting and analyzing data column lineage for dbt projects ☆143 · Updated 3 months ago
- dbt Cloud command line interface (CLI) ☆75 · Updated last year