ongxuanhong / de03-trino-dbt-spark-everything-everywhere-all-at-once
☆17 · Updated last year
Alternatives and similar repositories for de03-trino-dbt-spark-everything-everywhere-all-at-once
Users interested in de03-trino-dbt-spark-everything-everywhere-all-at-once are comparing it to the repositories listed below.
- Code for dbt tutorial ☆162 · Updated last month
- Building a Data Pipeline with an Open Source Stack ☆54 · Updated 3 months ago
- Local Environment to Practice Data Engineering ☆141 · Updated 9 months ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆40 · Updated last year
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆150 · Updated last year
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated 2 months ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆65 · Updated 5 months ago
- Docker with Airflow and Spark standalone cluster ☆260 · Updated 2 years ago
- End-to-end data engineering project ☆57 · Updated 2 years ago
- ☆40 · Updated 2 years ago
- Simple stream processing pipeline ☆110 · Updated last year
- Code snippets for Data Engineering Design Patterns book ☆232 · Updated 7 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆74 · Updated 2 years ago
- ☆90 · Updated 8 months ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆61 · Updated 2 years ago
- Sample project to demonstrate data engineering best practices ☆198 · Updated last year
- ☆160 · Updated last month
- Build a data warehouse with dbt ☆46 · Updated 11 months ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆121 · Updated 2 years ago
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- End-to-end data platform: A PoC Data Platform project utilizing a modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆44 · Updated last year
- In this project, we set up end-to-end data engineering using Apache Spark, Azure Databricks, and Data Build Tool (DBT), with Azure as our … ☆35 · Updated last year
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆268 · Updated last week
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆74 · Updated last week
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing ☆279 · Updated last year
- A repository of sample code to accompany our blog post on Airflow and dbt ☆178 · Updated 2 years ago
- The resources of the preparation course for the Databricks Data Engineer Professional certification exam ☆140 · Updated 3 months ago
- This repository contains the code for a real-time election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆41 · Updated last year
- End-to-end data platform leveraging the Modern Data Stack ☆51 · Updated last year
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow ☆209 · Updated this week