ongxuanhong / de02-pyspark-optimization
☆14 · Updated 2 years ago
Alternatives and similar repositories for de02-pyspark-optimization
Users interested in de02-pyspark-optimization are comparing it to the repositories listed below.
- Code for dbt tutorial ☆157 · Updated last year
- Simple stream processing pipeline ☆103 · Updated 11 months ago
- ☆16 · Updated last year
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆71 · Updated last year
- End to end data engineering project ☆56 · Updated 2 years ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆58 · Updated last year
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆140 · Updated 10 months ago
- Code snippets for Data Engineering Design Patterns book ☆116 · Updated 2 months ago
- Creation of a data lakehouse and an ELT pipeline to enable the efficient analysis and use of data ☆46 · Updated last year
- Delta Lake examples ☆225 · Updated 7 months ago
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Trino dbt demo project to mix and load BigQuery data with, and into, a local PostgreSQL database ☆75 · Updated 3 years ago
- Quick guides from Dremio on several topics ☆71 · Updated last week
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 8 months ago
- ☆38 · Updated 2 years ago
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for Batch processing; K… ☆65 · Updated last week
- Realtime Data Engineering Project ☆30 · Updated 4 months ago
- Local Environment to Practice Data Engineering ☆142 · Updated 5 months ago
- Building a Modern Data Lake with Minio, Spark, Airflow via Docker ☆20 · Updated last year
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆19 · Updated 8 months ago
- A custom end-to-end analytics platform for customer churn ☆12 · Updated 3 weeks ago
- Building a Data Pipeline with an Open Source Stack ☆54 · Updated 11 months ago
- This project shows how to capture changes from a Postgres database and stream them into Kafka ☆36 · Updated last year
- Dagster University courses ☆85 · Updated last week
- End-to-end data platform: A PoC Data Platform project utilizing modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆40 · Updated 7 months ago
- Sample project to demonstrate data engineering best practices ☆191 · Updated last year
- Course notes for the Astronomer Certification DAG Authoring for Apache Airflow ☆53 · Updated last year
- A repository of sample code to show data quality checking best practices using Airflow ☆77 · Updated 2 years ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment ☆38 · Updated 4 years ago
- ☆85 · Updated 4 months ago