ongxuanhong / de02-pyspark-optimization
☆14 · Updated 2 years ago
Alternatives and similar repositories for de02-pyspark-optimization
Users interested in de02-pyspark-optimization are comparing it to the repositories listed below.
- ☆16 · Updated last year
- Code for dbt tutorial ☆156 · Updated 3 weeks ago
- Simple stream processing pipeline ☆102 · Updated last year
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆72 · Updated last year
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆58 · Updated last year
- Code snippets for Data Engineering Design Patterns book ☆119 · Updated 3 months ago
- This project demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… ☆44 · Updated last year
- End-to-end data engineering project ☆56 · Updated 2 years ago
- Delta Lake examples ☆225 · Updated 8 months ago
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated 9 months ago
- Creation of a data lakehouse and an ELT pipeline to enable the efficient analysis and use of data ☆46 · Updated last year
- A repository of sample code to show data quality checking best practices using Airflow. ☆77 · Updated 2 years ago
- In this project, we set up end-to-end data engineering using Apache Spark, Azure Databricks, Data Build Tool (DBT), using Azure as our … ☆32 · Updated last year
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆91 · Updated 5 years ago
- Local Environment to Practice Data Engineering ☆142 · Updated 5 months ago
- Repo for everything open table formats (Iceberg, Hudi, Delta Lake) and the overall Lakehouse architecture ☆83 · Updated this week
- End-to-end data platform: A PoC Data Platform project utilizing a modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆41 · Updated 8 months ago
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 11 months ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆33 · Updated 4 years ago
- ☆26 · Updated last year
- Near real-time ETL to populate a dashboard. ☆72 · Updated last year
- A custom end-to-end analytics platform for customer churn ☆12 · Updated last month
- This project shows how to capture changes from a Postgres database and stream them into Kafka ☆36 · Updated last year
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for Batch processing; K… ☆66 · Updated last week
- This repository contains the code for a real-time election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆41 · Updated last year
- Delta Lake Documentation ☆49 · Updated last year
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆35 · Updated last year
- 📡 Real-time data pipeline with Kafka, Flink, Iceberg, Trino, MinIO, and Superset. Ideal for learning data systems. ☆45 · Updated 5 months ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆38 · Updated 4 years ago