pyjaime / docker-airflow-spark
Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks
☆ 21 · Updated 2 years ago
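The image bundles a JDK so Airflow can hand jobs to the Spark cluster with spark-submit. As a rough illustration (not taken from this repository's code), an Airflow DAG could drive that with the SparkSubmitOperator from the apache-airflow-providers-apache-spark package; the DAG name, application path, and connection id below are hypothetical:

```python
# Minimal sketch of a DAG that hands a PySpark job to the Spark cluster via
# spark-submit. Paths, the connection id, and the Airflow/Spark versions
# are assumptions, not taken from this repository.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule=None,                      # trigger manually while testing
    catchup=False,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_pyspark_job",
        application="/usr/local/spark/app/example_job.py",  # hypothetical path inside the container
        conn_id="spark_default",        # Airflow connection pointing at the Spark master
        verbose=True,
    )
```

The `spark_default` connection would typically point at the Spark master URL (for example `spark://spark-master:7077`) and is usually configured through the Airflow UI or an environment variable.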
Related projects
Alternatives and complementary repositories for docker-airflow-spark
- A course by DataTalks Club that covers Spark, Kafka, Docker, Airflow, Terraform, dbt, BigQuery, etc. ☆ 11 · Updated 2 years ago
- Delta Lake, ETL, Spark, Airflow. ☆ 44 · Updated 2 years ago
- Produces Kafka messages, consumes them, and loads them into Cassandra and MongoDB. ☆ 37 · Updated last year
- Docker with Airflow and a Spark standalone cluster. ☆ 242 · Updated last year
- Builds a modern data lake with MinIO, Spark, and Airflow via Docker. ☆ 15 · Updated 5 months ago
- End-to-end data engineering project. ☆ 49 · Updated 2 years ago
- A self-contained, ready-to-run Airflow ELT project; can be run locally or within Codespaces. ☆ 58 · Updated last year
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆ 39 · Updated 3 years ago
- Near-real-time ETL to populate a dashboard. ☆ 70 · Updated 4 months ago
- Ultimate guide for mastering Spark performance tuning and optimization concepts and preparing for data engineering interviews. ☆ 67 · Updated 5 months ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres, and Docker. ☆ 64 · Updated 3 months ago
- The goal of this project is to build a Docker cluster that gives access to Hadoop, HDFS, Hive, PySpark, Sqoop, Airflow, Kafka, Flume, Pos… ☆ 53 · Updated last year
- Simple stream processing pipeline. ☆ 91 · Updated 4 months ago
- Code for my "Efficient Data Processing in SQL" book. ☆ 49 · Updated 3 months ago
- Example repo for creating end-to-end tests for a data pipeline. ☆ 21 · Updated 4 months ago
- Gets data from an API, runs a scheduled script with Airflow, sends the data to Kafka, consumes it with Spark, then writes to Cassandra. ☆ 128 · Updated last year
- Project for the "Data pipeline design patterns" blog. ☆ 41 · Updated 3 months ago
- Writes a CSV file to Postgres, reads the table, and modifies it; writes more tables to Postgres with Airflow. ☆ 35 · Updated last year
- Code for a real-time election voting system built with Python, Kafka, Spark Streaming, Postgr… ☆ 28 · Updated 10 months ago
- Simple ETL pipeline using Python. ☆ 20 · Updated last year
- PySpark functions and utilities with examples; assists the ETL process of data modeling. ☆ 99 · Updated 3 years ago
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆ 18 · Updated last month
- 😈 Complete end-to-end ETL pipeline with Spark, Airflow, and AWS. ☆ 43 · Updated 5 years ago