pyjaime / docker-airflow-spark
Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks
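The headline feature here is spark-submit support from inside the Airflow container. A minimal sketch of what that amounts to is shown below; the helper name, master URL, and application path are illustrative assumptions, not taken from the repo:

```python
import subprocess


def build_spark_submit_cmd(app_path, master="spark://spark:7077", conf=None):
    """Assemble a spark-submit command line as a list of argv tokens.

    The master URL assumes a standalone Spark cluster reachable at the
    Docker Compose service name `spark` (an assumption, not repo fact).
    """
    cmd = ["spark-submit", "--master", master]
    for key, value in (conf or {}).items():
        cmd += ["--conf", f"{key}={value}"]
    cmd.append(app_path)
    return cmd


cmd = build_spark_submit_cmd(
    "/usr/local/spark/app/hello.py",  # hypothetical app path
    conf={"spark.executor.memory": "1g"},
)
print(" ".join(cmd))
# In an Airflow task this list could be passed to subprocess.run(cmd),
# or replaced entirely by the provider's SparkSubmitOperator.
```

Since the JDK is installed in the Airflow image, the same command works from a BashOperator, a PythonOperator calling `subprocess.run`, or the `apache-airflow-providers-apache-spark` operator.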
☆24 · Updated 3 years ago
Alternatives and similar repositories for docker-airflow-spark
Users interested in docker-airflow-spark are comparing it to the repositories listed below.
- Delta-Lake, ETL, Spark, Airflow · ☆47 · Updated 2 years ago
- ☆39 · Updated 2 years ago
- Project for the "Data pipeline design patterns" blog. · ☆45 · Updated 10 months ago
- Docker with Airflow and a Spark standalone cluster. · ☆258 · Updated last year
- Produces Kafka messages, consumes them, and loads them into Cassandra and MongoDB. · ☆42 · Updated last year
- End-to-end data engineering project. · ☆56 · Updated 2 years ago
- Building a Modern Data Lake with MinIO, Spark, and Airflow via Docker. · ☆20 · Updated last year
- Simple stream-processing pipeline. · ☆102 · Updated last year
- Code snippets for the Data Engineering Design Patterns book. · ☆119 · Updated 3 months ago
- Writes a CSV file to Postgres, reads the table, and modifies it; writes more tables to Postgres with Airflow. · ☆36 · Updated last year
- ☆41 · Updated 11 months ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. · ☆38 · Updated 4 years ago
- Near-real-time ETL to populate a dashboard. · ☆72 · Updated last year
- Code for my "Efficient Data Processing in SQL" book. · ☆56 · Updated 10 months ago
- Creates streaming data, sends it to Kafka, transforms it with PySpark, and loads it into Elasticsearch and MinIO. · ☆62 · Updated last year
- Example repo for creating end-to-end tests for a data pipeline. · ☆25 · Updated last year
- A self-contained, ready-to-run Airflow ELT project; can be run locally or within Codespaces. · ☆71 · Updated last year
- ☆28 · Updated last year
- A DataTalks Club course covering Spark, Kafka, Docker, Airflow, Terraform, dbt, BigQuery, etc. · ☆13 · Updated 3 years ago
- Code for a dbt tutorial. · ☆156 · Updated 3 weeks ago
- Local Environment to Practice Data Engineering. · ☆142 · Updated 5 months ago
- ☆35 · Updated 2 years ago
- Gets data from an API, runs a scheduled script with Airflow, sends data to Kafka, consumes it with Spark, then writes to Cassandra. · ☆140 · Updated last year
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… · ☆20 · Updated 9 months ago
- Sample project to demonstrate data engineering best practices. · ☆194 · Updated last year
- Code for a real-time election voting system built using Python, Kafka, Spark Streaming, Postgr… · ☆41 · Updated last year
- Builds a data warehouse with dbt. · ☆46 · Updated 8 months ago
- An end-to-end ETL data pipeline that leverages PySpark parallel processing to process about 25 million rows of data coming from a SaaS ap… · ☆25 · Updated 2 years ago
- Demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… · ☆44 · Updated last year
- ☆21 · Updated 2 years ago