pyjaime / docker-airflow-spark
Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks
☆24 · Updated 3 years ago
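As a rough illustration of how such a stack is typically wired together, here is a hypothetical `docker-compose` sketch with an Airflow scheduler backed by Postgres, a standalone Spark master/worker pair reachable for `spark-submit`, and a Jupyter service. All image names, versions, ports, and credentials below are illustrative assumptions, not the repo's actual files.

```yaml
# Hypothetical sketch only — not copied from pyjaime/docker-airflow-spark.
services:
  postgres:                      # Airflow metadata database
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  airflow:                       # webserver + scheduler; DAGs call spark-submit
    image: apache/airflow:2.7.3
    depends_on: [postgres]
    environment:
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    ports: ["8080:8080"]

  spark-master:                  # standalone cluster master
    image: bitnami/spark:3.4
    environment:
      SPARK_MODE: master
    ports: ["7077:7077", "8081:8080"]

  spark-worker:                  # worker registers with the master
    image: bitnami/spark:3.4
    depends_on: [spark-master]
    environment:
      SPARK_MODE: worker
      SPARK_MASTER_URL: spark://spark-master:7077

  jupyter:                       # notebooks for interactive PySpark work
    image: jupyter/pyspark-notebook
    ports: ["8888:8888"]
```

With a layout like this, an Airflow DAG can submit jobs to the cluster by pointing `spark-submit --master spark://spark-master:7077` at the master service name on the shared Compose network.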
Alternatives and similar repositories for docker-airflow-spark
Users interested in docker-airflow-spark are comparing it to the repositories listed below.
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces.☆69 · Updated last year
- Delta Lake, ETL, Spark, Airflow☆47 · Updated 2 years ago
- ☆38 · Updated 2 years ago
- ☆40 · Updated 10 months ago
- End-to-end data engineering project☆56 · Updated 2 years ago
- A course by DataTalks Club that covers Spark, Kafka, Docker, Airflow, Terraform, dbt, BigQuery, etc.☆13 · Updated 3 years ago
- Code for my "Efficient Data Processing in SQL" book.☆56 · Updated 9 months ago
- Code for a dbt tutorial☆157 · Updated last year
- Code snippets for the "Data Engineering Design Patterns" book☆116 · Updated 2 months ago
- ☆28 · Updated last year
- Building a Modern Data Lake with MinIO, Spark, and Airflow via Docker.☆20 · Updated last year
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO☆60 · Updated last year
- Write a CSV file to Postgres, then read and modify the table; write more tables to Postgres with Airflow.☆35 · Updated last year
- This repo contains "Databricks Certified Data Engineer Professional" questions and related docs.☆74 · Updated 9 months ago
- Data engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for batch processing; K…☆65 · Updated last week
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB.☆42 · Updated last year
- A repository of sample code showing data quality checking best practices using Airflow.☆77 · Updated 2 years ago
- Simple stream processing pipeline☆103 · Updated 11 months ago
- Sample project to demonstrate data engineering best practices☆191 · Updated last year
- Simple ETL pipeline using Python☆26 · Updated 2 years ago
- Local environment to practice data engineering☆142 · Updated 5 months ago
- Docker with Airflow and a Spark standalone cluster☆256 · Updated last year
- Provides a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a…☆35 · Updated last year
- Building a data pipeline with an open-source stack☆54 · Updated 11 months ago
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems☆53 · Updated last year
- ☆16 · Updated last year
- Execution of dbt models using Apache Airflow through Docker Compose☆117 · Updated 2 years ago
- Project for the "Data pipeline design patterns" blog post.☆45 · Updated 10 months ago
- ☆43 · Updated 3 years ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment.☆38 · Updated 4 years ago