ayyoubmaul / hadoop-docker
☆21 · Updated 8 months ago
Alternatives and similar repositories for hadoop-docker
Users interested in hadoop-docker are comparing it to the repositories listed below.
- Docker with Airflow and Spark standalone cluster ☆262 · Updated 2 years ago
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆500 · Updated 3 weeks ago
- ☆40 · Updated 2 years ago
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- ☆92 · Updated 9 months ago
- Near-real-time ETL to populate a dashboard. ☆73 · Updated 2 months ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆34 · Updated 5 years ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- Code for Data Pipelines with Apache Airflow ☆808 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆63 · Updated 2 years ago
- A template repository to create a data project with IaC, CI/CD, data migrations, and testing ☆281 · Updated last year
- This repo contains a Spark standalone cluster on Docker for anyone who wants to play with PySpark by submitting their applications. ☆37 · Updated 2 years ago
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆480 · Updated last year
- Multi-container environment with Hadoop, Spark and Hive ☆226 · Updated 6 months ago
- Code snippets for the Data Engineering Design Patterns book ☆275 · Updated 8 months ago
- Code for the dbt tutorial ☆165 · Updated 2 months ago
- Sample project to demonstrate data engineering best practices ☆200 · Updated last year
- Building a Modern Data Lake with MinIO, Spark and Airflow via Docker. ☆22 · Updated last year
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆64 · Updated 2 years ago
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 5 months ago
- Tutorial for setting up a Spark cluster running inside Docker containers located on different machines ☆134 · Updated 3 years ago
- Delta Lake examples ☆233 · Updated last year
- A repository of sample code showing data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- Course notes for the Astronomer Certification: DAG Authoring for Apache Airflow ☆56 · Updated last year
- Trino + Hive + MinIO with Postgres in Docker Compose ☆27 · Updated 2 years ago
- Simple stream processing pipeline ☆110 · Updated last year
- Local environment to practice data engineering ☆143 · Updated 10 months ago
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆290 · Updated 9 months ago
- Resources for the preparation course for the Databricks Data Engineer Professional certification exam ☆153 · Updated last month