ayyoubmaul / hadoop-docker
☆21, updated 10 months ago
Alternatives and similar repositories for hadoop-docker
Users interested in hadoop-docker are comparing it to the repositories listed below.
- Docker with Airflow and a Spark standalone cluster (☆262, updated 2 years ago)
- Delta Lake, ETL, Spark, Airflow (☆48, updated 3 years ago)
- Simple repo demonstrating how to submit a Spark job to EMR from Airflow (☆34, updated 5 years ago)
- Learn Apache Spark in Scala, Python (PySpark), and R (SparkR) by building your own cluster with a JupyterLab interface on Docker (☆505, updated 2 months ago)
- Code for Data Pipelines with Apache Airflow (☆811, updated last year)
- Building a Data Pipeline with an Open Source Stack (☆55, updated 6 months ago)
- Code for a dbt tutorial (☆166, updated 4 months ago)
- A template repository for creating a data project with IaC, CI/CD, data migrations, and testing (☆280, updated last year)
- Playground for a Lakehouse stack (Iceberg, Hudi, Spark, Flink, Trino, dbt, Airflow, Kafka, Debezium CDC) (☆64, updated 2 years ago)
- Local Environment to Practice Data Engineering (☆143, updated last year)
- ☆50, updated this week
- ☆43, updated 3 years ago
- Resources for video demonstrations and blog posts on DataOps on AWS (☆182, updated 3 years ago)
- Code snippets for the Data Engineering Design Patterns book (☆307, updated last week)
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks (☆24, updated 3 years ago)
- Trino + Hive + MinIO with Postgres in Docker Compose (☆27, updated 2 years ago)
- A repository of sample code showing data quality checking best practices with Airflow (☆78, updated 2 years ago)
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO (☆65, updated 2 years ago)
- Data pipeline with dbt, Airflow, and Great Expectations (☆166, updated 4 years ago)
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment (☆38, updated 4 years ago)
- Docker Airflow: contains a Docker Compose file for Airflow 2.0 (☆70, updated 3 years ago)
- ☆92, updated 11 months ago
- Sample project demonstrating data engineering best practices (☆203, updated last year)
- Simple stream-processing pipeline (☆110, updated last year)
- ☆40, updated 2 years ago
- (project & tutorial) DAG pipeline tests + CI/CD setup (☆89, updated 4 years ago)
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka, consume it with Spark, then write it to Cassandra (☆144, updated 2 years ago)
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems (☆56, updated 2 years ago)
- The easiest way to run Airflow locally, with linting and tests for valid DAGs and plugins (☆257, updated 4 years ago)
- Apache Airflow integration for dbt (☆412, updated last year)