mrn-aglic / spark-standalone-cluster
This repo contains a Spark standalone cluster on Docker for anyone who wants to play with PySpark by submitting their own applications.
☆35 · Updated 2 years ago
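For context, submitting a PySpark application to a standalone cluster like this generally looks like the sketch below. It is not taken from the repo itself: the service name spark-master, port 7077, and the file name smoke_test.py are assumptions that depend on the project's docker-compose setup.

```python
# smoke_test.py -- a minimal PySpark job for a Spark standalone cluster on Docker.
# The master URL (spark://spark-master:7077) is an assumption; check the repo's
# docker-compose.yml for the actual service name and port.
#
# Typical submission from a container attached to the cluster's network:
#   spark-submit --master spark://spark-master:7077 smoke_test.py

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("standalone-cluster-smoke-test")
    .getOrCreate()
)

# A tiny job that forces some work onto the executors.
total = spark.range(1_000_000).selectExpr("sum(id) AS total").collect()[0]["total"]
print(f"sum of 0..999999 = {total}")

spark.stop()
```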
Alternatives and similar repositories for spark-standalone-cluster
Users interested in spark-standalone-cluster are comparing it to the repositories listed below.
- Docker with Airflow and Spark standalone cluster ☆261 · Updated 2 years ago
- Code for dbt tutorial ☆161 · Updated last week
- Project for "Data pipeline design patterns" blog. ☆45 · Updated last year
- Sample project to demonstrate data engineering best practices ☆196 · Updated last year
- Simple stream processing pipeline ☆108 · Updated last year
- Local Environment to Practice Data Engineering ☆143 · Updated 8 months ago
- ☆89 · Updated 7 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆74 · Updated 2 years ago
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated last month
- Delta-Lake, ETL, Spark, Airflow ☆48 · Updated 2 years ago
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing ☆276 · Updated last year
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆495 · Updated 2 years ago
- Code snippets for the Data Engineering Design Patterns book ☆182 · Updated 5 months ago
- Delta Lake examples ☆227 · Updated 11 months ago
- Code for the "Efficient Data Processing in Spark" course ☆338 · Updated 3 months ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- Execution of dbt models using Apache Airflow through Docker Compose ☆118 · Updated 2 years ago
- Near real-time ETL to populate a dashboard. ☆72 · Updated last week
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆78 · Updated 2 years ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆34 · Updated 4 years ago
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆266 · Updated last week
- End-to-end data engineering project ☆57 · Updated 2 years ago
- ☆14 · Updated 2 years ago
- ☆40 · Updated 2 years ago
- Delta Lake helper methods in PySpark ☆325 · Updated last year
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, dbt, Airflow, Kafka, Debezium CDC) ☆60 · Updated last year
- ☆156 · Updated 3 weeks ago
- Open Source LeetCode for PySpark, Spark, Pandas and dbt/Snowflake ☆208 · Updated 2 months ago
- This repo contains "Databricks Certified Data Engineer Professional" questions and related docs. ☆109 · Updated last year