sdesilva26 / docker-spark
Tutorial for setting up a Spark cluster running inside of Docker containers located on different machines
☆123 · Updated 2 years ago
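The tutorial's core idea — a Spark master and workers running in containers that can reach each other across different machines — can be sketched roughly as below. This is a minimal sketch, not the repository's exact setup: the `apache/spark` image, the `spark-net` network name, and the port choices are assumptions.

```shell
# On the manager host: initialize a swarm and create an attachable
# overlay network so containers on different machines can communicate.
docker swarm init
docker network create --driver overlay --attachable spark-net

# Start the Spark master in standalone mode.
# (apache/spark image is an assumption; the tutorial may use another.)
docker run -d --name spark-master --network spark-net \
  -p 8080:8080 -p 7077:7077 \
  apache/spark \
  /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master

# On any worker host (after joining the swarm with `docker swarm join`):
# start a worker that registers with the master over the overlay network.
docker run -d --name spark-worker --network spark-net \
  apache/spark \
  /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker \
  spark://spark-master:7077
```

With the overlay network in place, the worker resolves `spark-master` by container name even when the two containers sit on separate hosts, which is the multi-machine scenario this tutorial covers.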
Related projects
Alternatives and complementary repositories for docker-spark
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆39 · Updated 3 years ago
- Docker with Airflow and Spark standalone cluster ☆245 · Updated last year
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆462 · Updated last year
- Delta Lake examples ☆207 · Updated last month
- Delta Lake helper methods in PySpark ☆304 · Updated 2 months ago
- Spark style guide ☆256 · Updated last month
- Multi-container environment with Hadoop, Spark and Hive ☆203 · Updated 10 months ago
- Delta Lake, ETL, Spark, Airflow ☆44 · Updated 2 years ago
- A simple Spark standalone cluster for your testing environment purposes ☆557 · Updated 8 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆163 · Updated last week
- PySpark boilerplate for running prod-ready data pipelines ☆28 · Updated 3 years ago
- Databricks - Apache Spark™ - 2X Certified Developer ☆264 · Updated 4 years ago
- Example for the article "Running Spark 3 with standalone Hive Metastore 3.0" ☆96 · Updated last year