cluster-apps-on-docker / spark-standalone-cluster-on-docker
Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker.
☆494 · Updated 2 years ago
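The repository above wires a Spark master, Spark workers, and a JupyterLab container together with Docker Compose. A minimal sketch of that layout might look like the following; the image names, environment variables, and ports here are assumptions chosen for illustration, not the repository's actual compose file:

```yaml
# Hypothetical sketch of a Spark standalone cluster plus JupyterLab on Docker.
# Images and settings are assumptions, not copied from the repo above.
services:
  spark-master:
    image: bitnami/spark:3.5          # assumed image; the repo builds its own
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"                   # master web UI
      - "7077:7077"                   # cluster manager port
  spark-worker:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master
  jupyterlab:
    image: jupyter/pyspark-notebook   # assumed; notebooks would set the
    ports:                            # master to spark://spark-master:7077
      - "8888:8888"
```

With a file like this, `docker compose up` would bring up the cluster, and a notebook could attach to it by pointing `SparkSession.builder.master(...)` at the master's `spark://` URL.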
Alternatives and similar repositories for spark-standalone-cluster-on-docker
Users interested in spark-standalone-cluster-on-docker are comparing it to the repositories listed below.
- Docker with Airflow and Spark standalone cluster ☆258 · Updated last year
- Tutorial for setting up a Spark cluster running inside Docker containers located on different machines ☆133 · Updated 2 years ago
- A simple Spark standalone cluster for your testing environment purposes ☆573 · Updated last year
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆33 · Updated 4 years ago
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆465 · Updated 8 months ago
- Spark style guide ☆259 · Updated 8 months ago
- PySpark test helper methods with beautiful error messages ☆699 · Updated 2 weeks ago
- PySpark methods to enhance developer productivity 📣 👯 🎉 ☆672 · Updated 3 months ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment ☆38 · Updated 4 years ago
- This project provides Apache Spark SQL, RDD, DataFrame and Dataset examples in the Scala language ☆564 · Updated last year
- Multi-container environment with Hadoop, Spark and Hive ☆215 · Updated last month
- Delta Lake examples ☆225 · Updated 8 months ago
- Delta Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- ☆265 · Updated 8 months ago
- Example of how to leverage Apache Spark's distributed capabilities to call a REST API using a UDF ☆51 · Updated 2 years ago
- Delta Lake helper methods in PySpark ☆326 · Updated 9 months ago
- Apache Spark 3 - Structured Streaming Course Material ☆121 · Updated last year
- ☆21 · Updated 3 months ago
- ☆26 · Updated last year
- BigQuery data source for Apache Spark: read data from BigQuery into DataFrames, write DataFrames into BigQuery tables ☆400 · Updated this week
- A simplified, lightweight ETL framework based on Apache Spark ☆586 · Updated last year
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks ☆433 · Updated 4 months ago
- Apache Airflow integration for dbt ☆405 · Updated last year
- Hadoop-Hive-Spark cluster + Jupyter on Docker ☆75 · Updated 5 months ago
- Code samples, etc. for Databricks ☆64 · Updated 3 weeks ago
- Resources for the preparation course for the Databricks Data Engineer Professional certification exam ☆117 · Updated this week
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆188 · Updated this week
- This repo contains a Spark standalone cluster on Docker for anyone who wants to play with PySpark by submitting their applications ☆35 · Updated 2 years ago
- Local environment to practice data engineering ☆142 · Updated 5 months ago
- Code for Data Pipelines with Apache Airflow ☆779 · Updated 10 months ago