RWaltersMA / mongo-spark-jupyter
Docker environment that spins up a MongoDB replica set, Spark, and Jupyter Lab. Example code uses PySpark and the MongoDB Spark Connector.
☆40 · Updated 2 years ago
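To give a sense of what the example notebooks do with the MongoDB Spark Connector, here is a minimal PySpark read sketch. The connector version, connection URI, and the database/collection names below are illustrative assumptions, not values taken from this repository.

```python
from pyspark.sql import SparkSession

# Connector coordinates and connection URI are placeholders; adjust them to
# match the replica set members and connector version in your environment.
spark = (
    SparkSession.builder
    .appName("mongo-spark-jupyter-demo")
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.12:10.2.1")
    .config("spark.mongodb.read.connection.uri",
            "mongodb://mongo1:27017,mongo2:27018,mongo3:27019/?replicaSet=rs0")
    .getOrCreate()
)

# Load a collection into a DataFrame; database and collection names are hypothetical.
df = (
    spark.read.format("mongodb")
    .option("database", "demo")
    .option("collection", "events")
    .load()
)

df.printSchema()
df.show(5)
```

Writing back is symmetric: use `spark.write.format("mongodb")` together with a `spark.mongodb.write.connection.uri` setting.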
Alternatives and similar repositories for mongo-spark-jupyter
Users interested in mongo-spark-jupyter are comparing it to the libraries listed below.
- A series of notebooks on how to start with Kafka and Python ☆152 · Updated 6 months ago
- Create streaming data, transfer it to Kafka, modify it with PySpark, and load it into Elasticsearch and MinIO ☆63 · Updated 2 years ago
- (project & tutorial) DAG pipeline tests + CI/CD setup ☆88 · Updated 4 years ago
- The Python fake data producer for Apache Kafka® is a complete demo app allowing you to quickly produce JSON fake streaming datasets and … ☆86 · Updated last year
- Repo that relates to the Medium blog 'Keeping your ML model in shape with Kafka, Airflow and MLFlow' ☆121 · Updated 2 years ago
- Resources for video demonstrations and blog posts related to DataOps on AWS ☆182 · Updated 3 years ago
- Developed a data pipeline to automate data warehouse ETL by building custom Airflow operators that handle the extraction, transformation,… ☆90 · Updated 3 years ago
- Public source code for the Batch Processing with Apache Beam (Python) online course ☆18 · Updated 4 years ago
- PySpark Cheatsheet ☆36 · Updated 2 years ago
- PySpark functions and utilities with examples. Assists the ETL process of data modeling ☆104 · Updated 4 years ago
- Apache Airflow in Docker Compose (for both versions 1.10.* and 2.*) ☆186 · Updated last year
- Code examples on Apache Spark using Python ☆107 · Updated 3 years ago
- ☆151 · Updated 7 years ago
- ☆88 · Updated 3 years ago
- Public source code for the Udemy online course Apache Airflow: Complete Hands-On Beginner to Advanced Class. ☆63 · Updated 4 years ago
- Dockerizing an Apache Spark Standalone Cluster ☆43 · Updated 3 years ago
- Get data from API, run a scheduled script with Airflow, send data to Kafka and consume with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- Docker Airflow - Contains a docker compose file for Airflow 2.0 ☆68 · Updated 3 years ago
- Jupyter notebooks for pyspark tutorials given at University ☆110 · Updated 2 months ago
- Fundamentals of Spark with Python (using PySpark), code examples ☆352 · Updated 2 years ago
- Data lake, data warehouse on GCP ☆56 · Updated 3 years ago
- Simplified ETL process in Hadoop using Apache Spark. Includes a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- One click deploy docker-compose with Kafka, Spark Streaming, Zeppelin UI and Monitoring (Grafana + Kafka Manager) ☆121 · Updated 4 years ago
- Simple alert system implemented in Kafka and Python ☆96 · Updated 7 years ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆38 · Updated 4 years ago
- Mastering Big Data Analytics with PySpark, Published by Packt ☆161 · Updated last year
- Apache Spark Structured Streaming with Kafka using Python (PySpark) ☆40 · Updated 6 years ago
- A production-grade data pipeline has been designed to automate the parsing of user search patterns to analyze user engagement. Extract d… ☆24 · Updated 3 years ago
- Project for real-time anomaly detection using Kafka and Python ☆58 · Updated 2 years ago
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆92 · Updated 6 years ago