martinkarlssonio / big-data-solution
☆43 · Updated last year
Related projects
Alternatives and complementary repositories for big-data-solution
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆21 · Updated 2 years ago
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB. ☆37 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆44 · Updated 2 years ago
- Docker with Airflow and Spark standalone cluster ☆244 · Updated last year
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka, consume it with Spark, then write it to Cassandra (see the sketch after this list) ☆128 · Updated last year
- Hadoop-Hive-Spark cluster + Jupyter on Docker ☆61 · Updated 5 months ago
- The goal of this project is to build a Docker cluster that gives access to Hadoop, HDFS, Hive, PySpark, Sqoop, Airflow, Kafka, Flume, Pos… ☆54 · Updated last year
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆42 · Updated last year
- A real-time streaming ETL pipeline that performs sentiment analysis on Twitter data using Apache Kafka, Apache Spark and D… ☆29 · Updated 4 years ago
- Multi-container environment with Hadoop, Spark and Hive ☆202 · Updated 10 months ago
- PySpark functions and utilities with examples; assists the ETL process of data modeling ☆99 · Updated 3 years ago
- Classwork projects and homework done through the Udacity Data Engineering Nanodegree ☆74 · Updated 11 months ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆39 · Updated 3 years ago
- This repo contains a Spark standalone cluster on Docker for anyone who wants to play with PySpark by submitting their applications. ☆23 · Updated last year
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆57 · Updated last year
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆80 · Updated 5 years ago
- This project shows how to capture changes from a Postgres database and stream them into Kafka ☆31 · Updated 5 months ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres, and Docker. ☆65 · Updated 3 months ago
- 😈 Complete end-to-end ETL pipeline with Spark, Airflow, and AWS ☆43 · Updated 5 years ago
- Apache Spark 3 - Structured Streaming Course Material ☆119 · Updated last year
- PySpark-ETL ☆23 · Updated 4 years ago
- Spark, Airflow, Kafka ☆27 · Updated last year
- Data Engineering on GCP ☆30 · Updated 2 years ago
- Developed an ETL pipeline for a Data Lake that extracts data from S3, processes the data using Spark, and loads the data back into S3 as … ☆16 · Updated 5 years ago
- Resources from the preparation course for the Databricks Data Engineer Professional certification exam ☆84 · Updated 3 weeks ago
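Several entries above share the same core pattern: events are produced to Kafka and consumed with Spark Structured Streaming before landing in a sink such as Cassandra, Elasticsearch, or S3. The snippet below is a minimal sketch of the Kafka-to-Spark step only; the broker address (`localhost:9092`), topic name (`events`), event schema, and connector version in the spark-submit line are illustrative assumptions, not taken from any of the repositories listed.

```python
# Minimal PySpark Structured Streaming sketch: read JSON events from a Kafka
# topic and print them to the console. Broker, topic, and schema are assumed
# placeholders. Run with the Kafka connector on the classpath, e.g.:
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.1 kafka_stream.py
# (match the connector version to your Spark build)

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-to-console-sketch").getOrCreate()

# Assumed event schema, for illustration only.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "events")                         # assumed topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the payload as bytes; cast it to a string and parse the JSON body.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), schema).alias("data"))
    .select("data.*")
)

query = (
    events.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

Swapping the console sink for one of the Cassandra, Elasticsearch, or MinIO/S3 sinks mentioned above is mostly a matter of changing the `writeStream` format and options (or using `foreachBatch`), provided the matching connector package is on the classpath.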