martinkarlssonio / big-data-solution
☆46 · Updated 2 years ago
Alternatives and similar repositories for big-data-solution
Users interested in big-data-solution are comparing it to the repositories listed below.
- Docker with Airflow and Spark standalone cluster ☆262 · Updated 2 years ago
- Multi-container environment with Hadoop, Spark and Hive ☆226 · Updated 7 months ago
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka and consume it with Spark, then write to Cassandra (this Kafka-to-Spark pattern is sketched after the list) ☆143 · Updated 2 years ago
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- O'Reilly book "Data Algorithms with Spark" by Mahmoud Parsian ☆225 · Updated 2 years ago
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker ☆500 · Updated last month
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆481 · Updated last year
- ☆92 · Updated 10 months ago
- PySpark functions and utilities with examples; assists the ETL process of data modeling ☆104 · Updated 5 years ago
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆64 · Updated 2 years ago
- Simple stream processing pipeline ☆110 · Updated last year
- Near-real-time ETL to populate a dashboard ☆73 · Updated 3 months ago
- Tutorial for setting up a Spark cluster running inside Docker containers located on different machines ☆134 · Updated 3 years ago
- Simplified ETL process in Hadoop using Apache Spark. Includes a complete ETL pipeline for a data lake, SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- Apache Spark Structured Streaming with Kafka using Python (PySpark) ☆40 · Updated 6 years ago
- Hadoop-Hive-Spark cluster + Jupyter on Docker ☆80 · Updated 11 months ago
- Python data repo: Jupyter notebooks, Python scripts and data ☆543 · Updated last year
- ETL pipeline using PySpark (Spark with Python) ☆116 · Updated 5 years ago
- This project helps me understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆95 · Updated 6 years ago
- Data Engineering on GCP ☆39 · Updated 3 years ago
- Apache Spark 3 - Structured Streaming course material ☆125 · Updated 2 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆158 · Updated 5 years ago
- Data Engineering with Spark and Delta Lake ☆105 · Updated 2 years ago
- Source code for the Apache Airflow Tutorial for Beginners on the Coder2j YouTube channel (https://www.youtube.com/c/coder2j); a bare-bones DAG sketch follows the list ☆329 · Updated last year
- End-to-end data engineering project ☆57 · Updated 3 years ago
- Delta Lake examples ☆234 · Updated last year
- Writes a CSV file to Postgres, reads the table and modifies it, then writes more tables to Postgres with Airflow ☆37 · Updated 2 years ago
- Sample data lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino and a Hive Metastore. Can be used for local testin… ☆75 · Updated 2 years ago
- ☆44 · Updated last year
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆88 · Updated 6 years ago
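Several of the entries above follow the same Kafka-to-Spark pattern (read a topic, transform with PySpark, write to a sink such as Cassandra, Elasticsearch or MinIO). As rough orientation only, a minimal PySpark Structured Streaming consumer looks something like the sketch below; the broker address, topic name and console sink are placeholders and are not taken from any of the listed projects, and the Kafka connector package must match your Spark version.

```python
# Minimal PySpark Structured Streaming consumer for a Kafka topic.
# Broker, topic and sink are illustrative placeholders only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-structured-streaming-sketch")
    # The spark-sql-kafka package version must match your Spark/Scala build.
    .config("spark.jars.packages",
            "org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.1")
    .getOrCreate()
)

# Read the raw Kafka stream; key/value arrive as binary columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                        # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("value").cast("string").alias("raw_event"))
)

# Write the decoded messages out; a console sink keeps the example self-contained.
query = (
    events.writeStream
    .format("console")
    .option("truncate", "false")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

In the listed projects the console sink would be replaced by a Cassandra, Elasticsearch or object-storage writer.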
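Likewise, the Airflow-based entries all revolve around scheduled DAGs. The snippet below is a bare-bones illustration, not code from any listed repository: the DAG id, schedule and task body are made up, and the `schedule` parameter assumes a recent Airflow 2.x release (older versions use `schedule_interval`).

```python
# Bare-bones Airflow DAG; dag_id, schedule and the task body are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder task body: in the listed projects this is where data is
    # pulled from an API, staged, or handed off to Spark/Kafka.
    print("extract and load step")


with DAG(
    dag_id="example_etl_dag",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # run once per day
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```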