martinkarlssonio / big-data-solution
☆47 · Updated 2 years ago
Alternatives and similar repositories for big-data-solution
Users interested in big-data-solution are comparing it to the repositories listed below
- Multi-container environment with Hadoop, Spark and Hive ☆217 · Updated 2 months ago
- ☆89 · Updated 5 months ago
- Docker with Airflow and Spark standalone cluster ☆261 · Updated last year
- Creation of a data lakehouse and an ELT pipeline to enable the efficient analysis and use of data ☆47 · Updated last year
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆495 · Updated 2 years ago
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- PySpark functions and utilities with examples. Assists the ETL process of data modeling ☆104 · Updated 4 years ago
- O'Reilly Book: [Data Algorithms with Spark] by Mahmoud Parsian ☆216 · Updated 2 years ago
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆55 · Updated last year
- Hadoop-Hive-Spark cluster + Jupyter on Docker ☆75 · Updated 6 months ago
- This project demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… ☆44 · Updated last year
- Produce Kafka messages, consume them, and upload them into Cassandra and MongoDB. ☆42 · Updated last year
- ETL pipeline using PySpark (Spark - Python) ☆117 · Updated 5 years ago
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆139 · Updated last year
- Create streaming data, transfer it to Kafka, modify it with PySpark, and send it to Elasticsearch and MinIO ☆63 · Updated last year
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆470 · Updated 9 months ago
- Python data repo, Jupyter notebooks, Python scripts and data. ☆518 · Updated 7 months ago
- Data Engineering with Spark and Delta Lake ☆101 · Updated 2 years ago
- Data Engineering on GCP ☆36 · Updated 2 years ago
- Simple stream processing pipeline ☆103 · Updated last year
- ☆87 · Updated 2 years ago
- PySpark Tutorial for Beginners - Practical Examples in Jupyter Notebook with Spark version 3.4.1. The tutorial covers various topics like… ☆124 · Updated last year
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆33 · Updated 4 years ago
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 2 weeks ago
- Source code of the Apache Airflow Tutorial for Beginners on YouTube Channel Coder2j (https://www.youtube.com/c/coder2j) ☆306 · Updated last year
- The goal of this project is to build a Docker cluster that gives access to Hadoop, HDFS, Hive, PySpark, Sqoop, Airflow, Kafka, Flume, Pos… ☆64 · Updated 2 years ago
- The resources of the preparation course for the Databricks Data Engineer Professional certification exam ☆124 · Updated 3 weeks ago
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆84 · Updated 5 years ago
- Code for dbt tutorial ☆156 · Updated last month