martinkarlssonio / big-data-solution
☆47 · Updated 2 years ago
Alternatives and similar repositories for big-data-solution
Users interested in big-data-solution are comparing it to the repositories listed below.
- Docker with Airflow and a Spark standalone cluster ☆261 · Updated 2 years ago
- Multi-container environment with Hadoop, Spark and Hive ☆218 · Updated 3 months ago
- Delta Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆141 · Updated 2 years ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- Create streaming data, transfer it to Kafka, modify it with PySpark, and write it to Elasticsearch and MinIO ☆63 · Updated 2 years ago
- ☆90 · Updated 6 months ago
- Apache Spark 3 - Structured Streaming Course Material ☆121 · Updated last year
- Data Engineering on GCP ☆36 · Updated 2 years ago
- PySpark functions and utilities with examples. Assists the ETL process of data modeling ☆105 · Updated 4 years ago
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆55 · Updated last year
- Simple stream processing pipeline ☆103 · Updated last year
- ☆40 · Updated 2 years ago
- O'Reilly book "Data Algorithms with Spark" by Mahmoud Parsian ☆219 · Updated 2 years ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆34 · Updated 4 years ago
- ☆88 · Updated 2 years ago
- ☆41 · Updated last year
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres and Docker ☆98 · Updated 4 months ago
- Near-real-time ETL to populate a dashboard ☆72 · Updated last year
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker ☆496 · Updated 2 years ago
- End-to-end data engineering project ☆57 · Updated 2 years ago
- Code snippets for the Data Engineering Design Patterns book ☆142 · Updated 4 months ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆149 · Updated 5 years ago
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆268 · Updated 5 months ago
- Data Engineering with Spark and Delta Lake ☆102 · Updated 2 years ago
- Simplified ETL process in Hadoop using Apache Spark. Includes a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆56 · Updated 2 years ago
- Code for a dbt tutorial ☆159 · Updated 2 months ago
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆476 · Updated 9 months ago
- Course notes for the Astronomer Certification: DAG Authoring for Apache Airflow ☆53 · Updated last year
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment ☆38 · Updated 4 years ago