Anant / example-airflow-and-spark
☆12 · Updated 3 years ago
Alternatives and similar repositories for example-airflow-and-spark
Users interested in example-airflow-and-spark are comparing it to the repositories listed below.
- Docker with Airflow and Spark standalone cluster ☆261 · Updated 2 years ago
- A data pipeline with Kafka, Spark Streaming, dbt, Docker, Airflow, and GCP! ☆12 · Updated 2 years ago
- This repo contains "Databricks Certified Data Engineer Professional" questions and related docs. ☆105 · Updated last year
- Delta-Lake, ETL, Spark, Airflow ☆48 · Updated 2 years ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- ☆88 · Updated 2 years ago
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆495 · Updated 2 years ago
- ☆90 · Updated 6 months ago
- Apache Spark 3 - Structured Streaming Course Material ☆122 · Updated 2 years ago
- Resources for video demonstrations and blog posts related to DataOps on AWS ☆182 · Updated 3 years ago
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing ☆274 · Updated last year
- The resources of the preparation course for the Databricks Data Engineer Professional certification exam ☆130 · Updated 2 months ago
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆476 · Updated 10 months ago
- A real-time streaming ETL pipeline for streaming and performing sentiment analysis on Twitter data using Apache Kafka, Apache Spark and D… ☆30 · Updated 5 years ago
- Apache Spark Structured Streaming with Kafka using Python (PySpark) ☆40 · Updated 6 years ago
- A batch processing data pipeline, using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform, and orchestrated from loc… ☆24 · Updated 3 years ago
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆86 · Updated 6 years ago
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆55 · Updated last year
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆142 · Updated 2 years ago
- ETL pipeline using PySpark (Spark - Python) ☆116 · Updated 5 years ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆34 · Updated 4 years ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆38 · Updated 4 years ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres and Docker. ☆102 · Updated 5 months ago
- Solution to all projects of Udacity's Data Engineering Nanodegree: Data Modeling with Postgres & Cassandra, Data Warehouse with Redshift,… ☆57 · Updated 2 years ago
- Create streaming data, transfer it to Kafka, modify it with PySpark, and send it to Elasticsearch and MinIO ☆63 · Updated 2 years ago
- ☆23 · Updated 4 years ago
- ☆15 · Updated 5 years ago
- The resources of the preparation course for the Databricks Data Engineer Associate certification exam ☆471 · Updated 2 months ago
- Local Environment to Practice Data Engineering ☆143 · Updated 8 months ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆152 · Updated 5 years ago
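Most of the repositories above share the pattern that example-airflow-and-spark demonstrates: an Airflow DAG that submits a Spark job on a schedule. The sketch below is a minimal, hypothetical illustration of that pattern, not taken from any of the listed projects; it assumes Airflow 2.4+ with the apache-airflow-providers-apache-spark package installed and a Spark connection named spark_default, and the DAG id, script path, and arguments are placeholders.

```python
# Minimal sketch: scheduling a spark-submit from an Airflow DAG.
# Assumes apache-airflow-providers-apache-spark is installed and a
# "spark_default" connection points at your Spark master; the DAG id,
# script path, and arguments below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="example_airflow_and_spark",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    submit_spark_job = SparkSubmitOperator(
        task_id="submit_spark_job",
        application="/opt/airflow/jobs/etl_job.py",   # hypothetical PySpark script
        conn_id="spark_default",                      # Spark connection defined in Airflow
        application_args=["--run-date", "{{ ds }}"],  # pass the logical date to the job
        conf={"spark.executor.memory": "1g"},
    )
```

The EMR-oriented repositories in the list typically swap this operator for the Amazon provider's EMR operators (for example, adding a step to a cluster), but the scheduling shape stays the same.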