Anant / example-airflow-and-spark
☆12 · Updated 3 years ago
Alternatives and similar repositories for example-airflow-and-spark
Users interested in example-airflow-and-spark are comparing it to the repositories listed below
- A data pipeline with Kafka, Spark Streaming, dbt, Docker, Airflow, and GCP! ☆12 · Updated 2 years ago
- Docker with Airflow and Spark standalone cluster ☆261 · Updated 2 years ago
- Delta-Lake, ETL, Spark, Airflow ☆48 · Updated 2 years ago
- Learn Apache Spark in Scala, Python (PySpark) and R (SparkR) by building your own cluster with a JupyterLab interface on Docker. ☆495 · Updated 2 years ago
- This project demonstrates knowledge of Data Engineering tools and concepts while serving as a learning exercise ☆47 · Updated 2 years ago
- ☆88 · Updated 3 years ago
- A template repository to create a data project with IAC, CI/CD, Data migrations, & testing ☆277 · Updated last year
- The resources of the preparation course for Databricks Data Engineer Professional certification exam ☆136 · Updated 3 months ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks (see the spark-submit sketch after this list) ☆24 · Updated 3 years ago
- Near real-time ETL to populate a dashboard. ☆72 · Updated 2 weeks ago
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for day-to-day problems ☆55 · Updated last year
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆38 · Updated 4 years ago
- Apache Spark 3 - Structured Streaming Course Material ☆123 · Updated 2 years ago
- Tutorial for setting up a Spark cluster running inside of Docker containers located on different machines ☆134 · Updated 2 years ago
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆478 · Updated 11 months ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres and Docker. ☆102 · Updated 5 months ago
- A list of all my posts and personal projects ☆74 · Updated last year
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow (see the EMR sketch after this list) ☆34 · Updated 4 years ago
- ☆90 · Updated 7 months ago
- The goal of this project is to build a docker cluster that gives access to Hadoop, HDFS, Hive, PySpark, Sqoop, Airflow, Kafka, Flume, Pos… ☆66 · Updated 2 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆155 · Updated 5 years ago
- Apache Spark Structured Streaming with Kafka using Python (PySpark) (see the streaming sketch after this list) ☆40 · Updated 6 years ago
- ☆40 · Updated 2 years ago
- Python data repo, jupyter notebook, python scripts and data. ☆528 · Updated 9 months ago
- This repo contains "Databricks Certified Data Engineer Professional" Questions and related docs. ☆111 · Updated last year
- Price Crawler - Tracking Price Inflation ☆187 · Updated 5 years ago
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆86 · Updated 6 years ago
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆92 · Updated 6 years ago
- Udacity Data Engineering Nanodegree Program ☆52 · Updated 4 years ago
- Local Environment to Practice Data Engineering ☆141 · Updated 8 months ago
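
Several of the Docker-based Airflow + Spark setups above wire the two tools together through spark-submit. As a rough orientation only, here is a minimal sketch of an Airflow DAG doing that, assuming the apache-airflow-providers-apache-spark package is installed and a Spark connection named `spark_default` points at the cluster master; the DAG id and application path are placeholders, not taken from any repository in this list.

```python
# Minimal sketch (not from any repo above): hand a PySpark application to a
# standalone Spark cluster via spark-submit from an Airflow DAG.
# Assumes apache-airflow-providers-apache-spark is installed and a connection
# named "spark_default" points at the Spark master (Airflow 2.4+ schedule arg).
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_wordcount",
        application="/opt/airflow/jobs/wordcount.py",  # placeholder script path
        conn_id="spark_default",
        verbose=True,
    )
```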
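
For the EMR item above, a submission from Airflow typically adds a step to an existing cluster and then waits on it. This is a minimal sketch assuming the apache-airflow-providers-amazon package; the cluster id, S3 script path, and DAG id are placeholders rather than values from that repository.

```python
# Minimal sketch: add a spark-submit step to an existing EMR cluster and wait
# for it to finish. Cluster id and S3 path below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

SPARK_STEP = [
    {
        "Name": "run_pyspark_job",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/jobs/etl.py"],  # placeholder
        },
    }
]

with DAG(
    dag_id="emr_spark_submit_example",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    add_step = EmrAddStepsOperator(
        task_id="add_step",
        job_flow_id="j-XXXXXXXXXXXXX",   # placeholder EMR cluster id
        steps=SPARK_STEP,
        aws_conn_id="aws_default",
    )

    watch_step = EmrStepSensor(
        task_id="watch_step",
        job_flow_id="j-XXXXXXXXXXXXX",   # placeholder EMR cluster id
        step_id="{{ task_instance.xcom_pull(task_ids='add_step')[0] }}",
        aws_conn_id="aws_default",
    )

    add_step >> watch_step
```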
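
For the Structured Streaming item, the core pattern is a readStream from Kafka followed by a writeStream sink. This is a minimal sketch, assuming the spark-sql-kafka connector matching your Spark version is on the classpath; the broker address, topic, and app name are placeholders.

```python
# Minimal sketch: read a Kafka topic with PySpark Structured Streaming and
# print decoded records to the console. Broker and topic are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka_streaming_example")   # hypothetical app name
    .getOrCreate()
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "events")                        # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```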