mpavanetti / airflow
This set of code and instructions has the purpose of instantiating a self-contained environment from a set of Docker images (Airflow webserver, Airflow scheduler, PostgreSQL, PySpark), with a data pipeline that consumes data from a weather API, processes it with PySpark, and stores it in PostgreSQL.
☆3 · Updated last year
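The pipeline described above (weather API → PySpark → PostgreSQL) can be sketched at a high level. Below is a minimal sketch of the transform step only, with a hypothetical payload shape and column names invented for illustration; the actual repo's API, schema, and table layout may differ.

```python
# Hypothetical transform step: flatten a nested weather-API JSON
# payload into a flat row suitable for loading into PostgreSQL.
# Field names here are assumptions, not taken from the repo.

def flatten_weather(payload):
    """Flatten one nested weather reading into a flat row dict."""
    obs = payload["observation"]
    return {
        "city": payload["city"],
        "observed_at": obs["time"],
        "temp_c": obs["temp_c"],
        "humidity": obs["humidity"],
    }

# Example payload in the assumed shape.
sample = {
    "city": "Lisbon",
    "observation": {
        "time": "2023-01-01T12:00:00Z",
        "temp_c": 14.5,
        "humidity": 70,
    },
}

row = flatten_weather(sample)
print(row["temp_c"])  # 14.5
```

In the full project, an extract task would call the weather API, this kind of transform would run inside PySpark, and a load task would write the flattened rows to PostgreSQL, with the three steps presumably orchestrated as Airflow DAG tasks.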
Alternatives and similar repositories for airflow
Users interested in airflow are comparing it to the repositories listed below
- This repo contains commands that data engineers use in day-to-day work. ☆61 · Updated 2 years ago
- Simplified ETL process in Hadoop using Apache Spark. Has a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆56 · Updated 2 years ago
- ☆88 · Updated 2 years ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- Data engineering interview Q&A for the data community, by the data community. ☆64 · Updated 5 years ago
- Developed a data pipeline to automate data warehouse ETL by building custom Airflow operators that handle the extraction, transformation,… ☆90 · Updated 3 years ago
- PySpark functions and utilities with examples. Assists the ETL process of data modeling. ☆104 · Updated 4 years ago
- Developed an ETL pipeline for a Data Lake that extracts data from S3, processes the data using Spark, and loads the data back into S3 as … ☆16 · Updated 5 years ago
- The goal of this project is to offer an AWS EMR template using Spot Fleet and On-Demand Instances that you can use quickly. Just focus on… ☆28 · Updated 3 years ago
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆92 · Updated 6 years ago
- (project & tutorial) DAG pipeline tests + CI/CD setup ☆88 · Updated 4 years ago
- Airflow training for the crunch conf ☆105 · Updated 6 years ago
- ETL pipeline using PySpark (Spark - Python) ☆117 · Updated 5 years ago
- Near real-time ETL to populate a dashboard. ☆72 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- The goal of this project is to track the expenses of Uber Rides and Uber Eats through data engineering processes using technologies such … ☆121 · Updated 3 years ago
- PySpark DataFrames made easy ☆16 · Updated 3 years ago
- Data lake and data warehouse on GCP ☆56 · Updated 3 years ago
- A batch-processing data pipeline, using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform, and orchestrated from loc… ☆24 · Updated 3 years ago
- Data Engineering on GCP ☆36 · Updated 2 years ago
- Example repo to create end-to-end tests for a data pipeline. ☆25 · Updated last year
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆84 · Updated 6 years ago
- This repo contains DAGs demonstrating a variety of ELT patterns using Airflow along with dbt. ☆13 · Updated 2 years ago
- Project for the "Data pipeline design patterns" blog. ☆45 · Updated last year
- PySpark Cheatsheet ☆36 · Updated 2 years ago
- Airflow Tutorials ☆25 · Updated 4 years ago
- Code snippets for the Data Engineering Design Patterns book ☆142 · Updated 4 months ago
- End-to-end data engineering project ☆57 · Updated 2 years ago
- Docker with Airflow and a Spark standalone cluster ☆261 · Updated 2 years ago
- Solutions to all projects of Udacity's Data Engineering Nanodegree: Data Modeling with Postgres & Cassandra, Data Warehouse with Redshift,… ☆57 · Updated 2 years ago