mpavanetti / airflow
This set of code and instructions instantiates a containerized environment from a set of Docker images (Airflow webserver, Airflow scheduler, PostgreSQL, PySpark), with a data pipeline that consumes data from a weather API, processes it with PySpark, and stores the results in PostgreSQL.
☆3 · Updated last year
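The pipeline described above (weather API → PySpark → PostgreSQL) hinges on a transform step that flattens API JSON into rows for insertion. The following is a minimal sketch of such a step in plain Python; the payload shape and field names (`city`, `readings`, `ts`, `temp_c`, `humidity`) are illustrative assumptions, not taken from the repository.

```python
import json

# Hypothetical raw payload shaped like a typical weather API response;
# the field names here are assumptions for illustration only.
RAW = json.dumps({
    "city": "Berlin",
    "readings": [
        {"ts": "2024-01-01T00:00:00Z", "temp_c": 3.5, "humidity": 81},
        {"ts": "2024-01-01T01:00:00Z", "temp_c": 3.1, "humidity": 84},
    ],
})

def transform(payload):
    """Flatten the API JSON into (city, ts, temp_c, humidity) rows,
    ready for a parameterized PostgreSQL INSERT."""
    doc = json.loads(payload)
    city = doc["city"]
    return [
        (city, r["ts"], r["temp_c"], r["humidity"])
        for r in doc["readings"]
    ]

rows = transform(RAW)
print(rows[0])  # ('Berlin', '2024-01-01T00:00:00Z', 3.5, 81)
```

In the actual stack, a function like this would run inside a PySpark job, and an Airflow DAG would schedule the extract, transform, and load tasks in sequence.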
Alternatives and similar repositories for airflow:
Users interested in airflow are comparing it to the libraries listed below.
- The goal of this project is to offer an AWS EMR template using Spot Fleet and On-Demand Instances that you can use quickly. Just focus on…☆27 · Updated 2 years ago
- A production-grade data pipeline has been designed to automate the parsing of user search patterns to analyze user engagement. Extract d…☆24 · Updated 3 years ago
- ☆87 · Updated 2 years ago
- Simplified ETL process in Hadoop using Apache Spark. Has complete ETL pipeline for data lake. SparkSession extensions, DataFrame validatio…☆53 · Updated last year
- Developed a data pipeline to automate data warehouse ETL by building custom airflow operators that handle the extraction, transformation,…☆90 · Updated 3 years ago
- RedditR for Content Engagement and Recommendation☆21 · Updated 7 years ago
- Developed an ETL pipeline for a Data Lake that extracts data from S3, processes the data using Spark, and loads the data back into S3 as …☆16 · Updated 5 years ago
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR☆82 · Updated 5 years ago
- This repo contains commands that data engineers use in day to day work.☆60 · Updated 2 years ago
- Near real time ETL to populate a dashboard.☆73 · Updated 9 months ago
- Simple repo to demonstrate how to submit a spark job to EMR from Airflow☆33 · Updated 4 years ago
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging…☆82 · Updated 5 years ago
- Example repo to create end to end tests for data pipeline.☆23 · Updated 9 months ago
- A collection of data engineering projects: data modeling, ETL pipelines, data lakes, infrastructure configuration on AWS, data warehousin…☆15 · Updated 3 years ago
- Code snippets and tools published on the blog at lifearounddata.com☆12 · Updated 5 years ago
- Simple ETL pipeline using Python☆25 · Updated last year
- pyspark dataframe made easy☆16 · Updated 3 years ago
- PySpark Cheatsheet☆36 · Updated 2 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow☆142 · Updated 4 years ago
- PySpark functions and utilities with examples. Assists ETL process of data modeling☆100 · Updated 4 years ago
- The goal of this project is to track the expenses of Uber Rides and Uber Eats through data Engineering processes using technologies such …☆120 · Updated 2 years ago
- Simplify Big Data Analytics with Amazon EMR, published by Packt☆13 · Updated 2 years ago
- Solution to all projects of Udacity's Data Engineering Nanodegree: Data Modeling with Postgres & Cassandra, Data Warehouse with Redshift,…☆56 · Updated 2 years ago
- Dockerizing and Consuming an Apache Livy environment☆11 · Updated 2 years ago
- Data engineering interviews Q&A for data community by data community☆63 · Updated 4 years ago
- A batch processing data pipeline, using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform, and orchestrated from loc…☆21 · Updated 2 years ago
- Pyspark Spotify ETL☆17 · Updated 3 years ago
- A course by DataTalks Club that covers Spark, Kafka, Docker, Airflow, Terraform, DBT, Big Query etc☆14 · Updated 3 years ago
- A repository of sample code to show data quality checking best practices using Airflow.☆75 · Updated 2 years ago
- End-to-end data platform leveraging the Modern data stack☆47 · Updated 11 months ago