ismaildawoodjee / aws-data-pipeline
A batch-processing data pipeline using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform and orchestrated from locally hosted Airflow containers. The end product is a Superset dashboard and a Postgres database, hosted on an EC2 instance at this address (powered down).
☆23 · Updated 3 years ago
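The repository's own DAG code is not included on this page, but as a rough illustration of the batch flow described above (raw data to S3, a Spark step on EMR, then a load into Redshift), here is a minimal sketch using Airflow's Amazon provider operators. It is an assumption-laden example, not the project's actual DAG: the bucket name, file paths, task IDs, table name, and the `emr_cluster_id` Airflow Variable are all hypothetical placeholders.

```python
# Hypothetical sketch of the S3 -> EMR (Spark) -> Redshift batch flow.
# Assumes the apache-airflow-providers-amazon package; names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.local_to_s3 import LocalFilesystemToS3Operator
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="aws_batch_pipeline",          # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw data into the S3 data lake bucket (bucket/path are placeholders).
    upload_raw = LocalFilesystemToS3Operator(
        task_id="upload_raw_to_s3",
        filename="/data/raw.csv",
        dest_key="raw/raw.csv",
        dest_bucket="my-data-lake-bucket",
        replace=True,
    )

    # Submit a Spark transformation step to an EMR cluster provisioned elsewhere
    # (e.g. by Terraform); the cluster ID is read from an Airflow Variable.
    spark_step = EmrAddStepsOperator(
        task_id="submit_spark_step",
        job_flow_id="{{ var.value.emr_cluster_id }}",
        steps=[{
            "Name": "transform_raw_data",
            "ActionOnFailure": "CANCEL_AND_WAIT",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://my-data-lake-bucket/scripts/transform.py"],
            },
        }],
    )

    # Wait for the submitted step to finish before loading the warehouse.
    wait_for_step = EmrStepSensor(
        task_id="wait_for_spark_step",
        job_flow_id="{{ var.value.emr_cluster_id }}",
        step_id="{{ task_instance.xcom_pull(task_ids='submit_spark_step')[0] }}",
    )

    # COPY the transformed Parquet output from S3 into a Redshift table.
    load_warehouse = S3ToRedshiftOperator(
        task_id="load_to_redshift",
        schema="public",
        table="fact_events",                  # placeholder table name
        s3_bucket="my-data-lake-bucket",
        s3_key="transformed/",
        copy_options=["FORMAT AS PARQUET"],
        redshift_conn_id="redshift_default",
    )

    upload_raw >> spark_step >> wait_for_step >> load_warehouse
```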
Alternatives and similar repositories for aws-data-pipeline
Users interested in aws-data-pipeline are comparing it to the repositories listed below
- PySpark functions and utilities with examples. Assists the ETL process of data modeling ☆104 · Updated 4 years ago
- ☆88 · Updated 3 years ago
- Simple ETL pipeline using Python ☆27 · Updated 2 years ago
- ☆35 · Updated 2 years ago
- ☆68 · Updated last week
- Classwork projects and homework done through the Udacity Data Engineering Nanodegree ☆74 · Updated last year
- This repository will help you learn Databricks concepts with the help of examples. It will include all the important topics which… ☆103 · Updated last week
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆87 · Updated 6 years ago
- End-to-end data engineering project ☆57 · Updated 2 years ago
- YouTube tutorial project ☆105 · Updated last year
- PySpark Cheatsheet ☆36 · Updated 2 years ago
- The goal of this project is to track the expenses of Uber Rides and Uber Eats through data engineering processes using technologies such … ☆121 · Updated 3 years ago
- Data Engineering Capstone Project: ETL Pipelines and Data Warehouse Development ☆21 · Updated 6 years ago
- Ultimate guide for mastering Spark performance tuning and optimization concepts and for preparing for data engineering interviews ☆168 · Updated 3 weeks ago
- Simplified ETL process in Hadoop using Apache Spark. Has a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- Data Engineering with Google Cloud Platform, published by Packt ☆119 · Updated 2 years ago
- ☆21 · Updated 2 years ago
- ☆44 · Updated last year
- 😈 Complete end-to-end ETL pipeline with Spark, Airflow, & AWS ☆50 · Updated 6 years ago
- Code for the "Advanced data transformations in SQL" free live workshop ☆84 · Updated 5 months ago
- A tutorial for the Great Expectations library ☆73 · Updated 4 years ago
- ☆29 · Updated last year
- Sample project to demonstrate data engineering best practices ☆198 · Updated last year
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆56 · Updated 2 years ago
- Get data from an API, run a scheduled script with Airflow, send data to Kafka and consume it with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆153 · Updated 5 years ago
- This repository contains the code for a real-time election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆41 · Updated last year
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres and Docker ☆102 · Updated 6 months ago
- Example repo to create end-to-end tests for a data pipeline ☆25 · Updated last year
- Price Crawler - Tracking Price Inflation ☆188 · Updated 5 years ago