ananthdurai / airflow-training
Airflow training for the Crunch Conf
☆104 · Updated 7 years ago
Alternatives and similar repositories for airflow-training
Users interested in airflow-training are comparing it to the libraries listed below.
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆169 · Updated 2 years ago
- (Project & tutorial) DAG pipeline tests + CI/CD setup ☆89 · Updated 4 years ago
- ☆202 · Updated 2 years ago
- Airflow basics tutorial ☆397 · Updated 4 years ago
- Example DAGs using hooks and operators from Airflow Plugins ☆347 · Updated 7 years ago
- A repository of sample code showing data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- ☆179 · Updated 2 years ago
- Airflow unit tests and integration tests ☆261 · Updated 3 years ago
- Developed a data pipeline to automate data warehouse ETL by building custom Airflow operators that handle the extraction, transformation,… ☆90 · Updated 3 years ago
- Public source code for the Udemy online course Apache Airflow: Complete Hands-On Beginner to Advanced Class. ☆63 · Updated 5 years ago
- How to build an awesome data engineering team ☆100 · Updated 6 years ago
- Great Expectations Airflow operator ☆169 · Updated last week
- A complete development environment setup for working with Airflow ☆128 · Updated 2 years ago
- ☆92 · Updated 2 years ago
- Repository used for Spark trainings ☆54 · Updated 2 years ago
- Repository of sample Databricks notebooks ☆271 · Updated last year
- Data pipeline with dbt, Airflow, and Great Expectations ☆164 · Updated 4 years ago
- PySpark data-pipeline testing and CI/CD ☆28 · Updated 5 years ago
- Apache Airflow in Docker Compose (for both versions 1.10.* and 2.*) ☆186 · Updated last year
- Scaffold of Apache Airflow executing Docker containers ☆86 · Updated 2 years ago
- An Airflow Docker image preconfigured to work well with Spark and Hadoop/EMR ☆175 · Updated 5 months ago
- Spark style guide ☆264 · Updated last year
- A boilerplate for writing PySpark jobs ☆394 · Updated last year
- Data validation library for PySpark 3.0.0 ☆33 · Updated 3 years ago
- Simplified ETL process in Hadoop using Apache Spark. Has a complete ETL pipeline for a data lake: SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- Soda Spark is a PySpark library that helps you test the data in your Spark DataFrames ☆64 · Updated 3 years ago
- This project helps me understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆94 · Updated 6 years ago
- A full data warehouse infrastructure with ETL pipelines running inside Docker on Apache Airflow for data orchestration, AWS Redshift for… ☆139 · Updated 5 years ago
- Delta Lake examples ☆231 · Updated last year
- Astronomer Core Docker images ☆106 · Updated last year