ismaildawoodjee / GreatEx
A project for exploring how Great Expectations can be used to ensure data quality and validate batches within a data pipeline defined in Airflow.
☆25 · Updated 3 years ago
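As a rough illustration of the idea described above (not code taken from the repository), a Great Expectations checkpoint can be run as a validation task inside an Airflow DAG. In this sketch the checkpoint name, DAG id, and schedule are hypothetical, and it assumes the pre-1.0 Great Expectations API together with Airflow 2.x.

```python
# Minimal sketch: run a Great Expectations checkpoint as an Airflow task.
# Checkpoint name and DAG id are hypothetical; assumes a great_expectations/
# project directory is available to the Airflow worker.
from datetime import datetime

import great_expectations as gx
from airflow import DAG
from airflow.operators.python import PythonOperator


def validate_source_batch():
    # Load the Great Expectations project and run a pre-configured checkpoint;
    # raising here makes Airflow mark the task (and downstream tasks) as failed.
    context = gx.get_context()
    result = context.run_checkpoint(checkpoint_name="source_batch_checkpoint")  # hypothetical name
    if not result.success:
        raise ValueError("Batch failed Great Expectations validation")


with DAG(
    dag_id="greatex_validation_example",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,               # trigger manually in this sketch
    catchup=False,
) as dag:
    validate = PythonOperator(
        task_id="validate_source_batch",
        python_callable=validate_source_batch,
    )
```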
Alternatives and similar repositories for GreatEx
Users who are interested in GreatEx are comparing it to the libraries listed below.
- Source code for the MC technical blog post "Data Observability in Practice Using SQL" ☆40 · Updated last year
- A tutorial for the Great Expectations library. ☆72 · Updated 5 years ago
- ☆23 · Updated 6 years ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- Full stack data engineering tools and infrastructure set-up ☆57 · Updated 4 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆168 · Updated 2 years ago
- Containerized end-to-end analytics of Spotify data using Python, dbt, Postgres, and Metabase ☆132 · Updated 3 years ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆126 · Updated 3 years ago
- ☆80 · Updated last year
- Data-aware orchestration with dagster, dbt, and airbyte ☆31 · Updated 3 years ago
- Code for dbt tutorial ☆168 · Updated 5 months ago
- Open Data Stack Projects: Examples of End to End Data Engineering Projects ☆91 · Updated 2 years ago
- ☆60 · Updated last year
- Data pipeline with dbt, Airflow, Great Expectations ☆166 · Updated 4 years ago
- An example of a Dagster project with a possible folder structure to organize the assets, jobs, repositories, schedules, and ops. Also has… ☆102 · Updated last year
- A proof of concept for how to set up a codebase for an analytics org. ☆14 · Updated 4 years ago
- Great Expectations Airflow operator ☆170 · Updated last week
- ☆158 · Updated 3 weeks ago
- Delta Lake Documentation ☆53 · Updated last year
- Template for Data Engineering and Data Pipeline projects ☆116 · Updated 3 years ago
- ☆38 · Updated 5 years ago
- Sample configuration to deploy a modern data platform. ☆89 · Updated 4 years ago
- The Picnic Data Vault framework. ☆129 · Updated 3 weeks ago
- ☆26 · Updated 2 years ago
- Make dbt docs and Apache Superset talk to one another ☆155 · Updated 4 months ago
- A tool to generate PySpark schema from JSON. ☆28 · Updated 2 years ago
- A simple and easy to use Data Quality (DQ) tool built with Python. ☆51 · Updated 2 years ago
- A repository of sample code to accompany our blog post on Airflow and dbt. ☆183 · Updated 2 years ago
- ☆40 · Updated 11 months ago
- Simplified ETL process in Hadoop using Apache Spark. Has complete ETL pipeline for datalake. SparkSession extensions, DataFrame validatio… ☆56 · Updated 2 years ago