ismaildawoodjee / GreatEx
A project for exploring how Great Expectations can be used to ensure data quality and validate batches within a data pipeline defined in Airflow.
☆22 · Updated 2 years ago
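Not part of the repository itself, but as an illustration of the idea in the description above, here is a minimal sketch of how a Great Expectations checkpoint could be triggered from an Airflow DAG. The DAG id, checkpoint name, and batch path are hypothetical; the snippet assumes Airflow 2.4+ with the TaskFlow API and a legacy (0.x-style) Great Expectations Data Context already configured in the project.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def greatex_batch_validation():
    """Extract a batch, then validate it with a Great Expectations checkpoint."""

    @task
    def extract() -> str:
        # Hypothetical extract step: a real pipeline would land the batch somewhere
        # (local file, S3, warehouse table) and return a reference to it.
        return "/tmp/batch.csv"

    @task
    def validate(batch_path: str) -> None:
        import great_expectations as gx

        # Load the project's Data Context and run a pre-configured checkpoint.
        # "batch_quality_checkpoint" is an assumed name, not one from GreatEx.
        context = gx.get_context()
        result = context.run_checkpoint(checkpoint_name="batch_quality_checkpoint")
        if not result["success"]:
            raise ValueError(f"Great Expectations validation failed for {batch_path}")

    validate(extract())


greatex_batch_validation()
```

Raising an exception when the checkpoint reports `success: false` is what lets Airflow's retry and alerting machinery react to data-quality failures instead of silently passing bad batches downstream.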
Alternatives and similar repositories for GreatEx
Users interested in GreatEx are comparing it to the libraries listed below.
- A repository of sample code to show data quality checking best practices using Airflow. ☆77 · Updated 2 years ago
- Full stack data engineering tools and infrastructure set-up ☆53 · Updated 4 years ago
- ☆22 · Updated 4 years ago
- Containerized end-to-end analytics of Spotify data using Python, dbt, Postgres, and Metabase ☆128 · Updated 2 years ago
- Open Data Stack Projects: Examples of End to End Data Engineering Projects ☆84 · Updated 2 years ago
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 10 months ago
- ☆18 · Updated last year
- Source code for the MC technical blog post "Data Observability in Practice Using SQL" ☆38 · Updated 11 months ago
- Cost Efficient Data Pipelines with DuckDB ☆54 · Updated last month
- Trino dbt demo project to mix and load BigQuery data with and in a local PostgreSQL database ☆75 · Updated 3 years ago
- ☆80 · Updated 8 months ago
- Yet Another (Spark) ETL Framework ☆21 · Updated last year
- Delta-Lake, ETL, Spark, Airflow ☆47 · Updated 2 years ago
- Code snippets for Data Engineering Design Patterns book ☆122 · Updated 3 months ago
- An example dbt project using AutomateDV to create a Data Vault 2.0 Data Warehouse based on the Snowflake TPC-H dataset. ☆50 · Updated last year
- Simplified ETL process in Hadoop using Apache Spark. Has complete ETL pipeline for datalake. SparkSession extensions, DataFrame validatio… ☆55 · Updated 2 years ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆116 · Updated 2 years ago
- Example repo to create end to end tests for data pipeline. ☆25 · Updated last year
- Delta Lake Documentation ☆48 · Updated last year
- Data-aware orchestration with dagster, dbt, and airbyte ☆30 · Updated 2 years ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆62 · Updated last month
- A simple and easy to use Data Quality (DQ) tool built with Python. ☆50 · Updated last year
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated 9 months ago
- Repo for orienting dbt users to the Dagster asset framework ☆54 · Updated 2 years ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆72 · Updated last year
- End to end data engineering project ☆57 · Updated 2 years ago
- dbt Cloud pipelines in airflow examples ☆35 · Updated last year
- Repo for CDC with debezium blog post ☆28 · Updated 9 months ago
- ☆16 · Updated last year
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆35 · Updated last year