awslabs / python-deequ
Python API for Deequ
★726 · Updated 3 weeks ago

Related projects

Alternatives and complementary repositories for python-deequ
- PySpark test helper methods with beautiful error messages (★615, updated 2 weeks ago)
- PySpark methods to enhance developer productivity (★640, updated 3 weeks ago)
- Port(ish) of Great Expectations to dbt test macros (★1,077, updated last month)
- Apache Airflow integration for dbt (★396, updated 5 months ago)
- Data quality testing for the modern data stack (SQL, Spark, and Pandas), https://www.soda.io (★1,901, updated this week)
- Delta Lake helper methods in PySpark (★304, updated 2 months ago)
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks (★400, updated this week)
- The Athena adapter plugin for dbt (https://getdbt.com) (★223, updated last week)
- Run your dbt Core projects as Apache Airflow DAGs and Task Groups with a few lines of code (★652, updated this week)
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow (★166, updated last year)
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow (★347, updated this week)
- Deequ is a library built on top of Apache Spark for defining "unit tests for data", which measure data quality in large datasets (★3,307, updated last month)
- This is a guide to PySpark code style presenting common situations and the associated best practices based on the most frequent recurring… (★1,049, updated last month)
- Great Expectations Airflow operator (★159, updated last week)
- Template for a data contract used in a data mesh (★462, updated 7 months ago)
- Data pipeline with dbt, Airflow, Great Expectations (★158, updated 3 years ago)
- The Athena adapter plugin for dbt (https://getdbt.com) (★141, updated last year)
- A Python library to support running data quality rules while the Spark job is running (★162, updated this week)
- Generate and visualize data lineage from query history (★311, updated last year)
- Pythonic programming framework to orchestrate jobs in Databricks Workflows (★187, updated last week)
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… (★221, updated last week)
- dbt package that is part of Elementary, the dbt-native data observability solution for data & analytics engineers. Monitor your data pipe… (★386, updated this week)
- This dbt package contains macros to support unit testing that can be (re)used across dbt projects (★421, updated 3 months ago)
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow (★178, updated 4 months ago)
- Home of the Open Data Contract Standard (ODCS) (★384, updated this week)