awslabs / python-deequ
Python API for Deequ
⭐ 744 · Updated 4 months ago
Alternatives and similar repositories for python-deequ:
Users interested in python-deequ are comparing it to the libraries listed below.
- PySpark test helper methods with beautiful error messages · ⭐ 663 · Updated last month
- pyspark methods to enhance developer productivity · ⭐ 661 · Updated 2 months ago
- Delta Lake helper methods in PySpark · ⭐ 315 · Updated 5 months ago
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks · ⭐ 419 · Updated last week
- Data quality testing for the modern data stack (SQL, Spark, and Pandas) https://www.soda.io · ⭐ 2,016 · Updated this week
- Deequ is a library built on top of Apache Spark for defining "unit tests for data", which measure data quality in large datasets. · ⭐ 3,363 · Updated last week
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow · ⭐ 199 · Updated last week
- A Python library to support running data quality rules while the Spark job is running · ⭐ 172 · Updated this week
- Template for a data contract used in a data mesh. · ⭐ 467 · Updated 11 months ago
- Port(ish) of Great Expectations to dbt test macros · ⭐ 1,141 · Updated 2 months ago
- Apache Airflow integration for dbt · ⭐ 401 · Updated 9 months ago
- An open protocol for secure data sharing · ⭐ 805 · Updated 3 weeks ago
- Great Expectations Airflow operator · ⭐ 159 · Updated this week
- This is a guide to PySpark code style presenting common situations and the associated best practices based on the most frequent recurring… · ⭐ 1,103 · Updated 5 months ago
- The Athena adapter plugin for dbt (https://getdbt.com) · ⭐ 140 · Updated last year
- Macros that generate dbt code · ⭐ 523 · Updated 3 weeks ago
- Data pipeline with dbt, Airflow, Great Expectations · ⭐ 160 · Updated 3 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. · ⭐ 166 · Updated last year
- The Athena adapter plugin for dbt (https://getdbt.com) · ⭐ 243 · Updated 2 weeks ago
- Turning PySpark Into a Universal DataFrame API · ⭐ 366 · Updated this week
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… · ⭐ 234 · Updated 2 weeks ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. · ⭐ 362 · Updated last week
- Guides and docs to help you get up and running with Apache Airflow. · ⭐ 805 · Updated 2 years ago
- An open standard for lineage metadata collection · ⭐ 1,848 · Updated this week
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. · ⭐ 189 · Updated last week
- dbt package that is part of Elementary, the dbt-native data observability solution for data & analytics engineers. Monitor your data pipe… · ⭐ 410 · Updated this week
- Dynamically generate Apache Airflow DAGs from YAML configuration files · ⭐ 1,247 · Updated 2 weeks ago
- Generate and visualize data lineage from query history · ⭐ 319 · Updated last year
- List of `pre-commit` hooks to ensure the quality of your `dbt` projects. · ⭐ 625 · Updated 2 weeks ago
- BigQuery data source for Apache Spark: read data from BigQuery into DataFrames, write DataFrames into BigQuery tables. · ⭐ 386 · Updated last week