awslabs / python-deequ
Python API for Deequ
⭐ 738 · Updated 3 months ago
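For context, python-deequ (PyDeequ) exposes Deequ's Spark-based data-quality checks from Python. Below is a minimal sketch of a verification run following the PyDeequ quickstart pattern; the sample DataFrame, column names, and check description are illustrative, and the exact Maven coordinates resolved by `pydeequ` depend on your Spark version.

```python
from pyspark.sql import SparkSession

import pydeequ
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

# Spark session wired to the Deequ JAR, as in the PyDeequ README
spark = (SparkSession.builder
         .config("spark.jars.packages", pydeequ.deequ_maven_coord)
         .config("spark.jars.excludes", pydeequ.f2j_maven_coord)
         .getOrCreate())

# Illustrative data; swap in your own DataFrame
df = spark.createDataFrame([(1, "thingA"), (2, "thingB"), (3, None)],
                           ["id", "product_name"])

# Declare constraints and run the verification suite
check = Check(spark, CheckLevel.Error, "basic data quality checks")
result = (VerificationSuite(spark)
          .onData(df)
          .addCheck(check.isComplete("id").isUnique("id"))
          .run())

# Inspect which constraints passed or failed
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)
```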
Alternatives and similar repositories for python-deequ:
Users interested in python-deequ are comparing it to the libraries listed below.
- PySpark test helper methods with beautiful error messages ⭐ 648 · Updated this week
- pyspark methods to enhance developer productivity ⭐ 657 · Updated last month
- Apache Airflow integration for dbt ⭐ 400 · Updated 8 months ago
- Template for a data contract used in a data mesh. ⭐ 467 · Updated 10 months ago
- Delta Lake helper methods in PySpark ⭐ 312 · Updated 4 months ago
- Port(ish) of Great Expectations to dbt test macros ⭐ 1,127 · Updated last month
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ⭐ 360 · Updated this week
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks ⭐ 413 · Updated this week
- Data quality testing for the modern data stack (SQL, Spark, and Pandas) https://www.soda.io ⭐ 1,973 · Updated this week
- Great Expectations Airflow operator ⭐ 160 · Updated 2 months ago
- A Python Library to support running data quality rules while the spark job is running ⭐ 167 · Updated last week
- This is a guide to PySpark code style presenting common situations and the associated best practices based on the most frequent recurring… ⭐ 1,094 · Updated 3 months ago
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ⭐ 229 · Updated 2 months ago
- Macros that generate dbt code ⭐ 514 · Updated last month
- The athena adapter plugin for dbt (https://getdbt.com) ⭐ 237 · Updated this week
- The athena adapter plugin for dbt (https://getdbt.com) ⭐ 141 · Updated last year
- Run your dbt Core projects as Apache Airflow DAGs and Task Groups with a few lines of code ⭐ 818 · Updated this week
- Data pipeline with dbt, Airflow, Great Expectations ⭐ 160 · Updated 3 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ⭐ 166 · Updated last year
- Deequ is a library built on top of Apache Spark for defining "unit tests for data", which measure data quality in large datasets. ⭐ 3,348 · Updated this week
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ⭐ 192 · Updated 3 weeks ago
- Turning PySpark Into a Universal DataFrame API ⭐ 349 · Updated this week
- Dynamically generate Apache Airflow DAGs from YAML configuration files ⭐ 1,231 · Updated this week
- BigQuery data source for Apache Spark: Read data from BigQuery into DataFrames, write DataFrames into BigQuery tables. ⭐ 381 · Updated 3 weeks ago