data-engineering-helpers / data-contracts
Food for thought around data contracts
☆26 · Updated last month
Alternatives and similar repositories for data-contracts
Users interested in data-contracts are comparing it to the libraries listed below:
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆219 · Updated last month
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆179 · Updated last year
- Template for a data contract used in a data mesh. ☆476 · Updated last year
- Data product portal created by Dataminded ☆190 · Updated this week
- Home of the Open Data Contract Standard (ODCS). ☆552 · Updated this week
- Delta Lake helper methods in PySpark ☆325 · Updated last year
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆266 · Updated 2 weeks ago
- A Python Library to support running data quality rules while the spark job is running ⚡ ☆188 · Updated this week
- Data pipeline with dbt, Airflow, Great Expectations ☆163 · Updated 4 years ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆220 · Updated 4 months ago
- Code snippets for Data Engineering Design Patterns book ☆191 · Updated 6 months ago
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆203 · Updated last week
- Sample configuration to deploy a modern data platform. ☆88 · Updated 3 years ago
- Python API for Deequ ☆796 · Updated 5 months ago (see the verification sketch after this list)
- A portable Datamart and Business Intelligence suite built with Docker, sqlmesh + dbtcore, DuckDB and Superset ☆53 · Updated 10 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 2 weeks ago
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆115 · Updated 5 months ago
- PySpark test helper methods with beautiful error messages ☆714 · Updated last month (see the DataFrame-comparison sketch after this list)
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆169 · Updated last year
- Great Expectations Airflow operator ☆167 · Updated last week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆64 · Updated 4 months ago
- The Data Contract Specification Repository ☆373 · Updated 3 weeks ago
- ☆157 · Updated 3 weeks ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆375 · Updated 4 months ago
- ☆243 · Updated last week
- Airbyte made simple (no UI, no database, no cluster) ☆183 · Updated 3 months ago
- Delta Lake examples ☆227 · Updated 11 months ago
- Ingesting data with Pulumi, AWS lambdas and Snowflake in a scalable, fully replayable manner ☆71 · Updated 3 years ago
- A SQL port of python's scikit-learn preprocessing module, provided as cross-database dbt macros. ☆185 · Updated 2 years ago
- Enforce Data Contracts ☆681 · Updated 2 weeks ago
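
The "Python API for Deequ" entry above presumably refers to PyDeequ. A minimal sketch of what a verification run looks like with that API, assuming PyDeequ's published quickstart (`Check`, `VerificationSuite`) and a local Spark session; the Spark version value and the sample DataFrame are illustrative assumptions, not from the listing:

```python
import os
from pyspark.sql import SparkSession
import pydeequ
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

# PyDeequ picks the matching Deequ artifact from this variable (assumed value).
os.environ["SPARK_VERSION"] = "3.3"

spark = (SparkSession.builder
         .config("spark.jars.packages", pydeequ.deequ_maven_coord)
         .config("spark.jars.excludes", pydeequ.f2j_maven_coord)
         .getOrCreate())

# Hypothetical sample data with one missing name.
df = spark.createDataFrame([(1, "a"), (2, None), (3, "c")], ["id", "name"])

# Declare expectations as a Check, then execute them with a VerificationSuite.
check = Check(spark, CheckLevel.Error, "basic data quality checks")
result = (VerificationSuite(spark)
          .onData(df)
          .addCheck(check.isComplete("id").isUnique("id"))
          .run())

# Inspect the per-constraint outcomes as a DataFrame.
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)
```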
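Likewise, the "PySpark test helper methods with beautiful error messages" entry matches the description of chispa. A minimal DataFrame-comparison sketch, assuming chispa's `assert_df_equality` helper; the session settings and sample rows are illustrative:

```python
from pyspark.sql import SparkSession
from chispa.dataframe_comparer import assert_df_equality

spark = SparkSession.builder.master("local[1]").appName("chispa-demo").getOrCreate()

# Expected vs. actual DataFrames; any mismatch raises with a row-by-row diff.
expected = spark.createDataFrame([("alice", 1), ("bob", 2)], ["name", "score"])
actual = spark.createDataFrame([("bob", 2), ("alice", 1)], ["name", "score"])

# Passes here because row order is explicitly ignored.
assert_df_equality(actual, expected, ignore_row_order=True)
```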