data-engineering-helpers / data-contracts
Food for thought around data contracts
☆26 · Updated 3 months ago
Alternatives and similar repositories for data-contracts
Users who are interested in data-contracts are comparing it to the libraries listed below.
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆180 · Updated last year
- Data product portal created by Dataminded ☆193 · Updated this week
- Delta Lake helper methods in PySpark ☆323 · Updated last year
- Template for a data contract used in a data mesh. ☆479 · Updated last year
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆221 · Updated 3 weeks ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆223 · Updated 6 months ago
- Home of the Open Data Contract Standard (ODCS). ☆574 · Updated 2 weeks ago
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆119 · Updated 7 months ago
- A SQL port of Python's scikit-learn preprocessing module, provided as cross-database dbt macros. ☆186 · Updated 2 years ago
- Data pipeline with dbt, Airflow, Great Expectations ☆164 · Updated 4 years ago
- Ingesting data with Pulumi, AWS lambdas and Snowflake in a scalable, fully replayable manner ☆71 · Updated 3 years ago
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆270 · Updated 3 weeks ago
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆223 · Updated last week
- ☆282 · Updated this week
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆190 · Updated this week
- A portable Datamart and Business Intelligence suite built with Docker, sqlmesh + dbtcore, DuckDB and Superset ☆54 · Updated 3 weeks ago
- Code snippets for Data Engineering Design Patterns book ☆256 · Updated 7 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆208 · Updated this week
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆169 · Updated 2 years ago
- Sample configuration to deploy a modern data platform. ☆88 · Updated 3 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆74 · Updated 2 weeks ago
- This repository provides various demos/examples of using Snowpark for Python. ☆284 · Updated last year
- Schema modelling framework for decentralised domain-driven ownership of data. ☆259 · Updated last year
- A Python package to help Databricks Unity Catalog users to read and query Delta Lake tables with Polars, DuckDb, or PyArrow. ☆26 · Updated last year
- Code for dbt tutorial ☆162 · Updated last month
- ☆60 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆64 · Updated 5 months ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆375 · Updated 5 months ago
- Delta Lake examples ☆230 · Updated last year
- Airbyte made simple (no UI, no database, no cluster) ☆184 · Updated 4 months ago