bitol-io / open-data-contract-standard
Home of the Open Data Contract Standard (ODCS).
☆ 392 · Updated last week
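For orientation, ODCS describes data contracts as YAML documents, and the related CLI listed below operates on `datacontract.yaml` files. Here is a minimal sketch of inspecting such a file, assuming PyYAML is installed and a local `datacontract.yaml` exists; the key names checked are illustrative placeholders, not a definitive statement of the ODCS schema:

```python
# Minimal sketch: load a local data contract file and report its top-level fields.
# Assumes PyYAML is installed (`pip install pyyaml`). The file name and the key
# list below are illustrative, not taken from the ODCS specification itself.
import yaml

ILLUSTRATIVE_KEYS = ["apiVersion", "kind", "id", "version", "status"]  # hypothetical

def summarize_contract(path: str = "datacontract.yaml") -> None:
    with open(path, "r", encoding="utf-8") as handle:
        contract = yaml.safe_load(handle) or {}
    if not isinstance(contract, dict):
        raise ValueError(f"{path} did not parse to a YAML mapping")
    print(f"Top-level fields in {path}: {sorted(contract)}")
    missing = [key for key in ILLUSTRATIVE_KEYS if key not in contract]
    if missing:
        print(f"Keys from this sketch's list not present: {missing}")

if __name__ == "__main__":
    summarize_contract()
```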
Related projects
Alternatives and complementary repositories for open-data-contract-standard
- CLI to manage your datacontract.yaml files (☆ 475, updated this week)
- Template for a data contract used in a data mesh (☆ 464, updated 8 months ago)
- The Data Contract Specification Repository (☆ 271, updated 3 weeks ago)
- Data product portal created by Dataminded (☆ 146, updated this week)
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks (☆ 405, updated last week)
- This package contains macros and models to find DAG issues automatically (☆ 451, updated last week)
- dbt package that is part of Elementary, the dbt-native data observability solution for data & analytics engineers. Monitor your data pipe… (☆ 393, updated this week)
- Delta Lake helper methods in PySpark (☆ 304, updated 2 months ago)
- List of `pre-commit` hooks to ensure the quality of your `dbt` projects (☆ 599, updated last week)
- A free-to-use dbt package for creating and loading Data Vault 2.0 compliant Data Warehouses (powered by dbt, an open source data engineer… (☆ 510, updated 3 months ago)
- A curated list of awesome blogs, videos, tools and resources about Data Contracts (☆ 166, updated 3 months ago)
- A dbt package for modelling dbt metadata. https://brooklyn-data.github.io/dbt_artifacts (☆ 331, updated 2 weeks ago)
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow (☆ 181, updated 4 months ago)
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow (☆ 350, updated this week)
- Macros that generate dbt code (☆ 492, updated last month)
- Scalefree's dbt package for a Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt including… (☆ 141, updated this week)
- Data pipeline with dbt, Airflow, Great Expectations (☆ 158, updated 3 years ago)
- Pythonic programming framework to orchestrate jobs in Databricks Workflows (☆ 189, updated last week)
- A Python library to support running data quality rules while the Spark job is running ⚡ (☆ 163, updated last week)
- This dbt package contains macros to support unit testing that can be (re)used across dbt projects (☆ 423, updated 3 months ago)
- A dbt package from SELECT to help you monitor Snowflake performance and costs (☆ 217, updated 2 months ago)
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects (☆ 195, updated this week)
- dbt-snowflake contains all of the code enabling dbt to work with Snowflake (☆ 295, updated this week)
- Dagster Labs' open-source data platform, built with Dagster (☆ 284, updated this week)
- A repository of sample code to accompany our blog post on Airflow and dbt (☆ 167, updated last year)
- dbt macros to stage external sources (☆ 312, updated last week)
- Schema modelling framework for decentralised domain-driven ownership of data (☆ 247, updated 11 months ago)
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… (☆ 223, updated 3 weeks ago)