data-engineering-helpers / data-contracts
Food for thought around data contracts
☆26 · Updated last month
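The projects below all circle around the same idea: a data contract pins down the expected schema and quality rules of a dataset so producers and consumers can verify it automatically. As a purely illustrative sketch (not taken from this repository or any project listed here, with made-up field names and types), a minimal contract check in plain Python could look like this:

```python
# Illustrative only: a minimal, hand-rolled data contract check.
# Field names, types, and the sample record are hypothetical; real
# tooling (e.g. ODCS-based contracts) covers far more than this.
from dataclasses import dataclass


@dataclass(frozen=True)
class FieldSpec:
    name: str
    dtype: type
    nullable: bool = False


# A toy "contract" for an orders dataset (assumed schema).
ORDERS_CONTRACT = [
    FieldSpec("order_id", str),
    FieldSpec("amount", float),
    FieldSpec("customer_email", str, nullable=True),
]


def violations(record: dict, contract: list[FieldSpec]) -> list[str]:
    """Return human-readable contract violations for a single record."""
    problems = []
    for spec in contract:
        if spec.name not in record:
            problems.append(f"missing field: {spec.name}")
            continue
        value = record[spec.name]
        if value is None:
            if not spec.nullable:
                problems.append(f"null not allowed: {spec.name}")
        elif not isinstance(value, spec.dtype):
            problems.append(f"wrong type for {spec.name}: {type(value).__name__}")
    return problems


if __name__ == "__main__":
    sample = {"order_id": "o-123", "amount": "19.99"}  # amount has the wrong type
    print(violations(sample, ORDERS_CONTRACT))
    # ['wrong type for amount: str', 'missing field: customer_email']
```

Real contracts layer ownership, SLAs, and semantic quality rules on top of the schema, which is what most of the repositories listed below provide.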
Alternatives and similar repositories for data-contracts
Users interested in data-contracts are comparing it to the libraries listed below.
- Template for a data contract used in a data mesh. ☆475 · Updated last year
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆260 · Updated last month
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆218 · Updated last month
- Data product portal created by Dataminded ☆190 · Updated last week
- Delta Lake helper methods in PySpark ☆325 · Updated 11 months ago
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆178 · Updated last year
- Home of the Open Data Contract Standard (ODCS). ☆535 · Updated last week
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆219 · Updated 4 months ago
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆202 · Updated last week
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆115 · Updated 5 months ago
- Data pipeline with dbt, Airflow, Great Expectations ☆163 · Updated 4 years ago
- Code snippets for the Data Engineering Design Patterns book ☆150 · Updated 5 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 2 weeks ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆189 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆64 · Updated 3 months ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆169 · Updated last year
- ☆135 · Updated last week
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆207 · Updated last week
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆375 · Updated 3 months ago
- Sample configuration to deploy a modern data platform. ☆88 · Updated 3 years ago
- A portable Datamart and Business Intelligence suite built with Docker, sqlmesh + dbtcore, DuckDB and Superset ☆52 · Updated 9 months ago
- Demo of Streamlit application with Databricks SQL Endpoint ☆34 · Updated 2 years ago
- A SQL port of Python's scikit-learn preprocessing module, provided as cross-database dbt macros. ☆185 · Updated 2 years ago
- Code for dbt tutorial ☆159 · Updated 2 months ago
- ☆119 · Updated last month
- This repository provides various demos/examples of using Snowpark for Python. ☆283 · Updated last year
- Ingesting data with Pulumi, AWS Lambdas and Snowflake in a scalable, fully replayable manner ☆71 · Updated 3 years ago
- 🧱 A collection of supplementary utilities and helper notebooks to perform admin tasks on Databricks ☆56 · Updated last month
- Great Expectations Airflow operator ☆167 · Updated last week
- Enforce Data Contracts ☆674 · Updated this week