agile-lab-dev / Data-Product-Specification
An open specification for data products in Data Mesh
☆61 · Updated 9 months ago
Alternatives and similar repositories for Data-Product-Specification
Users interested in Data-Product-Specification are comparing it to the libraries listed below.
- Support for generating modern platforms dynamically with services such as Kafka, Spark, StreamSets, HDFS, … ☆76 · Updated last week
- Schema modelling framework for decentralised domain-driven ownership of data. ☆256 · Updated last year
- Data product portal created by Dataminded ☆190 · Updated this week
- The Data Product Descriptor Specification (DPDS) Repository ☆80 · Updated 7 months ago
- Generate authentic-looking mock data based on a SQL, JSON or Avro schema and produce it to Kafka in JSON or Avro format. ☆165 · Updated 9 months ago
- Sample configuration to deploy a modern data platform. ☆88 · Updated 3 years ago
- A table-format-agnostic data sharing framework ☆38 · Updated last year
- Yet Another (Spark) ETL Framework ☆21 · Updated last year
- Delta Lake examples ☆226 · Updated 10 months ago
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆178 · Updated last year
- Home of the Open Data Contract Standard (ODCS). ☆528 · Updated last week
- Library to convert dbt manifest metadata to Airflow tasks ☆48 · Updated last year
- Data Tools Subjective List ☆86 · Updated 2 years ago
- Pythonic programming framework to orchestrate jobs in Databricks Workflows ☆218 · Updated last month
- Template for a data contract used in a data mesh. ☆475 · Updated last year
- ☆96 · Updated 2 years ago
- Weekly Data Engineering Newsletter ☆96 · Updated last year
- The Data Contract Specification Repository ☆371 · Updated last month
- Data Quality and Observability platform for the whole data lifecycle, from profiling new data sources to full automation with Data Observ… ☆161 · Updated last month
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆189 · Updated this week
- Data Mesh Architecture ☆81 · Updated last year
- The ODD Specification is a universal open standard for collecting metadata. ☆143 · Updated 9 months ago
- ☆34 · Updated 3 months ago
- Rules-based grant management for Snowflake ☆40 · Updated 6 years ago
- Unity Catalog UI ☆42 · Updated 11 months ago
- A write-audit-publish implementation on a data lake without the JVM ☆46 · Updated last year
- Adapter for dbt that executes dbt pipelines on Apache Flink ☆95 · Updated last year
- Make dbt docs and Apache Superset talk to one another ☆149 · Updated 7 months ago
- The Picnic Data Vault framework. ☆129 · Updated last year
- Open Data Stack Projects: examples of end-to-end data engineering projects ☆88 · Updated 2 years ago