adidas / lakehouse-engine
The Lakehouse Engine is a configuration-driven Spark framework, written in Python, that serves as a scalable, distributed engine for lakehouse algorithms, data flows, and utilities for Data Products.
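To make "configuration-driven" concrete, here is a minimal sketch of the kind of algorithm configuration (ACON) the engine consumes: a declarative mapping of input, transform, and output specs rather than hand-written Spark code. The field names and values below are illustrative assumptions, not verified against a specific engine version; check the project's own docs for the exact schema.

```python
# Hypothetical ACON (algorithm configuration) for a batch load.
# Spec IDs, paths, and transformer names are assumptions for illustration.
acon = {
    "input_specs": [
        {
            "spec_id": "orders_bronze",       # assumed spec id
            "read_type": "batch",
            "data_format": "csv",
            "location": "s3://my-bucket/bronze/orders/",  # assumed path
        }
    ],
    "transform_specs": [
        {
            "spec_id": "orders_filtered",
            "input_id": "orders_bronze",      # chains from the input spec
            "transformers": [
                # assumed transformer name/args for a simple row filter
                {"function": "expression_filter", "args": {"exp": "amount > 0"}}
            ],
        }
    ],
    "output_specs": [
        {
            "spec_id": "orders_silver",
            "input_id": "orders_filtered",
            "data_format": "delta",
            "location": "s3://my-bucket/silver/orders/",  # assumed path
        }
    ],
}

# With the engine and a Spark session available, the entry point would be
# roughly (not executed here, as it requires a Spark cluster):
#   from lakehouse_engine.engine import load_data
#   load_data(acon=acon)
```

The point of the pattern is that the same engine code runs every flow; only the ACON dictionary changes per Data Product.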
☆278 · Updated 2 months ago
Alternatives and similar repositories for lakehouse-engine
Users interested in lakehouse-engine compare it to the libraries listed below.
- Delta Lake helper methods in PySpark ☆325 · Updated last year
- Delta Lake examples ☆235 · Updated last year
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆222 · Updated 3 weeks ago
- A Python Library to support running data quality rules while the spark job is running ⚡ ☆193 · Updated last week
- Data Product Portal created by Dataminded ☆196 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆65 · Updated 7 months ago
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆123 · Updated 8 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆80 · Updated this week
- Template for a data contract used in a data mesh. ☆486 · Updated last year
- PySpark test helper methods with beautiful error messages ☆740 · Updated last week
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆377 · Updated 7 months ago
- Scalefree's dbt package for a Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt including… ☆170 · Updated last week
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆208 · Updated last month
- ☆169 · Updated 4 months ago
- Custom PySpark Data Sources ☆83 · Updated 2 weeks ago
- Home of the Open Data Contract Standard (ODCS). ☆615 · Updated 2 weeks ago
- A repository of sample code to accompany our blog post on Airflow and dbt. ☆182 · Updated 2 years ago
- Data pipeline with dbt, Airflow, Great Expectations ☆165 · Updated 4 years ago
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows ☆45 · Updated last week
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆225 · Updated 7 months ago
- Code snippets for Data Engineering Design Patterns book ☆296 · Updated last week
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆157 · Updated last year
- Code samples, etc. for Databricks ☆73 · Updated 6 months ago
- The Data Contract Specification Repository ☆401 · Updated 2 weeks ago
- Code for dbt tutorial ☆165 · Updated 3 months ago
- This dbt package captures metadata, artifacts, and test results so you can detect anomalies, monitor data quality, and build metadata tab… ☆473 · Updated this week
- This repository has moved into https://github.com/dbt-labs/dbt-adapters ☆444 · Updated 5 months ago
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆257 · Updated last week
- Databricks Implementation of the TPC-DI Specification using Traditional Notebooks and/or Delta Live Tables ☆91 · Updated 5 months ago
- DBSQL SME Repo contains demos, tutorials, blog code, advanced production helper functions and more! ☆78 · Updated 3 weeks ago