adidas / lakehouse-engine
The Lakehouse Engine is a configuration-driven Spark framework, written in Python, that serves as a scalable and distributed engine for lakehouse algorithms, data flows, and utilities for Data Products.
☆235 · Updated last month
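To make the "configuration-driven" idea concrete, here is a minimal sketch of the pattern: a single config dict declares the transformations to run, and a small runner interprets it. The keys (`transform_specs`, `function`) and the `run_flow` helper are hypothetical illustrations of the style, not the Lakehouse Engine's actual API.

```python
# Illustrative sketch of a configuration-driven data flow (hypothetical
# names; NOT the lakehouse-engine API). A config dict declares the steps,
# and a generic runner executes them against the data.
from typing import Any, Callable

# Registry of named transformations that configs may reference.
TRANSFORMS: dict[str, Callable[[list[dict]], list[dict]]] = {
    "drop_nulls": lambda rows: [
        r for r in rows if all(v is not None for v in r.values())
    ],
    "uppercase_keys": lambda rows: [
        {k.upper(): v for k, v in r.items()} for r in rows
    ],
}

def run_flow(config: dict[str, Any], data: list[dict]) -> list[dict]:
    """Apply each transform named in the config, in declaration order."""
    for spec in config.get("transform_specs", []):
        data = TRANSFORMS[spec["function"]](data)
    return data

config = {
    "transform_specs": [
        {"function": "drop_nulls"},
        {"function": "uppercase_keys"},
    ],
}

rows = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
result = run_flow(config, rows)
print(result)  # [{'ID': 1, 'NAME': 'a'}]
```

The appeal of this style is that pipelines become declarative data rather than code: changing a flow means editing configuration, not rewriting Spark jobs.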
Alternatives and similar repositories for lakehouse-engine:
Users interested in lakehouse-engine are comparing it to the libraries listed below.
- Delta Lake helper methods in PySpark · ☆322 · Updated 6 months ago
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow · ☆207 · Updated 3 weeks ago
- A Python library to support running data quality rules while the Spark job is running ⚡ · ☆180 · Updated this week
- Delta Lake examples · ☆218 · Updated 5 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow · ☆64 · Updated 5 months ago
- dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks · ☆422 · Updated last month
- PySpark test helper methods with beautiful error messages · ☆672 · Updated 2 weeks ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos · ☆59 · Updated 5 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow · ☆193 · Updated last month
- A repository of sample code to accompany our blog post on Airflow and dbt · ☆170 · Updated last year
- Data pipeline with dbt, Airflow, Great Expectations · ☆161 · Updated 3 years ago
- Data product portal created by Dataminded · ☆180 · Updated this week
- A collection of tools to deploy, manage, and operate a Databricks-based Lakehouse · ☆44 · Updated last month
- ☆112 · Updated 7 months ago
- Code snippets for the Data Engineering Design Patterns book · ☆74 · Updated last month
- A curated list of awesome blogs, videos, tools, and resources about Data Contracts · ☆172 · Updated 7 months ago
- Demo of using Nutter for testing Databricks notebooks in a CI/CD pipeline · ☆150 · Updated 7 months ago
- Code for dbt tutorial · ☆153 · Updated 9 months ago
- Home of the Open Data Contract Standard (ODCS) · ☆464 · Updated last week
- Local Environment to Practice Data Engineering · ☆142 · Updated 2 months ago
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows · ☆43 · Updated 8 months ago
- Custom PySpark Data Sources · ☆41 · Updated 2 months ago
- Scalefree's dbt package for a Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt, including… · ☆151 · Updated this week
- Code samples, etc., for Databricks · ☆63 · Updated this week
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow · ☆367 · Updated this week
- 🧱 A collection of supplementary utilities and helper notebooks to perform admin tasks on Databricks · ☆54 · Updated 3 months ago
- Spark style guide · ☆258 · Updated 5 months ago
- Delta Lake Documentation · ☆49 · Updated 9 months ago
- Template for a data contract used in a data mesh · ☆470 · Updated last year