adidas / lakehouse-engine
The Lakehouse Engine is a configuration-driven Spark framework, written in Python, that serves as a scalable and distributed engine for several lakehouse algorithms, data flows, and utilities for Data Products.
☆245 · Updated 3 months ago
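Typical usage is configuration-driven: a data flow is declared as a Python dictionary (an "ACON", in the project's terminology) and handed to the engine, which reads, transforms, and writes the data accordingly. The sketch below is a minimal, illustrative example only; the spec keys follow the project's documented ACON style, but the exact schema and entry point should be verified against the official docs, and the S3 paths and spec ids are hypothetical.

```python
# Minimal sketch of a configuration-driven batch load with the Lakehouse
# Engine. Spec keys mirror the project's documented ACON (algorithm
# configuration) style; the paths and spec ids are hypothetical.
from lakehouse_engine.engine import load_data

acon = {
    "input_specs": [
        {
            "spec_id": "sales_source",                 # hypothetical id
            "read_type": "batch",
            "data_format": "csv",
            "options": {"header": True, "delimiter": ","},
            "location": "s3://my-bucket/raw/sales/",   # hypothetical path
        }
    ],
    "output_specs": [
        {
            "spec_id": "sales_bronze",
            "input_id": "sales_source",
            "write_type": "append",
            "data_format": "delta",
            "location": "s3://my-bucket/bronze/sales/",  # hypothetical path
        }
    ],
}

# The engine resolves the specs into a Spark job: read the CSV input,
# then append it to the Delta location.
load_data(acon=acon)
```

Because the flow is declared as data rather than code, the same engine can serve many Data Products by swapping configuration files instead of rewriting Spark jobs.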
Alternatives and similar repositories for lakehouse-engine:
Users interested in lakehouse-engine are comparing it to the libraries listed below.
- Delta Lake helper methods in PySpark ☆322 · Updated 8 months ago
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflows ☆215 · Updated last week
- Delta Lake examples ☆224 · Updated 7 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆187 · Updated this week
- ☆114 · Updated 9 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆68 · Updated 7 months ago
- PySpark test helper methods with beautiful error messages ☆686 · Updated 3 weeks ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆59 · Updated 6 months ago
- Code samples, etc. for Databricks ☆64 · Updated last month
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆114 · Updated last month
- Data product portal created by Dataminded ☆184 · Updated this week
- Code snippets for the Data Engineering Design Patterns book ☆101 · Updated last month
- Custom PySpark Data Sources ☆50 · Updated last week
- Local Environment to Practice Data Engineering ☆143 · Updated 4 months ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆368 · Updated last week
- The DBSQL SME repo contains demos, tutorials, blog code, advanced production helper functions, and more! ☆60 · Updated 3 weeks ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆196 · Updated this week
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆219 · Updated last week
- Turning PySpark Into a Universal DataFrame API ☆390 · Updated last week
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆188 · Updated this week
- Scalefree's dbt package for a Data Vault 2.0 implementation congruent to the original Data Vault 2.0 definition by Dan Linstedt including… ☆153 · Updated 2 weeks ago
- A collection of tools to deploy, manage, and operate a Databricks-based lakehouse. ☆45 · Updated 3 months ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆139 · Updated 9 months ago
- Template for a data contract used in a data mesh. ☆472 · Updated last year
- Demo of using Nutter to test Databricks notebooks in a CI/CD pipeline ☆152 · Updated 8 months ago
- Spark style guide ☆258 · Updated 7 months ago
- Code for the "Efficient Data Processing in Spark" course ☆299 · Updated 7 months ago
- Sample code to accompany our blog post on Airflow and dbt. ☆172 · Updated last year
- Resources for the preparation course for the Databricks Data Engineer Professional certification exam ☆113 · Updated last month
- Delta Lake Documentation ☆49 · Updated 10 months ago