gmrqs / lasagna
A Docker Compose template that builds an interactive development environment for PySpark with Jupyter Lab, MinIO as object storage, a Hive Metastore, Trino, and Kafka
☆47 · Updated last year
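The stack described above connects Jupyter/PySpark to MinIO through Hadoop's S3A connector. A minimal sketch of what that wiring looks like from the notebook side (the endpoint, bucket, and the default `minioadmin` credentials here are illustrative assumptions, not values taken from the repo):

```python
# Minimal sketch of a PySpark -> MinIO (S3A) configuration.
# Assumptions: MinIO is reachable at localhost:9000 with the default
# minioadmin/minioadmin credentials; adjust to match your compose file.
s3a_conf = {
    "spark.hadoop.fs.s3a.endpoint": "http://localhost:9000",
    "spark.hadoop.fs.s3a.access.key": "minioadmin",
    "spark.hadoop.fs.s3a.secret.key": "minioadmin",
    # MinIO serves buckets under the host path, not as subdomains.
    "spark.hadoop.fs.s3a.path.style.access": "true",
    "spark.hadoop.fs.s3a.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
}

def build_spark(app_name="lasagna-demo"):
    # Deferred import so the sketch can be read without pyspark installed.
    from pyspark.sql import SparkSession

    builder = SparkSession.builder.appName(app_name)
    for key, value in s3a_conf.items():
        builder = builder.config(key, value)
    return builder.getOrCreate()
```

With a session built this way, `spark.read.parquet("s3a://my-bucket/path")` resolves against MinIO instead of AWS S3; the hypothetical bucket name is yours to replace.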
Alternatives and similar repositories for lasagna
Users interested in lasagna are comparing it to the repositories listed below
- Delta Lake examples ☆235 · Updated last year
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino and a Hive Metastore. Can be used for local testin… ☆75 · Updated 2 years ago
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- Repo for everything open table formats (Iceberg, Hudi, Delta Lake) and the overall Lakehouse architecture ☆125 · Updated last month
- Code for dbt tutorial ☆165 · Updated 3 months ago
- ☆269 · Updated last year
- New Generation Opensource Data Stack Demo ☆453 · Updated 2 years ago
- Delta Lake helper methods in PySpark ☆325 · Updated last year
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆257 · Updated last week
- Code snippets for the Data Engineering Design Patterns book ☆294 · Updated 9 months ago
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated last year
- Quick Guides from Dremio on several topics ☆79 · Updated last month
- Data Product Portal created by Dataminded ☆196 · Updated last week
- Delta Lake Documentation ☆51 · Updated last year
- Building a Data Lakehouse with open source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the Lakehouse, visualize a… ☆34 · Updated this week
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆222 · Updated 2 weeks ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, dbt, Airflow, Kafka, Debezium CDC) ☆63 · Updated 2 years ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆377 · Updated 7 months ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆225 · Updated 7 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆193 · Updated this week
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆277 · Updated 2 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆79 · Updated 2 years ago
- Build a data warehouse with dbt ☆49 · Updated last year
- Data pipeline with dbt, Airflow, Great Expectations ☆165 · Updated 4 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆79 · Updated this week
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 5 months ago
- The Trino (https://trino.io/) adapter plugin for dbt (https://getdbt.com) ☆253 · Updated this week
- New generation opensource data stack ☆76 · Updated 3 years ago
- Template for a data contract used in a data mesh. ☆485 · Updated last year
- PySpark test helper methods with beautiful error messages ☆739 · Updated 2 weeks ago