gmrqs / lasagna
A Docker Compose template that builds an interactive development environment for PySpark with JupyterLab, MinIO as object storage, a Hive Metastore, Trino, and Kafka
☆47 · Updated 11 months ago
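A minimal `docker-compose.yml` sketch of the kind of stack described above. Service names, image tags, ports, and credentials are illustrative assumptions, not taken from the repository:

```yaml
version: "3.8"
services:
  jupyter:            # JupyterLab with PySpark preinstalled
    image: jupyter/pyspark-notebook:latest
    ports: ["8888:8888"]
  minio:              # S3-compatible object storage
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: admin          # example credentials only
      MINIO_ROOT_PASSWORD: password
    ports: ["9000:9000", "9001:9001"]
  hive-metastore:     # catalog service for table metadata
    image: apache/hive:4.0.0
    environment:
      SERVICE_NAME: metastore
    ports: ["9083:9083"]
  trino:              # SQL query engine over the metastore/MinIO
    image: trinodb/trino:latest
    ports: ["8080:8080"]
  kafka:              # message broker (KRaft mode, no ZooKeeper)
    image: apache/kafka:latest
    ports: ["9092:9092"]
```

In a setup like this, Spark and Trino would typically be pointed at MinIO via S3A/S3 endpoint configuration and at the metastore via its Thrift URI (e.g. `thrift://hive-metastore:9083`); the actual wiring in the repository may differ.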
Alternatives and similar repositories for lasagna
Users interested in lasagna are comparing it to the repositories listed below.
- Delta-Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- Delta Lake examples ☆233 · Updated last year
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆75 · Updated 2 years ago
- Repo for everything open table formats (Iceberg, Hudi, Delta Lake) and the overall Lakehouse architecture ☆124 · Updated 2 weeks ago
- Quick Guides from Dremio on Several topics ☆79 · Updated last week
- ☆269 · Updated last year
- A Python Library to support running data quality rules while the spark job is running⚡ ☆193 · Updated this week
- Delta Lake Documentation ☆51 · Updated last year
- Delta Lake helper methods in PySpark ☆324 · Updated last year
- New Generation Opensource Data Stack Demo ☆452 · Updated 2 years ago
- Docker with Airflow and Spark standalone cluster ☆262 · Updated 2 years ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆223 · Updated 7 months ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆63 · Updated 2 years ago
- Code snippets for Data Engineering Design Patterns book ☆275 · Updated 8 months ago
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆274 · Updated last month
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆375 · Updated 6 months ago
- Code for dbt tutorial ☆165 · Updated 2 months ago
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆222 · Updated last week
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆256 · Updated last month
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated last year
- Code for blog at: https://www.startdataengineering.com/post/docker-for-de/ ☆40 · Updated last year
- build dw with dbt ☆49 · Updated last year
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆77 · Updated this week
- Turning PySpark Into a Universal DataFrame API ☆455 · Updated last week
- This repository serves as a comprehensive guide to effective data modeling and robust data quality assurance using popular open-source to… ☆35 · Updated 2 years ago
- New generation opensource data stack ☆75 · Updated 3 years ago
- Simple stream processing pipeline ☆110 · Updated last year
- Execution of DBT models using Apache Airflow through Docker Compose ☆125 · Updated 2 years ago
- Building a Data Pipeline with an Open Source Stack ☆55 · Updated 5 months ago
- PySpark test helper methods with beautiful error messages ☆730 · Updated 2 months ago