jeremySrgt / mini-modern-data-stack
Deploy a complete data stack in just a couple of minutes.
☆15 · Updated last year
Alternatives and similar repositories for mini-modern-data-stack
Users interested in mini-modern-data-stack are comparing it to the libraries listed below.
- Local Environment to Practice Data Engineering ☆144 · Updated last year
- ☆179 · Updated 5 months ago
- Code for dbt tutorial ☆168 · Updated 5 months ago
- Build a data warehouse with dbt ☆50 · Updated last year
- Template for a data contract used in a data mesh. ☆486 · Updated last year
- Project utilising data from the Age of Empires API at https://aoestats.io ☆55 · Updated last year
- Building a Data Pipeline with an Open Source Stack ☆56 · Updated 7 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆80 · Updated 2 years ago
- Code for blog at: https://www.startdataengineering.com/post/docker-for-de/ ☆40 · Updated last year
- Code snippets for Data Engineering Design Patterns book ☆331 · Updated last month
- End-to-end data platform: A PoC Data Platform project utilizing modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆47 · Updated last year
- ☆121 · Updated 6 months ago
- Sample project to demonstrate data engineering best practices ☆202 · Updated last year
- Project documentation templates derived from CRISP-DM, to be used for Data Engineering projects. ☆60 · Updated 4 years ago
- In this repository we store all materials for dlt workshops, courses, etc. ☆248 · Updated 2 months ago
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆279 · Updated 4 months ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆67 · Updated 9 months ago
- A template repository to create a data project with IAC, CI/CD, Data migrations, & testing ☆284 · Updated last year
- ☆147 · Updated last year
- ☆214 · Updated last year
- Code for "Efficient Data Processing in Spark" Course ☆361 · Updated 3 months ago
- 🧱 A collection of supplementary utilities and helper notebooks to perform admin tasks on Databricks ☆57 · Updated 7 months ago
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… ☆20 · Updated 6 months ago
- Data pipeline with dbt, Airflow, Great Expectations ☆166 · Updated 4 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆82 · Updated 2 weeks ago
- Data Product Portal created by Dataminded ☆198 · Updated last week
- Home of the Open Data Contract Standard (ODCS). ☆660 · Updated last month
- This repository provides various demos/examples of using Snowpark for Python. ☆289 · Updated 2 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆211 · Updated last month
- ☆26 · Updated 2 years ago