tcmlabs / hexagonal-architecture-python-spark
Hexagonal (ports and adapters) architecture applied to a Spark and Python data engineering project
☆33 · Updated 2 years ago
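As a quick illustration of the pattern this repository applies, below is a minimal ports-and-adapters sketch in PySpark. The class and function names (CustomerRepository, ParquetCustomerRepository, keep_active_customers) are hypothetical and not taken from the repository; the point is only the separation between a port, an adapter, and pure domain logic.

```python
# Minimal hexagonal (ports and adapters) sketch in PySpark.
# Illustrative only; names are hypothetical and not from the tcmlabs repo.
from abc import ABC, abstractmethod

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


class CustomerRepository(ABC):
    """Port: the domain depends only on this interface, never on I/O details."""

    @abstractmethod
    def load(self) -> DataFrame: ...

    @abstractmethod
    def save(self, df: DataFrame) -> None: ...


class ParquetCustomerRepository(CustomerRepository):
    """Adapter: one concrete implementation of the port, backed by Parquet files."""

    def __init__(self, spark: SparkSession, in_path: str, out_path: str):
        self.spark, self.in_path, self.out_path = spark, in_path, out_path

    def load(self) -> DataFrame:
        return self.spark.read.parquet(self.in_path)

    def save(self, df: DataFrame) -> None:
        df.write.mode("overwrite").parquet(self.out_path)


def keep_active_customers(customers: DataFrame) -> DataFrame:
    """Domain logic: a pure DataFrame-in / DataFrame-out transformation, easy to unit test."""
    return customers.filter(F.col("is_active")).select("customer_id", "country")


def run_pipeline(repo: CustomerRepository) -> None:
    """Application service: wires a port to the domain logic, unaware of which adapter is used."""
    repo.save(keep_active_customers(repo.load()))


if __name__ == "__main__":
    spark = SparkSession.builder.appName("hexagonal-demo").getOrCreate()
    run_pipeline(ParquetCustomerRepository(spark, "data/in/customers", "data/out/active"))
```

In tests, the Parquet adapter can be swapped for an in-memory one that returns a DataFrame built with `spark.createDataFrame`, which is the main benefit the pattern brings to Spark jobs.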
Alternatives and similar repositories for hexagonal-architecture-python-spark
Users interested in hexagonal-architecture-python-spark are comparing it to the libraries listed below.
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆226 · Updated this week
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆225 · Updated 9 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆81 · Updated last week
- Code snippets for Data Engineering Design Patterns book ☆331 · Updated last month
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆182 · Updated last year
- ☆120 · Updated 6 months ago
- Containerized end-to-end analytics of Spotify data using Python, dbt, Postgres, and Metabase ☆132 · Updated 3 years ago
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆236 · Updated 3 months ago
- Template for a data contract used in a data mesh. ☆486 · Updated last year
- Code for dbt tutorial ☆167 · Updated 4 months ago
- The Data Contract Specification Repository ☆403 · Updated last month
- Delta Lake helper methods in PySpark ☆327 · Updated 2 weeks ago
- A simple and easy to use Data Quality (DQ) tool built with Python. ☆51 · Updated 2 years ago
- Enforce Data Contracts ☆798 · Updated this week
- Data pipeline with dbt, Airflow, Great Expectations ☆166 · Updated 4 years ago
- Soda Spark is a PySpark library that helps you test your data in Spark DataFrames ☆63 · Updated 3 years ago
- A Python Library to support running data quality rules while the spark job is running ⚡ ☆197 · Updated this week
- ☆214 · Updated last year
- My first attempt at a rough ETL pipeline; technologies include Spark, GCS, Prefect orchestration, and Terraform ☆14 · Updated 3 years ago
- Example repository showing how to build a data platform with Prefect, dbt and Snowflake ☆109 · Updated 3 years ago
- Project demonstrating how to automate Prefect 2.0 deployments to AWS ECS Fargate ☆115 · Updated 2 years ago
- Just starting your DE journey or already along the way? I will be sharing a short list of DATA-ENGINEERING-CENTRED books that cover the… ☆34 · Updated 3 years ago
- ☆179 · Updated 5 months ago
- ✨ A Pydantic to PySpark schema library ☆118 · Updated last week
- Delta Lake Documentation ☆53 · Updated last year
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆211 · Updated last month
- Code for "Efficient Data Processing in Spark" Course ☆360 · Updated 3 months ago
- Run, mock and test fake Snowflake databases locally. ☆169 · Updated this week
- Apache Airflow integration for dbt ☆411 · Updated last year
- Enforce Best Practices for all your Airflow DAGs. ⭐ ☆108 · Updated last week