tcmlabs / hexagonal-architecture-python-spark
Hexagonal (ports and adapters) architecture applied to a Spark and Python data engineering project
☆33 · Updated 2 years ago
Alternatives and similar repositories for hexagonal-architecture-python-spark
Users interested in hexagonal-architecture-python-spark are comparing it to the libraries listed below.
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆218 · Updated 3 months ago
- Code snippets for the Data Engineering Design Patterns book ☆142 · Updated 4 months ago
- A curated list of awesome blogs, videos, tools and resources about Data Contracts ☆178 · Updated last year
- PySpark test helper methods with beautiful error messages ☆709 · Updated 2 weeks ago
- Possibly the fastest DataFrame-agnostic quality check library in town. ☆202 · Updated 3 weeks ago
- The Data Contract Specification Repository ☆370 · Updated last month
- ☆132 · Updated last year
- ☆119 · Updated 3 weeks ago
- Pythonic programming framework to orchestrate jobs in Databricks Workflows ☆218 · Updated 2 weeks ago
- Template for a data contract used in a data mesh. ☆473 · Updated last year
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆259 · Updated 2 weeks ago
- Delta Lake helper methods in PySpark ☆325 · Updated 11 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. ☆76 · Updated last year
- Project demonstrating how to automate Prefect 2.0 deployments to AWS ECS Fargate ☆115 · Updated 2 years ago
- Just starting your DE journey, or along the way already? I will be sharing a short list of DATA-ENGINEERING-CENTRED books that cover the… ☆34 · Updated 3 years ago
- Example repository showing how to build a data platform with Prefect, dbt and Snowflake ☆104 · Updated 2 years ago
- ☆205 · Updated 6 months ago
- A simple and easy-to-use Data Quality (DQ) tool built with Python. ☆50 · Updated last year
- Enforce Data Contracts ☆662 · Updated this week
- Soda Spark is a PySpark library that helps you test your data in Spark DataFrames ☆64 · Updated 3 years ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆70 · Updated 10 months ago
- Code for a dbt tutorial ☆159 · Updated 2 months ago
- Project for the "Data pipeline design patterns" blog. ☆45 · Updated last year
- ✨ A Pydantic-to-PySpark schema library ☆99 · Updated last week
- Local environment to practice Data Engineering ☆143 · Updated 7 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆189 · Updated this week
- Food for thought around data contracts ☆26 · Updated 3 weeks ago
- Code for my "Efficient Data Processing in SQL" book. ☆57 · Updated last year
- All things awesome related to Dagster! ☆123 · Updated last month
- Project documentation templates derived from CRISP-DM, to be used for Data Engineering projects. ☆55 · Updated 3 years ago
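Several of the repositories above (the DQ tool, the Spark data-quality library, the data-contract enforcers) revolve around the same idea: running named checks over rows of data and reporting violations. A minimal, library-agnostic sketch of that pattern in plain Python — all names here (`Rule`, `run_rules`) are hypothetical illustrations, not the API of any listed project:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Rule:
    """A data quality rule: a name plus a per-row predicate."""
    name: str
    check: Callable[[dict[str, Any]], bool]


def run_rules(rows: list[dict[str, Any]], rules: list[Rule]) -> dict[str, int]:
    """Return, per rule, the number of rows that violate it."""
    return {
        rule.name: sum(1 for row in rows if not rule.check(row))
        for rule in rules
    }


rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -3.5},    # violates non_negative_amount
    {"id": None, "amount": 7.0},  # violates id_not_null
]
rules = [
    Rule("id_not_null", lambda r: r["id"] is not None),
    Rule("non_negative_amount", lambda r: r["amount"] >= 0),
]
violations = run_rules(rows, rules)
```

The listed tools differ mainly in where this loop runs (driver-side Python vs. pushed down into Spark) and in how the rules are declared (code, YAML data contracts, or schema annotations).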