palantir / pyspark-style-guide
This guide to PySpark code style presents common situations and the associated best practices, based on the most frequently recurring topics across the PySpark repos we've encountered.
☆1,179 · Updated last month
Alternatives and similar repositories for pyspark-style-guide
Users interested in pyspark-style-guide are comparing it to the repositories listed below.
- PySpark test helper methods with beautiful error messages (☆717, updated 3 weeks ago)
- Python API for Deequ (☆797, updated 6 months ago)
- pyspark methods to enhance developer productivity 📣 👯 🎉 (☆674, updated 7 months ago)
- Delta Lake helper methods in PySpark (☆324, updated last year)
- Port(ish) of Great Expectations to dbt test macros (☆1,203, updated 9 months ago)
- Data quality testing for the modern data stack (SQL, Spark, and Pandas) https://www.soda.io (☆2,182, updated this week)
- The easiest way to run Airflow locally, with linting & tests for valid DAGs and Plugins (☆257, updated 4 years ago)
- Construct Apache Airflow DAGs declaratively via YAML configuration files (☆1,346, updated last week)
- Template for a data contract used in a data mesh (☆476, updated last year)
- A curated list of awesome dbt resources (☆1,556, updated last month)
- 🐍 Quick reference guide to common patterns & functions in PySpark (☆615, updated 2 years ago)
- Code for Data Pipelines with Apache Airflow (☆799, updated last year)
- Implementing best practices for PySpark ETL jobs and applications (☆2,004, updated 2 years ago)
- A self-contained dbt project for testing purposes (☆509, updated last year)
- Run your dbt Core or dbt Fusion projects as Apache Airflow DAGs and Task Groups with a few lines of code (☆1,050, updated last week)
- Open-source LeetCode for PySpark, Spark, Pandas, and dbt/Snowflake (☆212, updated 3 months ago)
- Data pipeline with dbt, Airflow, Great Expectations (☆163, updated 4 years ago)
- Apache Airflow integration for dbt (☆406, updated last year)
- Spark style guide (☆263, updated last year)
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow (☆375, updated 4 months ago)
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… (☆268, updated last month)
- This repository goes over how to handle massive variety in data engineering (☆303, updated 2 years ago)
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing (☆277, updated last year)
- Guides and docs to help you get up and running with Apache Airflow (☆811, updated 2 months ago)
- Code for the "Efficient Data Processing in Spark" course (☆340, updated 4 months ago)
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster (☆480, updated 11 months ago)
- Collection of dbt tips and tricks (☆394, updated 2 years ago)
- Utility functions for dbt projects (☆1,611, updated 3 weeks ago)
- Educational project on how to build an ETL (Extract, Transform, Load) data pipeline, orchestrated with Airflow (☆333, updated 3 years ago)