mrpowers-io / quinn
PySpark methods to enhance developer productivity
☆661 · Updated 2 months ago
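quinn bundles small helper functions for PySpark development, such as schema validation for DataFrames. As a dependency-free illustration of that pattern (plain Python over dict-based schemas; the names here are illustrative, not quinn's actual API):

```python
# Dependency-free sketch of the schema-validation pattern that helper
# libraries like quinn provide for PySpark DataFrames.
# validate_schema and the {column: type} schema shape are illustrative
# assumptions, not quinn's real API.

def validate_schema(actual: dict, required: dict) -> None:
    """Raise ValueError if a required column is missing or mistyped."""
    missing = {c: t for c, t in required.items() if c not in actual}
    mismatched = {
        c: (actual[c], t)
        for c, t in required.items()
        if c in actual and actual[c] != t
    }
    if missing or mismatched:
        raise ValueError(
            f"missing columns: {missing}; type mismatches: {mismatched}"
        )

# Usage: schemas modeled as {column_name: type_name} mappings.
df_schema = {"id": "bigint", "name": "string"}
validate_schema(df_schema, {"id": "bigint"})  # passes silently
try:
    validate_schema(df_schema, {"age": "int"})
except ValueError as e:
    print(e)  # reports the missing "age" column
```

In quinn itself the same idea runs against a DataFrame's `StructType` schema rather than a plain dict, so validation fails fast before a long Spark job executes.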
Alternatives and similar repositories for quinn:
Users interested in quinn are comparing it to the libraries listed below.
- PySpark test helper methods with beautiful error messages ☆663 · Updated last month
- Spark style guide ☆257 · Updated 4 months ago
- Delta Lake helper methods in PySpark ☆315 · Updated 5 months ago
- Python API for Deequ ☆744 · Updated 4 months ago
- Essential Spark extensions and helper methods ☆756 · Updated 3 months ago
- A Python library for running data quality rules while a Spark job is running ☆173 · Updated this week
- dbt-spark contains all the code enabling dbt to work with Apache Spark and Databricks ☆419 · Updated last week
- Apache Airflow integration for dbt ☆401 · Updated 9 months ago
- Pythonic programming framework to orchestrate jobs in Databricks Workflows ☆199 · Updated last week
- This is the development repository for sparkMeasure, a tool and library designed for efficient analysis and troubleshooting of Apache Spark… ☆727 · Updated 2 weeks ago
- A simplified, lightweight ETL framework based on Apache Spark ☆586 · Updated last year
- ☆198 · Updated last year
- Apache Spark testing helpers (dependency-free; works with ScalaTest, uTest, and MUnit) ☆439 · Updated this week
- A library that provides useful extensions to Apache Spark and PySpark ☆214 · Updated 2 months ago
- Qubole Sparklens tool for performance tuning Apache Spark ☆570 · Updated 7 months ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow ☆166 · Updated last year
- Create HTML profiling reports from Apache Spark DataFrames ☆195 · Updated 5 years ago
- Soda Spark is a PySpark library that helps you test your data in Spark DataFrames ☆63 · Updated 2 years ago
- Snowflake Data Source for Apache Spark ☆224 · Updated 2 months ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow ☆362 · Updated this week
- A boilerplate for writing PySpark jobs ☆397 · Updated last year
- Schema modelling framework for decentralised, domain-driven ownership of data ☆250 · Updated last year
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆234 · Updated 2 weeks ago
- pytest plugin to run tests with support for PySpark ☆85 · Updated 11 months ago
- Testing framework for Databricks notebooks ☆294 · Updated 10 months ago
- Data lineage tracking and visualization solution ☆612 · Updated 2 weeks ago
- Great Expectations Airflow operator ☆159 · Updated this week
- Delta Lake examples ☆217 · Updated 4 months ago
- ☆43 · Updated 3 years ago
- Template for a data contract used in a data mesh ☆467 · Updated 11 months ago