sodadata / soda-spark
Soda Spark is a PySpark library that helps you test your data in Spark DataFrames.
☆63 · Updated 3 years ago
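Like Soda SQL, which it wraps, Soda Spark reads its data quality checks from a YAML scan definition. As a rough sketch of what such a file looks like (the table and column names here are illustrative, not taken from any real project):

```yaml
# Hypothetical Soda scan definition (e.g. scan.yml).
# Table and column names are made up for illustration.
table_name: orders
metrics:
  - row_count
  - missing_count
tests:
  - row_count > 0           # fail the scan if the DataFrame is empty
columns:
  id:
    tests:
      - missing_count == 0  # fail if any id value is null
```

In soda-spark, a definition like this was passed to the library's scan entry point together with the DataFrame under test; since the project's deprecation (noted in the list below), the equivalent checks are expressed in Soda Core's SodaCL syntax instead.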
Alternatives and similar repositories for soda-spark
Users interested in soda-spark are comparing it to the libraries listed below:
- A Python Library to support running data quality rules while the spark job is running⚡ ☆194 · Updated last week
- Airflow Providers containing Deferrable Operators & Sensors from Astronomer ☆149 · Updated last week
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆222 · Updated last month
- Delta Lake helper methods in PySpark ☆326 · Updated last year
- Delta Lake examples ☆236 · Updated last year
- Schema modelling framework for decentralised domain-driven ownership of data. ☆260 · Updated 2 years ago
- Great Expectations Airflow operator ☆169 · Updated last month
- Spark style guide ☆271 · Updated last year
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆279 · Updated 3 months ago
- Enforce Best Practices for all your Airflow DAGs. ⭐ ☆107 · Updated this week
- This repository has moved into https://github.com/dbt-labs/dbt-adapters ☆444 · Updated 6 months ago
- Soda SQL and Soda Spark have been deprecated and replaced by Soda Core. docs.soda.io/soda-core/overview.html ☆62 · Updated 3 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆168 · Updated 2 years ago
- A repository of sample code to accompany our blog post on Airflow and dbt. ☆183 · Updated 2 years ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆224 · Updated 8 months ago
- Apache Airflow integration for dbt ☆411 · Updated last year
- A repository of sample code to show data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- ☆202 · Updated 2 years ago
- Data Product Portal created by Dataminded ☆197 · Updated this week
- Library to convert DBT manifest metadata to Airflow tasks ☆49 · Updated 3 weeks ago
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows ☆45 · Updated last week
- Generate and Visualize Data Lineage from query history ☆327 · Updated 2 years ago
- ✨ A Pydantic to PySpark schema library ☆118 · Updated last week
- ☆42 · Updated 4 years ago
- Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow. ☆377 · Updated 7 months ago
- The Trino (https://trino.io/) adapter plugin for dbt (https://getdbt.com) ☆253 · Updated last month
- A Table format agnostic data sharing framework ☆42 · Updated last year
- Delta Lake Documentation ☆52 · Updated last year
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated last year
- Fast iterative local development and testing of Apache Airflow workflows ☆202 · Updated 3 weeks ago