MDS-BD / hands-on-great-expectations-with-spark
How to evaluate the quality of your data with Great Expectations and Spark.
☆31 · Updated 2 years ago
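To give a flavour of what the repository covers, here is a minimal sketch of a data quality check against a Spark DataFrame. It assumes the legacy SparkDFDataset wrapper available in older Great Expectations releases; the column names and toy data are illustrative and not taken from the repository itself.

```python
# Minimal sketch, not the repository's actual code: validates a toy Spark
# DataFrame with the legacy great_expectations.dataset.SparkDFDataset wrapper
# (present in older Great Expectations releases). Column names are illustrative.
from pyspark.sql import SparkSession
from great_expectations.dataset import SparkDFDataset

spark = SparkSession.builder.appName("ge-spark-sketch").getOrCreate()

# Toy data standing in for a real table.
df = spark.createDataFrame(
    [(1, 120.0), (2, None), (3, 87.5)],
    ["order_id", "amount"],
)

# Wrap the DataFrame so expectations can be evaluated against it.
ge_df = SparkDFDataset(df)

# Each call evaluates one expectation and returns a result object.
print(ge_df.expect_column_values_to_not_be_null("amount"))
print(ge_df.expect_column_values_to_be_between("amount", min_value=0, max_value=1000))

# Re-run every expectation declared above and get a combined validation report.
print(ge_df.validate())
```

Newer Great Expectations versions route the same checks through a Data Context and Validator instead of the dataset wrapper, but the expectation names themselves are unchanged.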
Alternatives and similar repositories for hands-on-great-expectations-with-spark
Users interested in hands-on-great-expectations-with-spark are comparing it to the libraries listed below.
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows ☆43 · Updated 10 months ago
- Data validation library for PySpark 3.0.0 ☆33 · Updated 2 years ago
- Fake Pandas / PySpark DataFrame creator ☆47 · Updated last year
- Delta Lake helper methods. No Spark dependency. ☆23 · Updated 8 months ago
- Code snippets used in demos recorded for the blog. ☆37 · Updated last month
- Declarative text-based tool for data analysts and engineers to extract, load, transform and orchestrate their data pipelines. ☆117 · Updated this week
- Spark and Delta Lake Workshop ☆22 · Updated 2 years ago
- Ingesting data with Pulumi, AWS Lambdas and Snowflake in a scalable, fully replayable manner ☆71 · Updated 3 years ago
- ☆19 · Updated 2 years ago
- A Python PySpark project with Poetry ☆23 · Updated 8 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆188 · Updated last week
- dbt's adapter for Dremio ☆48 · Updated 2 years ago
- Magic to help Spark pipelines upgrade ☆35 · Updated 8 months ago
- A flake8 plugin that detects usage of withColumn in a loop or inside reduce ☆27 · Updated 4 months ago
- A dbt (data build tool) project you can use for testing purposes or experimentation ☆36 · Updated last year
- Weekly Data Engineering Newsletter ☆95 · Updated 10 months ago
- Demo repository to lambda-fy your dbt runs ☆11 · Updated last year
- A lightweight helper utility which allows developers to do interactive pipeline development by having a unified source code for both DLT … ☆49 · Updated 2 years ago
- Evaluation Matrix for Change Data Capture ☆25 · Updated 10 months ago
- A cool, simple example of functional data engineering ☆33 · Updated 2 years ago
- Pythonic programming framework to orchestrate jobs in Databricks Workflows ☆216 · Updated 3 weeks ago
- Library to convert dbt manifest metadata to Airflow tasks ☆48 · Updated last year
- Trino dbt demo project to mix BigQuery data with, and load it into, a local PostgreSQL database ☆75 · Updated 3 years ago
- A SQL port of Python's scikit-learn preprocessing module, provided as cross-database dbt macros. ☆184 · Updated last year
- The go-to demo for public and private dbt Learn ☆77 · Updated 2 months ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆77 · Updated 2 years ago
- Soda Spark is a PySpark library that helps you test your data in Spark DataFrames ☆63 · Updated 2 years ago
- Delta Lake examples ☆225 · Updated 7 months ago
- A tool to validate data, built around Apache Spark. ☆101 · Updated 3 weeks ago
- ☆148 · Updated this week