gtoonstra / databook
A facebook for data
☆26 · Updated 5 years ago
Alternatives and similar repositories for databook:
Users interested in databook are comparing it to the libraries listed below.
- ETLy is an add-on dashboard service on top of Apache Airflow. ☆70 · Updated last year
- PySpark for ETL jobs, including lineage to Apache Atlas, in one script via code inspection ☆18 · Updated 8 years ago
- [ARCHIVED] The Presto adapter plugin for dbt Core ☆33 · Updated last year
- A Spark-based data comparison tool at scale that helps software development engineers compare a plethora of pair combinations o… ☆50 · Updated last year
- ⚠️ MAINTENANCE-ONLY MODE: Snowplow-maintained SQL data models for working with Snowplow web and mobile behavioral data. ☆41 · Updated 3 months ago
- Scala SDK for working with Snowplow enriched events in Spark, AWS Lambda, Flink et al. ☆21 · Updated 5 months ago
- Stores Snowplow enriched events in Redshift, Snowflake and Databricks ☆31 · Updated last week
- Type-class-based data cleansing library for Apache Spark SQL ☆78 · Updated 5 years ago
- The sane way of building a data layer in Airflow ☆24 · Updated 5 years ago
- Run dbt serverless in the Cloud (AWS) ☆42 · Updated 5 years ago
- Bulletproof Apache Spark jobs with fast root cause analysis of failures. ☆72 · Updated 4 years ago
- SQL data model for working with Snowplow web data. Supports Redshift and Looker; Snowflake and BigQuery coming soon ☆60 · Updated 4 years ago
- Airflow plugin to transfer arbitrary files between operators ☆78 · Updated 6 years ago
- Lighthouse is a library for data lakes built on top of Apache Spark. It provides high-level APIs in Scala to streamline data pipelines an… ☆61 · Updated 7 months ago
- DBT Cloud Plugin for Airflow ☆38 · Updated 11 months ago
- Examples for High Performance Spark ☆15 · Updated 5 months ago
- 📆 Run, schedule, and manage your dbt jobs using Kubernetes. ☆24 · Updated 6 years ago
- Data validation library for PySpark 3.0.0 ☆33 · Updated 2 years ago
- Data ingestion library for Amundsen to build the graph and search index ☆205 · Updated last year
- A plugin for Apache Airflow that lets you run Spark Submit commands as an Operator ☆73 · Updated 5 years ago
- Profiles the data, validates the schema, runs data quality checks, and produces a report ☆20 · Updated 5 years ago
- JSON schema parser for Apache Spark ☆81 · Updated 2 years ago
- Mapping of DWH database tables to business entities, attributes & metrics in Python, with automatic creation of flattened tables ☆73 · Updated last year
- Composable filesystem hooks and operators for Apache Airflow. ☆17 · Updated 3 years ago
- Waimak is an open-source framework that makes it easier to create complex data flows in Apache Spark. ☆75 · Updated 11 months ago
- ☆72 · Updated 4 years ago
- Rokku project. This project acts as a proxy on top of any S3 storage solution, providing services like authentication, authorization, shor… ☆66 · Updated last month
- Streaming data changes to a Data Lake with Debezium and Delta Lake pipeline ☆75 · Updated 2 years ago
- A Getting Started Guide for developing and using Airflow Plugins ☆93 · Updated 6 years ago
- Data models for Snowplow Analytics. ☆128 · Updated 2 months ago