ssp-data / data-engineering-devops
Full-stack data engineering tools and infrastructure setup
☆50 · Updated 4 years ago
Alternatives and similar repositories for data-engineering-devops:
Users interested in data-engineering-devops are comparing it to the libraries listed below.
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 7 months ago
- A modern ELT demo using airbyte, dbt, snowflake and dagster ☆27 · Updated 2 years ago
- A simple and easy-to-use Data Quality (DQ) tool built with Python. ☆50 · Updated last year
- Cost Efficient Data Pipelines with DuckDB ☆51 · Updated 8 months ago
- ☆17 · Updated 7 months ago
- Data-aware orchestration with dagster, dbt, and airbyte ☆31 · Updated 2 years ago
- Open Data Stack Projects: Examples of End to End Data Engineering Projects ☆78 · Updated last year
- ☆75 · Updated 5 months ago
- Utility functions for dbt projects running on Spark ☆31 · Updated last month
- A repository of sample code to show data quality checking best practices using Airflow. ☆75 · Updated 2 years ago
- New generation open-source data stack ☆65 · Updated 2 years ago
- Template for Data Engineering and Data Pipeline projects ☆109 · Updated 2 years ago
- Code snippets for Data Engineering Design Patterns book ☆75 · Updated 2 weeks ago
- Example repo to create end-to-end tests for a data pipeline. ☆23 · Updated 9 months ago
- ☆33 · Updated 3 weeks ago
- Duke MIDS: Data Engineering and DataOps Course ☆65 · Updated 2 months ago
- A portable Datamart and Business Intelligence suite built with Docker, sqlmesh + dbt-core, DuckDB and Superset ☆49 · Updated 4 months ago
- How to unit test your PySpark code ☆28 · Updated 4 years ago
- Cloned by the `dbt init` task ☆61 · Updated 11 months ago
- ☆49 · Updated 3 years ago
- All the Snowflake Virtual Warehouse - Example ☆12 · Updated 4 years ago
- ☆84 · Updated 2 years ago
- A curated list of dagster code snippets for data engineers ☆54 · Updated last year
- Creating a modern data pipeline using a combination of Terraform, AWS Lambda and S3, Snowflake, DBT, Mage AI, and Dash. ☆14 · Updated last year
- ☆20 · Updated 3 years ago
- ☆15 · Updated 11 months ago
- A Python package to help Databricks Unity Catalog users read and query Delta Lake tables with Polars, DuckDB, or PyArrow. ☆23 · Updated last year
- Data Quality and Observability platform for the whole data lifecycle, from profiling new data sources to full automation with Data Observ… ☆134 · Updated 2 months ago
- Delta Lake Documentation ☆49 · Updated 9 months ago
- Execution of DBT models using Apache Airflow through Docker Compose ☆116 · Updated 2 years ago