zzstoatzz / oreilly-workflow-orchestration
☆26 · Updated 3 years ago
Alternatives and similar repositories for oreilly-workflow-orchestration:
Users interested in oreilly-workflow-orchestration are comparing it to the libraries listed below.
- A very simple "hello world" project for deploying Prefect 2 to a Docker container on Google Compute Engine. ☆11 · Updated 2 years ago
- Code examples showing flow deployment to various types of infrastructure. ☆106 · Updated 2 years ago
- Collection of code snippets for blogs, conferences, and talks. ☆23 · Updated 2 years ago
- Prefect 2 flows. ☆11 · Updated 5 months ago
- A simple, easy-to-use data quality (DQ) tool built with Python. ☆50 · Updated last year
- Examples of various flow deployments for Prefect 1.0 (storage and run configurations). ☆35 · Updated 2 years ago
- ☆27 · Updated 2 years ago
- The easiest way to integrate Kedro and Great Expectations. ☆53 · Updated 2 years ago
- Collection of Prefect integrations for working with dbt in your Prefect flows. ☆83 · Updated last year
- Data-aware orchestration with Dagster, dbt, and Airbyte. ☆31 · Updated 2 years ago
- Dask integration for Snowflake. ☆30 · Updated 5 months ago
- Deploy a Prefect flow to a serverless AWS Lambda function. ☆35 · Updated 2 years ago
- Tutorials for Fugue, a unified interface for distributed computing. Fugue executes SQL, Python, and Pandas code on Spark and Dask withou… ☆113 · Updated last year
- ☆29 · Updated last year
- Pocket data flows orchestrated using Prefect. ☆45 · Updated last month
- A decorator that sends an alert when a Prefect flow fails. ☆14 · Updated 2 years ago
- Example project for building scalable data pipelines with Kedro and Ibis. ☆13 · Updated last year
- ☆16 · Updated last year
- Demo of how to use Prefect 2 in an ML project. ☆41 · Updated 2 years ago
- CSV and flat-file sniffer built in Rust. ☆42 · Updated last year
- ☆21 · Updated 3 years ago
- Pipeline definitions for managing data flows to power analytics at MIT Open Learning. ☆43 · Updated this week
- Full-stack data engineering tools and infrastructure setup. ☆52 · Updated 4 years ago
- A template repository with all the fundamentals needed to develop and deploy a Python data-processing routine for Prefect pipelines. ☆20 · Updated 3 years ago
- A "modern" Strava data pipeline fueled by dlt, duckdb, dbt, and evidence.dev. ☆32 · Updated 3 months ago
- A GitHub Action that makes it easy to use Great Expectations to validate your data pipelines in your CI workflows. ☆80 · Updated 11 months ago
- A quick and easy way to start learning about Prefect. ☆10 · Updated 2 years ago
- Prefect integrations for working with Docker. ☆43 · Updated last year
- A portable data mart and business intelligence suite built with Docker, sqlmesh + dbt-core, DuckDB, and Superset. ☆50 · Updated 5 months ago
- Fugue collections for Prefect 2.0. ☆38 · Updated last year