nydasco / real_time_streaming_pipeline
An example repository showing how to leverage Kafka to stream your data
☆21 · Updated 9 months ago
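For context on what a pipeline like this does, below is a minimal producer sketch, assuming a local broker at `localhost:9092`, the kafka-python client, and an illustrative `events` topic; the repository's own code and configuration may differ.

```python
# Minimal Kafka producer sketch (assumes kafka-python and a broker on
# localhost:9092; the "events" topic name is illustrative, not taken
# from the repository).
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Stream a few sample records into the "events" topic.
for i in range(5):
    producer.send("events", value={"event_id": i, "ts": time.time()})

producer.flush()  # ensure buffered messages reach the broker
producer.close()
```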
Alternatives and similar repositories for real_time_streaming_pipeline:
Users interested in real_time_streaming_pipeline are comparing it to the libraries listed below.
- A Python package to help Databricks Unity Catalog users read and query Delta Lake tables with Polars, DuckDB, or PyArrow. ☆23 · Updated 11 months ago
- A portable Datamart and Business Intelligence suite built with Docker, SQLMesh + dbt Core, DuckDB and Superset ☆48 · Updated 3 months ago
- A simple and easy-to-use Data Quality (DQ) tool built with Python. ☆49 · Updated last year
- An end-to-end data engineering project to get insights from PyPI using Python, DuckDB, MotherDuck & Evidence ☆185 · Updated last week
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 6 months ago
- A demonstration of an ELT (Extract, Load, Transform) pipeline ☆29 · Updated last year
- Code snippets for the Data Engineering Design Patterns book ☆73 · Updated last month
- A portable Datamart and Business Intelligence suite built with Docker, Dagster, dbt, DuckDB and Superset ☆217 · Updated 2 weeks ago
- An ephemeral project repo for the DU Dagster project ☆66 · Updated this week
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆127 · Updated 7 months ago
- In this repository we store all materials for dlt workshops, courses, etc. ☆115 · Updated 2 months ago
- All things awesome related to Dagster! ☆98 · Updated last week
- ☆111 · Updated 7 months ago
- Python wrapper for the Sling CLI tool ☆45 · Updated 2 weeks ago
- ☆74 · Updated 4 months ago
- A Python package that creates fine-grained dbt tasks on Apache Airflow ☆64 · Updated 5 months ago
- Project for the "Data pipeline design patterns" blog post ☆44 · Updated 6 months ago
- Contribute to dlt verified sources 🔥 ☆81 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆56 · Updated 4 months ago
- A write-audit-publish implementation on a data lake without the JVM ☆46 · Updated 6 months ago
- Local Environment to Practice Data Engineering ☆142 · Updated 2 months ago
- Build a data warehouse with dbt ☆37 · Updated 4 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino and a Hive Metastore. Can be used for local testin… ☆63 · Updated last year
- Open Data Stack Projects: Examples of End-to-End Data Engineering Projects ☆75 · Updated last year
- Cost-Efficient Data Pipelines with DuckDB ☆49 · Updated 7 months ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆31 · Updated 11 months ago
- ☆17 · Updated 6 months ago
- Code for a dbt tutorial ☆151 · Updated 9 months ago
- ☆32 · Updated 2 months ago
- Data-aware orchestration with Dagster, dbt, and Airbyte ☆32 · Updated 2 years ago