Aiven-Labs / python-fake-data-producer-for-apache-kafka
The Python fake data producer for Apache Kafka® is a complete demo app that lets you quickly produce fake JSON streaming datasets and push them to an Apache Kafka topic.
☆85 · Updated last year
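Below is a minimal sketch of the pattern such a producer follows: generate fake records with Faker, serialize them to JSON, and send them to a topic. It assumes the `kafka-python` and `Faker` packages; the broker address, topic name, and record fields are illustrative placeholders, not the repo's actual defaults.

```python
import json
import time

from faker import Faker          # fake data generation
from kafka import KafkaProducer  # kafka-python client

fake = Faker()

# Placeholder broker address; JSON-serialize each message value.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Emit a small stream of fake user records to a placeholder topic.
for _ in range(10):
    record = {
        "name": fake.name(),
        "address": fake.address(),
        "ts": int(time.time() * 1000),  # event timestamp in ms
    }
    producer.send("fake-data-topic", value=record)
    time.sleep(1)

producer.flush()  # ensure all messages are delivered before exiting
```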
Alternatives and similar repositories for python-fake-data-producer-for-apache-kafka
Users interested in python-fake-data-producer-for-apache-kafka are comparing it to the repositories listed below.
- ☆23 · Updated 4 years ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- A Series of Notebooks on how to start with Kafka and Python ☆152 · Updated 8 months ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆223 · Updated 6 months ago
- ☆48 · Updated 3 years ago
- New-generation open-source data stack ☆75 · Updated 3 years ago
- Full-stack data engineering tools and infrastructure set-up ☆57 · Updated 4 years ago
- Delta Lake Documentation ☆50 · Updated last year
- Enforce Best Practices for all your Airflow DAGs. ⭐ ☆105 · Updated last week
- Cost Efficient Data Pipelines with DuckDB ☆58 · Updated 6 months ago
- A simple and easy-to-use Data Quality (DQ) tool built with Python. ☆50 · Updated 2 years ago
- Auto-generated Diagrams from Airflow DAGs. 🔮 🪄 ☆352 · Updated last week
- Airflow Unit Tests and Integration Tests ☆261 · Updated 3 years ago
- Data validation library for PySpark 3.0.0 ☆33 · Updated 3 years ago
- Public source code for the Batch Processing with Apache Beam (Python) online course ☆18 · Updated 5 years ago
- Cloned by the `dbt init` task ☆62 · Updated last year
- Docker Airflow - contains a Docker Compose file for Airflow 2.0 ☆69 · Updated 3 years ago
- Pylint plugin for static code analysis on Airflow code ☆96 · Updated 5 years ago
- Utility functions for dbt projects running on Spark ☆33 · Updated 2 weeks ago
- Great Expectations Airflow operator ☆169 · Updated last week
- Soda Spark is a PySpark library that helps you test your data in Spark DataFrames ☆64 · Updated 3 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆169 · Updated 2 years ago
- Fast iterative local development and testing of Apache Airflow workflows ☆202 · Updated 3 months ago
- A complete development environment setup for working with Airflow ☆128 · Updated 2 years ago
- Complete data engineering pipeline running on Minikube Kubernetes, Argo CD, Spark, Trino, S3, Delta Lake, Postgres + Debezium CDC, MySQL,… ☆28 · Updated 5 months ago
- ☆31 · Updated 2 years ago
- Delta Lake examples ☆231 · Updated last year
- Swiple enables you to easily observe, understand, validate, and improve the quality of your data ☆84 · Updated this week
- Event-triggered plugins for Airflow ☆21 · Updated 5 years ago
- A modern ELT demo using Airbyte, dbt, Snowflake, and Dagster ☆28 · Updated 2 years ago