dogukannulu / streaming_data_processing
Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO
☆63 · Updated 2 years ago
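The headline pipeline (data in from Kafka, a PySpark transformation, results out to Elasticsearch and MinIO) centers on a per-record modification step. The repo itself performs this step with PySpark; as a minimal, dependency-free sketch under stated assumptions, the same kind of per-record change can be shown on a single JSON message. The `enrich_record` helper and its field names are hypothetical, not taken from the repo:

```python
import json
from datetime import datetime, timezone

def enrich_record(raw: str) -> dict:
    """Parse one JSON message (as it would arrive from Kafka) and add derived fields.

    In the actual repo this kind of logic would run inside a PySpark job;
    here it is sketched on a plain dict for illustration only.
    """
    record = json.loads(raw)
    # Stamp when this record was processed (a common enrichment before indexing).
    record["processed_at"] = datetime.now(timezone.utc).isoformat()
    # Normalize a hypothetical 'name' field before writing downstream.
    if "name" in record:
        record["name"] = record["name"].strip().title()
    return record

message = '{"user_id": 42, "name": "  ada lovelace "}'
print(enrich_record(message)["name"])  # → "Ada Lovelace"
```

In the real pipeline the same transformation would be expressed with DataFrame operations on a Kafka `readStream`, but the record-level shape of the change is the same.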
Alternatives and similar repositories for streaming_data_processing
Users interested in streaming_data_processing are comparing it to the repositories listed below.
- Get data from an API, run a scheduled script with Airflow, send data to Kafka, consume it with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- This repository contains the code for a real-time election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆42 · Updated last year
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆56 · Updated 2 years ago
- ☆44 · Updated last year
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆285 · Updated 8 months ago
- Code for the dbt tutorial ☆162 · Updated last month
- A simple stream processing pipeline ☆110 · Updated last year
- ☆70 · Updated this week
- Docker with Airflow and a Spark standalone cluster ☆261 · Updated 2 years ago
- Nyc_Taxi_Data_Pipeline - DE project ☆128 · Updated last year
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB ☆42 · Updated 2 years ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres, and Docker ☆103 · Updated 7 months ago
- A local environment to practice data engineering ☆141 · Updated 10 months ago
- A data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆157 · Updated 5 years ago
- This project demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… ☆47 · Updated last year
- ☆88 · Updated 3 years ago
- A self-contained, ready-to-run Airflow ELT project; can be run locally or within Codespaces ☆78 · Updated 2 years ago
- PySpark functions and utilities with examples; assists the ETL process of data modeling ☆104 · Updated 4 years ago
- This project serves as a comprehensive guide to building an end-to-end data engineering pipeline using TCP/IP sockets, Apache Spark, OpenA… ☆44 · Updated last year
- ☆29 · Updated last year
- Writes a CSV file to Postgres, reads the table, and modifies it; writes more tables to Postgres with Airflow ☆37 · Updated 2 years ago
- End-to-end data engineering project ☆57 · Updated 3 years ago
- Python data repo: Jupyter notebooks, Python scripts, and data ☆535 · Updated 10 months ago
- Code snippets for the Data Engineering Design Patterns book ☆249 · Updated 7 months ago
- A batch processing data pipeline using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform and orchestrated from loc… ☆23 · Updated 3 years ago
- In this project, we set up an end-to-end data engineering pipeline using Apache Spark, Azure Databricks, and Data Build Tool (DBT), using Azure as our … ☆35 · Updated last year
- A sample project to demonstrate data engineering best practices ☆197 · Updated last year
- This project shows how to capture changes from a Postgres database and stream them into Kafka ☆38 · Updated last year
- Build a data warehouse with dbt ☆47 · Updated last year
- PySpark Cheat Sheet - example code to help you learn PySpark and develop apps faster ☆479 · Updated last year
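Several of the repositories above share the same produce → consume → store pattern (for example, Kafka messages landed in Cassandra or MongoDB). A stdlib-only sketch of that shape, with `queue.Queue` standing in for the Kafka broker and a dict standing in for the database table — all names here are illustrative, not from any of the listed projects:

```python
import json
import queue

# Stand-in broker: queue.Queue plays the role Kafka plays in the repos above.
broker: "queue.Queue[str]" = queue.Queue()

def produce(broker: queue.Queue, events: list) -> None:
    """Serialize each event to JSON and publish it to the broker."""
    for event in events:
        broker.put(json.dumps(event))

def consume_to_sink(broker: queue.Queue, sink: dict) -> None:
    """Drain the broker, deserializing each event and upserting it by 'id'.

    The dict keyed by 'id' stands in for a keyed table (Cassandra, MongoDB, ...).
    """
    while not broker.empty():
        event = json.loads(broker.get())
        sink[event["id"]] = event  # last write wins, like an upsert on a primary key

sink: dict = {}
produce(broker, [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 1, "v": "c"}])
consume_to_sink(broker, sink)
print(sink[1]["v"])  # → "c"
```

A real pipeline replaces the queue with a Kafka topic and the dict with a database write, but the keyed-upsert semantics shown here are the same decision those projects make when choosing a primary key for the sink table.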