dogukannulu / kafka_spark_structured_streaming
Get data from an API, run a scheduled script with Airflow, send the data to Kafka and consume it with Spark, then write it to Cassandra
☆137 · Updated last year
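Below is a minimal, hypothetical sketch of the Kafka-to-Cassandra leg of such a pipeline using PySpark Structured Streaming. The topic `users_created`, keyspace `spark_streams`, table `created_users`, broker address, and message schema are illustrative assumptions rather than the repository's actual values; it also assumes the Kafka source and DataStax Cassandra connector packages are available on the Spark classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType

# Hypothetical message schema; the real repository's fields may differ.
schema = StructType([
    StructField("id", StringType()),
    StructField("first_name", StringType()),
    StructField("last_name", StringType()),
])

# Assumes the Kafka source and Cassandra connector packages are supplied,
# e.g. via spark-submit --packages (spark-sql-kafka-0-10 and
# spark-cassandra-connector).
spark = (
    SparkSession.builder
    .appName("kafka_to_cassandra_sketch")
    .config("spark.cassandra.connection.host", "localhost")  # assumed host
    .getOrCreate()
)

# Read the raw Kafka stream and parse the JSON payload into columns.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "users_created")                 # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

parsed = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), schema).alias("data"))
    .select("data.*")
)

# Write the parsed rows to Cassandra as a streaming sink.
query = (
    parsed.writeStream
    .format("org.apache.spark.sql.cassandra")
    .option("checkpointLocation", "/tmp/spark_checkpoint")
    .option("keyspace", "spark_streams")   # hypothetical keyspace
    .option("table", "created_users")      # hypothetical table
    .start()
)

query.awaitTermination()
```

In the pipeline described above, Airflow schedules the producer side (the script that fetches from the API and publishes to Kafka), while a consumer like this sketch runs continuously rather than on a schedule.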
Alternatives and similar repositories for kafka_spark_structured_streaming:
Users interested in kafka_spark_structured_streaming are comparing it to the repositories listed below
- ☆136 · Updated 2 years ago
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆244 · Updated 2 months ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres, and Docker. ☆91 · Updated last month
- ☆40 · Updated 9 months ago
- This repository will contain all of the resources for the Mage component of the Data Engineering Zoomcamp: https://github.com/DataTalksCl… ☆98 · Updated 8 months ago
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆60 · Updated last year
- Near-real-time ETL to populate a dashboard. ☆73 · Updated 10 months ago
- Stream processing pipeline from Finnhub websocket using Spark, Kafka, Kubernetes and more ☆345 · Updated last year
- A template repository to create a data project with IaC, CI/CD, data migrations, & testing ☆260 · Updated 9 months ago
- ☆151 · Updated 2 years ago
- YouTube tutorial project ☆101 · Updated last year
- Writes a CSV file to Postgres, reads the table and modifies it, then writes more tables to Postgres with Airflow. ☆35 · Updated last year
- This repository contains the code for a real-time election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆36 · Updated last year
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for batch processing; K… ☆65 · Updated 2 months ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆143 · Updated 4 years ago
- Code for the "Efficient Data Processing in Spark" course ☆293 · Updated 6 months ago
- Sample project to demonstrate data engineering best practices ☆186 · Updated last year
- This project provides a comprehensive data pipeline solution to extract, transform, and load (ETL) Reddit data into a Redshift data wareh… ☆129 · Updated last year
- Docker with Airflow and Spark standalone cluster ☆255 · Updated last year
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB. ☆41 · Updated last year
- Price Crawler - Tracking Price Inflation ☆185 · Updated 4 years ago
- Pipeline that extracts data from Crinacle's Headphone and InEarMonitor databases and finalizes data for a Metabase Dashboard. The dashboa… ☆230 · Updated 2 years ago
- Local Environment to Practice Data Engineering ☆142 · Updated 3 months ago
- Simple ETL pipeline using Python ☆26 · Updated last year
- Projects done in the Data Engineer Nanodegree Program by Udacity.com ☆160 · Updated 2 years ago
- Data Engineering YouTube Analysis Project by Darshil Parmar ☆190 · Updated last year
- ☆87 · Updated 2 years ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within codespaces. ☆67 · Updated last year
- Uses Airflow, Postgres, Kafka, Spark, Cassandra, and GitHub Actions to establish an end-to-end data pipeline ☆27 · Updated last year
- Apartments Data Pipeline using Airflow and Spark. ☆20 · Updated 3 years ago