amanparmar17 / Kafka_Pyspark
Base Kafka producer, consumer, Flask API, and PySpark Structured Streaming job
☆11 · Updated 4 years ago
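The repository above pairs a Kafka producer with a PySpark Structured Streaming consumer. As a minimal, dependency-free sketch of the producer side (the topic name, broker address, and `serialize_event` helper are illustrative, not taken from the repo), events are typically encoded as UTF-8 JSON bytes before being sent:

```python
import json
from datetime import datetime, timezone

def serialize_event(sensor_id, value):
    """Encode one reading as UTF-8 JSON bytes, the form a Kafka producer sends."""
    payload = {
        "sensor_id": sensor_id,
        "value": value,
        "ts": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload).encode("utf-8")

if __name__ == "__main__":
    # Hypothetical wiring with kafka-python (requires a running broker),
    # shown as comments so the sketch stays runnable on its own:
    # from kafka import KafkaProducer
    # producer = KafkaProducer(bootstrap_servers="localhost:9092")
    # producer.send("sensor-events", serialize_event("s-01", 42.5))
    print(serialize_event("s-01", 42.5))
```

Keeping serialization in a small pure function like this makes it easy to unit-test without a broker.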
Alternatives and similar repositories for Kafka_Pyspark
Users interested in Kafka_Pyspark are comparing it to the repositories listed below
- Testing Spark Structured Streaming and Kafka with real data from traffic sensors ☆16 · Updated 2 years ago
- ☆47 · Updated 2 years ago
- Create a data stream, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆64 · Updated 2 years ago
- ☆44 · Updated last year
- Project for real-time anomaly detection using Kafka and Python ☆58 · Updated 2 years ago
- An Apache Flink application for real-time sales analytics, built using Docker Compose to orchestrate the necessar… ☆45 · Updated last year
- Apache Spark using SQL ☆14 · Updated 4 years ago
- Demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… ☆46 · Updated last year
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆287 · Updated 8 months ago
- Mastering Big Data Analytics with PySpark, published by Packt ☆163 · Updated last year
- Apache Spark Structured Streaming with Kafka using Python (PySpark) ☆40 · Updated 6 years ago
- Get data from an API, run a scheduled script with Airflow, send data to Kafka, consume it with Spark, then write to Cassandra ☆143 · Updated 2 years ago
- A project for learning the core concepts of Apache Airflow, with custom operators for tasks such as staging… ☆93 · Updated 6 years ago
- Docker with Airflow + Postgres + Spark cluster + JDK (spark-submit support) + Jupyter Notebooks ☆24 · Updated 3 years ago
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB ☆42 · Updated 2 years ago
- An introduction to setting up streaming analytics using open-source technologies ☆25 · Updated 2 years ago
- Writes a CSV file to Postgres, reads the table back, and modifies it; writes more tables to Postgres with Airflow ☆37 · Updated 2 years ago
- Docker with Airflow and a Spark standalone cluster ☆261 · Updated 2 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆157 · Updated 5 years ago
- Project for the "Data pipeline design patterns" blog ☆46 · Updated last year
- A series on learning Apache Spark (PySpark), with quick tips and workarounds for everyday problems ☆56 · Updated 2 years ago
- End-to-end data engineering project ☆57 · Updated 3 years ago
- Uses Airflow, Postgres, Kafka, Spark, Cassandra, and GitHub Actions to establish an end-to-end data pipeline ☆29 · Updated 2 years ago
- ☆40 · Updated 2 years ago
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment ☆38 · Updated 4 years ago
- Practical use cases for making machine learning pipelines robust and reliable using Apache Airflow ☆52 · Updated 2 years ago
- Simple stream processing pipeline ☆110 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- Uses a Confluent Kafka cluster to produce and consume scraped data; a real-time data pipeline that uti… ☆30 · Updated 2 years ago
- Nyc_Taxi_Data_Pipeline - DE Project ☆129 · Updated last year
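Many of the repositories above follow the same streaming pattern: bucket incoming events into time windows and aggregate per key. As a hedged, dependency-free sketch (the function name and sample events are illustrative), the tumbling-window count that a PySpark Structured Streaming job would express with `groupBy(window(...), "key").count()` looks like this in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_sec):
    """Count events per (window_start, key) bucket.

    events: iterable of (timestamp_sec, key) pairs.
    Returns {(window_start, key): count}, mirroring the result a
    PySpark windowed groupBy-count would produce for the same input.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each timestamp to the start of its fixed-size window.
        window_start = int(ts // window_sec) * window_sec
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "a"), (3, "a"), (7, "b"), (11, "a")]
print(tumbling_window_counts(events, 5))
# window 0-5 -> "a": 2; window 5-10 -> "b": 1; window 10-15 -> "a": 1
```

In a real Structured Streaming job the engine maintains these buckets incrementally and handles late data via watermarks; the batch version here only shows the windowing arithmetic.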