amanparmar17 / Kafka_Pyspark
Base Kafka producer, consumer, Flask API, and PySpark Structured Streaming job
☆11 · Updated 3 years ago
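The repository description above is brief, so here is a minimal sketch of the kind of PySpark Structured Streaming job it refers to: read a stream from a Kafka topic, parse the JSON payload, and write the result out. The broker address, topic name, schema, and package version are illustrative assumptions, not details taken from the repository.

```python
# Minimal sketch (assumptions: broker at localhost:9092, topic "events",
# JSON messages with "id" and "value" fields; none of this is from the repo).
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = (
    SparkSession.builder
    .appName("kafka_pyspark_sketch")
    # The Kafka source needs the spark-sql-kafka connector on the classpath.
    .config("spark.jars.packages",
            "org.apache.spark:spark-sql-kafka-0-10_2.12:3.4.1")
    .getOrCreate()
)

# Assumed message schema.
schema = StructType([
    StructField("id", StringType()),
    StructField("value", DoubleType()),
])

# Read the Kafka topic as a streaming DataFrame and parse the JSON payload.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("data"))
    .select("data.*")
)

# Write the parsed stream to the console for inspection.
query = (
    events.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```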
Alternatives and similar repositories for Kafka_Pyspark
Users interested in Kafka_Pyspark are comparing it to the repositories listed below
- Testing Spark Structured Streaming and Kafka with real data from traffic sensors ☆16 · Updated 2 years ago
- ☆40 · Updated 2 years ago
- Produce Kafka messages, consume them, and upload them into Cassandra and MongoDB ☆42 · Updated 2 years ago
- Create streaming data, send it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆63 · Updated 2 years ago
- Mastering Big Data Analytics with PySpark, Published by Packt ☆161 · Updated last year
- ☆44 · Updated last year
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka, consume it with Spark, then write it to Cassandra ☆143 · Updated 2 years ago
- Apache Spark using SQL ☆14 · Updated 4 years ago
- Write the CSV file to Postgres, read the table, and modify it; write more tables to Postgres with Airflow. ☆37 · Updated 2 years ago
- End-to-end data engineering project ☆57 · Updated 2 years ago
- Apache Spark 3 - Structured Streaming Course Material ☆123 · Updated 2 years ago
- An end-to-end data engineering pipeline that orchestrates data ingestion, processing, and storage using Apache Airflow, Python, Apache Ka… ☆274 · Updated 7 months ago
- PySpark Tutorial for Beginners - Practical Examples in Jupyter Notebook with Spark version 3.4.1. The tutorial covers various topics like… ☆133 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 2 years ago
- Data Engineering on GCP ☆38 · Updated 2 years ago
- This repo gives an introduction to setting up streaming analytics using open source technologies ☆25 · Updated 2 years ago
- This project helps me to understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆92 · Updated 6 years ago
- Pyspark Spotify ETL ☆17 · Updated 4 years ago
- This project demonstrates how to use Apache Airflow to submit jobs to an Apache Spark cluster in different programming languages using Python… ☆44 · Updated last year
- A boilerplate with dependencies for PySpark (3.3.0) and MongoDB (>4.x) connectivity ☆10 · Updated last year
- A procedure to create a YARN cluster based on Docker, run Spark, and do a TPC-DS performance test. ☆16 · Updated last year
- Docker with Airflow and Spark standalone cluster ☆261 · Updated 2 years ago
- A data engineering project that implements an ETL data pipeline using Dagster, Apache Spark, Streamlit, MinIO, Metabase, dbt, Polars, Doc… ☆23 · Updated 10 months ago
- Code for dbt tutorial ☆162 · Updated 2 weeks ago
- Simple ETL pipeline using Python ☆27 · Updated 2 years ago
- Nyc_Taxi_Data_Pipeline - DE Project ☆122 · Updated 11 months ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres, and Docker. ☆101 · Updated 6 months ago
- Source code of the Apache Airflow Tutorial for Beginners on the YouTube channel Coder2j (https://www.youtube.com/c/coder2j) ☆318 · Updated last year
- Uses Airflow, Postgres, Kafka, Spark, Cassandra, and GitHub Actions to establish an end-to-end data pipeline ☆29 · Updated last year
- Generate a synthetic Spotify music stream dataset to create dashboards. The Spotify API generates fake event data emitted to Kafka. Spark consu… ☆69 · Updated last year