dnguyenngoc / real-time-analytic
This repo gives an introduction to setting up streaming analytics using open-source technologies.
☆25 · Updated 2 years ago
Alternatives and similar repositories for real-time-analytic
Users interested in real-time-analytic are comparing it to the repositories listed below.
- Nyc_Taxi_Data_Pipeline - DE project ☆133 · Updated last year
- This project implements an ELT (Extract - Load - Transform) data pipeline with the Goodreads dataset, using Dagster (orchestration), spar… ☆42 · Updated 2 years ago
- Build a data warehouse with dbt ☆50 · Updated last year
- My development environment setup as a Data Engineer ☆34 · Updated 5 months ago
- Simple stream-processing pipeline ☆110 · Updated last year
- Transaction processing & visualization pipeline using PySpark Streaming ☆30 · Updated last year
- ☆45 · Updated last year
- ELT data pipeline implementation in a data-warehousing environment ☆30 · Updated 8 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino and a Hive Metastore. Can be used for local testin… ☆75 · Updated 2 years ago
- End-to-end data platform leveraging the modern data stack ☆52 · Updated last year
- Project for real-time anomaly detection using Kafka and Python ☆59 · Updated 3 years ago
- Create streaming data, send it to Kafka, transform it with PySpark, and write it to Elasticsearch and MinIO ☆65 · Updated 2 years ago
- End-to-end data engineering project with Kafka, Airflow, Spark, Postgres and Docker ☆107 · Updated 3 weeks ago
- Code for a dbt tutorial ☆166 · Updated 4 months ago
- In this project, we set up an end-to-end data engineering pipeline using Apache Spark, Azure Databricks, Data Build Tool (DBT), using Azure as our … ☆38 · Updated 2 years ago
- Building a data pipeline with an open-source stack ☆55 · Updated 7 months ago
- Building a Data Lakehouse with open-source technology. Supports an end-to-end data pipeline, from source data on AWS S3 to the Lakehouse, visualize a… ☆35 · Updated last month
- This repository contains an Apache Flink application for real-time sales analytics, built using Docker Compose to orchestrate the necessar… ☆47 · Updated 2 years ago
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆44 · Updated last year
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka, consume it with Spark, then write to Cassandra ☆144 · Updated 2 years ago
- ☆58 · Updated last year
- ☆41 · Updated 3 years ago
- Delta Lake, ETL, Spark, Airflow ☆48 · Updated 3 years ago
- This project serves as a comprehensive guide to building an end-to-end data engineering pipeline using TCP/IP sockets, Apache Spark, OpenA… ☆44 · Updated 2 years ago
- A list of all my posts and personal projects ☆73 · Updated 3 months ago
- Near-real-time ETL to populate a dashboard ☆73 · Updated 4 months ago
- ☆46 · Updated 2 years ago
- A Docker Compose template that builds an interactive development environment for PySpark with Jupyter Lab, MinIO as object storage, Hive M… ☆47 · Updated last year
- End-to-end data platform: a PoC data platform project utilizing the modern data stack (Spark, Airflow, DBT, Trino, Lightdash, Hive metastore,… ☆47 · Updated last year
- ☆56 · Updated last year