Pathairush / airflow_hive_spark_sqoop
A Docker setup that runs Airflow alongside the Hadoop ecosystem (Hive, Spark, and Sqoop)
☆11 · Updated 3 years ago
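To give a sense of how the pieces in this stack typically fit together, below is a minimal, hypothetical Airflow DAG sketch (not the repo's actual code): it chains a Sqoop import, a Hive load, and a Spark job with BashOperator, assuming Airflow 2.x and placeholder connection strings, table names, and file paths.

```python
# Hypothetical sketch of a Sqoop -> Hive -> Spark pipeline in Airflow 2.x.
# All commands, paths, and table names are placeholders, not the repo's own.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="sqoop_hive_spark_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest a table from an RDBMS into HDFS with Sqoop.
    sqoop_import = BashOperator(
        task_id="sqoop_import",
        bash_command=(
            "sqoop import --connect jdbc:mysql://db:3306/sales "
            "--username user --password-file /secrets/db.pass "
            "--table orders --target-dir /staging/orders"
        ),
    )

    # Load the staged files into a Hive table.
    hive_load = BashOperator(
        task_id="hive_load",
        bash_command=(
            "hive -e \"LOAD DATA INPATH '/staging/orders' INTO TABLE raw.orders\""
        ),
    )

    # Run a Spark job that reads the Hive table and writes aggregates.
    spark_transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit --master yarn /opt/jobs/aggregate_orders.py",
    )

    sqoop_import >> hive_load >> spark_transform
```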
Related projects
Alternatives and complementary repositories for airflow_hive_spark_sqoop
- Infrastructure automation to deploy Hadoop, Hive, Spark, and Airflow nodes on a Docker host ☆20 · Updated 5 years ago
- Hadoop-Hive-Spark cluster + Jupyter on Docker ☆61 · Updated 5 months ago
- Dockerizing an Apache Spark Standalone Cluster ☆43 · Updated 2 years ago
- Creation of a data lakehouse and an ELT pipeline to enable the efficient analysis and use of data ☆40 · Updated 11 months ago
- Apache Spark 3 - Structured Streaming Course Material ☆119 · Updated last year
- Docker with Airflow and Spark standalone cluster ☆246 · Updated last year
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆133 · Updated 4 years ago
- Data Engineering with Spark and Delta Lake ☆89 · Updated last year
- Apache Spark Course Material ☆85 · Updated last year
- ETL pipeline using PySpark (Spark - Python) ☆108 · Updated 4 years ago
- Code for dbt tutorial ☆143 · Updated 5 months ago
- Base Docker image with just the essentials: Hadoop, Hive and Spark. ☆67 · Updated 3 years ago
- Delta Lake examples ☆208 · Updated last month
- Apache Spark 3 - Structured Streaming Course Material ☆43 · Updated 4 years ago
- Delta-Lake, ETL, Spark, Airflow ☆44 · Updated 2 years ago
- Simplified ETL process in Hadoop using Apache Spark. Has complete ETL pipeline for datalake. SparkSession extensions, DataFrame validatio… ☆53 · Updated last year
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆39 · Updated 3 years ago
- Spark Examples ☆124 · Updated 2 years ago
- Apache Spark Structured Streaming with Kafka using Python (PySpark) ☆41 · Updated 5 years ago
- O'Reilly Book: Data Algorithms with Spark by Mahmoud Parsian ☆209 · Updated last year
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆57 · Updated last year
- With everything I learned from DEZoomcamp from datatalks.club, this project performs batch processing on AWS for the cycling dataset wh… ☆12 · Updated 2 years ago
- Simple repo to demonstrate how to submit a Spark job to EMR from Airflow ☆32 · Updated 4 years ago
- Spark all the ETL Pipelines ☆32 · Updated last year
- PySpark functions and utilities with examples. Assists ETL process of data modeling ☆99 · Updated 3 years ago
- ☆26 · Updated 4 years ago
- ☆13 · Updated last year
- Multi-container environment with Hadoop, Spark and Hive ☆203 · Updated 10 months ago
- ☆23 · Updated 3 years ago