akashmehta10 / cdc_pyspark_hive
☆22 · Updated 2 years ago
Alternatives and similar repositories for cdc_pyspark_hive:
Users interested in cdc_pyspark_hive are comparing it to the libraries listed below.
- ☆25 · Updated last year
- Code snippets for the Data Engineering Design Patterns book · ☆69 · Updated 2 weeks ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… · ☆60 · Updated last year
- velib-v2: An ETL pipeline that employs batch and streaming jobs using Spark, Kafka, Airflow, and other tools, all orchestrated with Docke… · ☆18 · Updated 5 months ago
- How to unit test your PySpark code · ☆28 · Updated 3 years ago
- ☆14 · Updated 5 years ago
- End-to-end data engineering project · ☆53 · Updated 2 years ago
- Code for a dbt tutorial · ☆151 · Updated 8 months ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos · ☆53 · Updated 4 months ago
- PySpark boilerplate for running a production-ready data pipeline · ☆28 · Updated 3 years ago
- Delta Lake, ETL, Spark, Airflow · ☆46 · Updated 2 years ago
- Delta Lake examples · ☆217 · Updated 4 months ago
- Code for my "Efficient Data Processing in SQL" book · ☆56 · Updated 6 months ago
- A custom end-to-end analytics platform for customer churn · ☆10 · Updated 3 weeks ago
- Execution of dbt models using Apache Airflow through Docker Compose · ☆114 · Updated 2 years ago
- Delta Lake helper methods. No Spark dependency. · ☆22 · Updated 5 months ago
- Simplified ETL process in Hadoop using Apache Spark. Has a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… · ☆53 · Updated last year
- This repository will help you learn Databricks concepts with the help of examples. It will include all the important topics which… · ☆95 · Updated 6 months ago
- Quick Guides from Dremio on several topics · ☆67 · Updated last month
- Unit testing using Databricks Connect · ☆30 · Updated 3 years ago
- Data Engineering with dbt, published by Packt · ☆72 · Updated 11 months ago
- A self-contained, ready-to-run Airflow ELT project. Can be run locally or within Codespaces. · ☆63 · Updated last year
- Building a Modern Data Lake with Minio, Spark, and Airflow via Docker · ☆16 · Updated 9 months ago
- ☆74 · Updated 4 months ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt · ☆126 · Updated 7 months ago
- ☆20 · Updated 11 months ago
- Project for the "Data pipeline design patterns" blog post · ☆43 · Updated 6 months ago
- Delta Lake documentation · ☆48 · Updated 8 months ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, dbt, Airflow, Kafka, Debezium CDC) · ☆50 · Updated last year
- A batch-processing data pipeline using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform and orchestrated from loc… · ☆21 · Updated 2 years ago