rogeriomm / labtools-k8s
Complete data engineering pipeline running on Minikube Kubernetes, Argo CD, Spark, Trino, S3, Delta Lake, Postgres + Debezium CDC, MySQL, Airflow, Kafka Strimzi, DataHub, OpenMetadata, Zeppelin, Jupyter, JFrog Container Registry
☆24 · Updated 7 months ago
Related projects
Alternatives and complementary repositories for labtools-k8s
- Spark development environment for Kubernetes, spark-submit and Jupyter notebook ☆19 · Updated 2 years ago
- Data Engineering with Apache Spark ☆43 · Updated 3 years ago
- ☆15 · Updated 7 months ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆72 · Updated last year
- ☆22 · Updated 2 years ago
- ☆21 · Updated last year
- Library to convert DBT manifest metadata to Airflow tasks ☆46 · Updated 8 months ago
- ☆23 · Updated 11 months ago
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆46 · Updated last month
- The Lakehouse Engine is a configuration driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆224 · Updated 3 weeks ago
- Soda Spark is a PySpark library that helps you with testing your data in Spark Dataframes ☆63 · Updated 2 years ago
- This is an ETL application on AWS with general open sales and customer data that you can find here: https://github.com/camposvinicius/dat… ☆17 · Updated 2 years ago
- ☆58 · Updated 8 months ago
- Delta-Lake, ETL, Spark, Airflow ☆44 · Updated 2 years ago
- Simple repo to demonstrate how to submit a spark job to EMR from Airflow ☆32 · Updated 4 years ago
- ☆8 · Updated last month
- Delta Lake examples ☆207 · Updated last month
- This repo provides the Kubernetes Helm chart for deploying Pyspark Notebook. ☆17 · Updated 2 years ago
- ☆104 · Updated 3 months ago
- A repository of sample code to accompany our blog post on Airflow and dbt. ☆167 · Updated last year
- To provide a deeper understanding of how the modern, open-source data stack consisting of Iceberg, dbt, Trino, and Hive operates within a… ☆26 · Updated 8 months ago
- Playground for Lakehouse (Iceberg, Hudi, Spark, Flink, Trino, DBT, Airflow, Kafka, Debezium CDC) ☆44 · Updated last year
- Execution of DBT models using Apache Airflow through Docker Compose ☆113 · Updated last year
- This is a pipeline of an ETL application in GCP with open airport code data, which you can find here: https://datahub.io/core/airport-cod… ☆14 · Updated 3 years ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆112 · Updated 4 months ago
- A collection of Airflow operators, hooks, and utilities to elevate dbt to a first-class citizen of Airflow. ☆181 · Updated 4 months ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆56 · Updated last year
- ☆37 · Updated 3 months ago