airscholar / modern-data-eng-dbt-databricks-azure
In this project, we set up an end-to-end data engineering pipeline using Apache Spark, Azure Databricks, and dbt (Data Build Tool), with Azure as the cloud provider.
☆27 · Updated last year
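Since the project connects dbt to Azure Databricks, the usual wiring happens in a dbt `profiles.yml`. Below is a minimal sketch of what such a profile can look like with the `dbt-databricks` adapter; the project name, host, `http_path`, catalog, and schema values are placeholders, not taken from this repository.

```yaml
# profiles.yml — hypothetical sketch; all identifiers below are placeholders
modern_data_eng:            # assumed dbt project/profile name
  target: dev
  outputs:
    dev:
      type: databricks      # provided by the dbt-databricks adapter
      catalog: main         # Unity Catalog name (placeholder)
      schema: analytics     # target schema (placeholder)
      host: adb-1234567890123456.7.azuredatabricks.net   # workspace URL (placeholder)
      http_path: /sql/1.0/warehouses/abc123              # SQL warehouse path (placeholder)
      token: "{{ env_var('DATABRICKS_TOKEN') }}"         # keep credentials out of the file
      threads: 4
```

With a profile like this in place, `dbt run` executes the project's models against the Databricks SQL warehouse, which is the general pattern a dbt-plus-Databricks setup follows.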
Alternatives and similar repositories for modern-data-eng-dbt-databricks-azure:
Users interested in modern-data-eng-dbt-databricks-azure are comparing it to the repositories listed below.
- This repository contains the code for a realtime election voting system. The system is built using Python, Kafka, Spark Streaming, Postgr… ☆35 · Updated last year
- End to end data engineering project ☆54 · Updated 2 years ago
- ☆40 · Updated 9 months ago
- This repo contains "Databricks Certified Data Engineer Professional" Questions and related docs. ☆65 · Updated 8 months ago
- Code for blog at https://www.startdataengineering.com/post/python-for-de/ ☆73 · Updated 10 months ago
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for Batch processing; K… ☆64 · Updated last month
- Simple ETL pipeline using Python ☆26 · Updated last year
- This repository will help you to learn about databricks concept with the help of examples. It will include all the important topics which… ☆97 · Updated 8 months ago
- End to end data engineering project with kafka, airflow, spark, postgres and docker. ☆90 · Updated 2 weeks ago
- Ultimate guide for mastering Spark Performance Tuning and Optimization concepts and for preparing for Data Engineering interviews ☆118 · Updated 10 months ago
- ☆28 · Updated last year
- Git Repository ☆139 · Updated 2 months ago
- Series follows learning from Apache Spark (PySpark) with quick tips and workaround for daily problems in hand ☆48 · Updated last year
- This repository contains the necessary configuration files and DAGs (Directed Acyclic Graphs) for setting up a robust data engineering en… ☆18 · Updated last year
- Code for "Advanced data transformations in SQL" free live workshop ☆75 · Updated 5 months ago
- Stream processing with Azure Databricks ☆139 · Updated 4 months ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆142 · Updated 4 years ago
- Produce Kafka messages, consume them and upload into Cassandra, MongoDB. ☆41 · Updated last year
- ☆30 · Updated 3 months ago
- ☆38 · Updated 2 years ago
- Sample project to demonstrate data engineering best practices ☆184 · Updated last year
- ☆33 · Updated last year
- ☆87 · Updated 2 years ago
- YouTube tutorial project ☆102 · Updated last year
- build dw with dbt ☆43 · Updated 5 months ago
- Step by step instructions to create a production-ready data pipeline ☆44 · Updated 3 months ago
- This project leverages GCS, Composer, Dataflow, BigQuery, and Looker on Google Cloud Platform (GCP) to build a robust data engineering so… ☆22 · Updated last year
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 8 months ago
- A custom end-to-end analytics platform for customer churn ☆11 · Updated 2 months ago
- Writes the CSV file to Postgres, read table and modify it. Write more tables to Postgres with Airflow. ☆35 · Updated last year