chayansraj / Data-Pipeline-with-dbt-using-Airflow-on-GCP
This project demonstrates how to build and automate an ETL pipeline using Airflow DAGs and load the transformed data into BigQuery. The project uses several tools, including Astro, dbt, GCP, Airflow, and Metabase.
☆23 · Updated this week
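The pipeline described above chains three stages: extract raw data, transform it (the role dbt plays via SQL models), and load the result into BigQuery. Below is a minimal illustrative sketch of that flow in plain Python, not code from the repository; the function names and the in-memory stand-ins for the source system and BigQuery are assumptions.

```python
# Hypothetical sketch of the three stages the Airflow DAG orchestrates.
# All names and data shapes are illustrative assumptions, not repo code.

def extract(source_rows):
    """Pull raw records, dropping empties (stand-in for a source-system read)."""
    return [row for row in source_rows if row]

def transform(rows):
    """dbt-style transformation: aggregate one summed value per key."""
    table = {}
    for row in rows:
        table[row["key"]] = table.get(row["key"], 0) + row["value"]
    return table

def load(table, warehouse):
    """Write the transformed table to the warehouse (stand-in for a BigQuery load job)."""
    warehouse.update(table)
    return len(table)

# Example run of the chained stages, as an Airflow DAG would sequence them:
raw = [{"key": "a", "value": 1}, None, {"key": "a", "value": 2}, {"key": "b", "value": 5}]
warehouse = {}
rows_loaded = load(transform(extract(raw)), warehouse)
```

In the actual project each stage would be an Airflow task (with dbt handling the transform in SQL), but the dependency chain is the same idea.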
Alternatives and similar repositories for Data-Pipeline-with-dbt-using-Airflow-on-GCP:
- build dw with dbt ☆44 · Updated 6 months ago
- End-to-end data platform leveraging the Modern data stack ☆47 · Updated last year
- Cloned by the `dbt init` task ☆61 · Updated 11 months ago
- Data lake, data warehouse on GCP ☆56 · Updated 3 years ago
- Code snippets for Data Engineering Design Patterns book ☆80 · Updated last month
- Execution of dbt models using Apache Airflow through Docker Compose ☆116 · Updated 2 years ago
- Code for dbt tutorial ☆156 · Updated 10 months ago
- 🥪🦘 An open source sandbox project exploring dbt workflows via a fictional sandwich shop's data. ☆156 · Updated last week
- Data engineering with dbt, published by Packt ☆77 · Updated last year
- ☆128 · Updated 2 months ago
- Data Engineering examples for Airflow, Prefect; dbt for BigQuery, Redshift, ClickHouse, Postgres, DuckDB; PySpark for Batch processing; K… ☆65 · Updated 2 months ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆137 · Updated 9 months ago
- Resources for video demonstrations and blog posts related to DataOps on AWS ☆175 · Updated 3 years ago
- Data Engineering with Google Cloud Platform, published by Packt ☆116 · Updated last year
- End to end data engineering project