arpit-mittal-ds / Data-Architect
Plan, design, and implement enterprise data infrastructure solutions and create the blueprints for an organization’s data management system. You’ll create a relational database with PostgreSQL, design an Online Analytical Processing (OLAP) data model to build a cloud-based data warehouse, and design a scalable data lake architecture that meets the …
☆11 · Updated last year
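The OLAP data-model work described above typically centers on a star schema: one fact table referencing surrounding dimension tables. A minimal sketch of such DDL as plain Python strings follows; the retail-style table and column names are hypothetical illustrations, not taken from the repository:

```python
# Hypothetical star schema for an OLAP warehouse: a sales fact table
# keyed to date, product, and store dimensions.
DIM_DATE = """
CREATE TABLE dim_date (
    date_key  INT PRIMARY KEY,  -- surrogate key, e.g. 20240131
    full_date DATE NOT NULL,
    year      INT  NOT NULL,
    month     INT  NOT NULL,
    day       INT  NOT NULL
);
"""

FACT_SALES = """
CREATE TABLE fact_sales (
    sale_id     BIGSERIAL PRIMARY KEY,
    date_key    INT NOT NULL REFERENCES dim_date (date_key),
    product_key INT NOT NULL REFERENCES dim_product (product_key),
    store_key   INT NOT NULL REFERENCES dim_store (store_key),
    quantity    INT NOT NULL,
    amount      NUMERIC(12, 2) NOT NULL  -- measure aggregated in OLAP queries
);
"""
```

The design choice is the usual OLAP trade-off: dimensions are denormalized for fast slicing and dicing, while the fact table stays narrow and holds only keys and measures.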
Alternatives and similar repositories for Data-Architect:
Users interested in Data-Architect are comparing it to the libraries listed below:
- Simple ETL pipeline using Python ☆25 · Updated last year
- 😈 Complete End to End ETL Pipeline with Spark, Airflow, & AWS ☆43 · Updated 5 years ago
- Developed an ETL pipeline for a Data Lake that extracts data from S3, processes the data using Spark, and loads the data back into S3 as … ☆16 · Updated 5 years ago
- Solution to all projects of Udacity's Data Engineering Nanodegree: Data Modeling with Postgres & Cassandra, Data Warehouse with Redshift,… ☆56 · Updated 2 years ago
- Simplified ETL process in Hadoop using Apache Spark. Has complete ETL pipeline for data lake. SparkSession extensions, DataFrame validatio… ☆53 · Updated last year
- Classwork projects and homework done through the Udacity Data Engineering Nanodegree ☆74 · Updated last year
- PySpark Cheatsheet ☆36 · Updated 2 years ago
- All my Big Data projects ☆27 · Updated 8 years ago
- Data engineering interview Q&A for the data community, by the data community ☆64 · Updated 4 years ago
- Apache Spark using SQL ☆14 · Updated 3 years ago
- ☆19 · Updated 6 years ago
- Projects done in the Data Engineer Nanodegree Program by Udacity.com ☆107 · Updated 2 years ago
- This repo is mostly for PySpark- and Hive-related interview questions. ☆47 · Updated 3 years ago
- ☆87 · Updated 2 years ago
- Repository covering Spark SQL and PySpark using Python 3 ☆37 · Updated 2 years ago
- Example repo to create end-to-end tests for a data pipeline. ☆22 · Updated 8 months ago
- Data Engineering Capstone Project: ETL Pipelines and Data Warehouse Development ☆21 · Updated 5 years ago
- Developed a data pipeline to automate data warehouse ETL by building custom Airflow operators that handle the extraction, transformation,… ☆90 · Updated 3 years ago
- Lecture notes, lab notes, and links to helpful resources for passing the Google Professional Data Engineer certification exam. ☆17 · Updated 2 years ago
- ☆41 · Updated 7 months ago
- ☆21 · Updated last year
- Data Quest - Data Engineer Learning and Projects ☆24 · Updated 5 years ago
- This repository hosts the code/projects/demos/slides for Big Data technologies under the Apache Hadoop and Apache Spark umbrella. ☆42 · Updated 2 years ago
- Big Data Engineering practice project, including ETL with Airflow and Spark using AWS S3 and EMR ☆80 · Updated 5 years ago
- A collection of data engineering projects: data modeling, ETL pipelines, data lakes, infrastructure configuration on AWS, data warehousin… ☆14 · Updated 3 years ago
- ETL pipeline using PySpark (Spark - Python) ☆113 · Updated 4 years ago
- This project helps me understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆76 · Updated 5 years ago
- A batch-processing data pipeline, using AWS resources (S3, EMR, Redshift, EC2, IAM), provisioned via Terraform, and orchestrated from loc… ☆21 · Updated 2 years ago
- Data engineering 100 days 🤖 🧲 🦾 | #DE ☆40 · Updated last year
- A repo to track data engineering projects ☆13 · Updated 2 years ago
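Most of the repositories above revolve around ETL pipelines. The pattern they share can be sketched in plain Python, independent of Spark or Airflow; the CSV data, column names, and the "drop rows with a missing amount" rule below are all hypothetical, and `io.StringIO` stands in for a real source and sink such as S3:

```python
import csv
import io

# Hypothetical raw input; row 2 is missing its amount and should be dropped.
RAW = """user_id,amount
1,10.5
2,
3,99.0
"""

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop incomplete records and cast string fields to numbers."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # skip rows with a missing measure
        cleaned.append({"user_id": int(row["user_id"]),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows: list[dict]) -> str:
    """Load: serialize cleaned rows back to CSV text (stand-in for S3/warehouse)."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["user_id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

result = load(transform(extract(RAW)))
```

Each stage is a pure function over row dicts, which is what makes the same shape easy to rewrite as Spark DataFrame transformations or to wrap as individual Airflow tasks.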