mikeroyal / Apache-Spark-Guide
Apache Spark Guide
☆35 · Updated 4 years ago
Alternatives and similar repositories for Apache-Spark-Guide
Users who are interested in Apache-Spark-Guide are comparing it to the repositories listed below.
- Full-stack data engineering tools and infrastructure set-up ☆57 · Updated 4 years ago
- Simplified ETL process in Hadoop using Apache Spark. Includes a complete ETL pipeline for a data lake. SparkSession extensions, DataFrame validatio… ☆56 · Updated 2 years ago
- Template for Data Engineering and Data Pipeline projects ☆116 · Updated 3 years ago
- A full data warehouse infrastructure with ETL pipelines running inside Docker on Apache Airflow for data orchestration, AWS Redshift for … ☆141 · Updated 5 years ago
- Data pipeline performing ETL to AWS Redshift using Spark, orchestrated with Apache Airflow ☆162 · Updated 5 years ago
- New-generation open-source data stack ☆76 · Updated 3 years ago
- A repository of sample code to show data quality checking best practices using Airflow. ☆78 · Updated 2 years ago
- PySpark functions and utilities with examples. Assists the ETL process of data modeling ☆104 · Updated 5 years ago
- Delta Lake Documentation ☆53 · Updated last year
- Simple stream processing pipeline ☆110 · Updated last year
- Code for my "Efficient Data Processing in SQL" book. ☆60 · Updated last year
- Resources for video demonstrations and blog posts related to DataOps on AWS ☆183 · Updated 4 years ago
- Data engineering interview Q&A for the data community, by the data community ☆66 · Updated 5 years ago
- A curated list of resources about Snowflake ☆256 · Updated last year
- Airflow training for the crunch conf ☆105 · Updated 7 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆168 · Updated 2 years ago
- Code for the "Advanced data transformations in SQL" free live workshop ☆89 · Updated 9 months ago
- A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects ☆225 · Updated 9 months ago
- A modern ELT demo using Airbyte, dbt, Snowflake, and Dagster ☆28 · Updated 3 years ago
- ☆60 · Updated last year
- Project files for the post: Running PySpark Applications on Amazon EMR using Apache Airflow: Using the new Amazon Managed Workflows for A… ☆41 · Updated 3 years ago
- This project helps me understand the core concepts of Apache Airflow. I have created custom operators to perform tasks such as staging… ☆97 · Updated 6 years ago
- A book describing how to set up and maintain Data Engineering infrastructure using Google Cloud Platform. ☆126 · Updated 4 years ago
- ETL pipeline using PySpark (Spark + Python) ☆116 · Updated 5 years ago
- Cloned by the `dbt init` task ☆62 · Updated last year
- Big Data Demystified meetup and blog examples ☆31 · Updated last year
- Open Data Stack Projects: Examples of End-to-End Data Engineering Projects ☆91 · Updated 2 years ago
- Weekly Data Engineering Newsletter ☆96 · Updated last year
- (project & tutorial) DAG pipeline tests + CI/CD setup ☆90 · Updated 4 years ago
- ☆88 · Updated 3 years ago