redapt / pyspark-s3-parquet-example
This repo demonstrates how to load a sample Parquet-formatted file from an AWS S3 bucket. A Python job is submitted to an Apache Spark cluster running on AWS EMR; the job uses a SQLContext to register a DataFrame as a temporary table, after which SQL queries can be run against that table.
☆19, updated 9 years ago
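A minimal sketch of the workflow described above, using the older SQLContext API the description names. The bucket name, file path, and table name are placeholders, not the repo's actual values:

```python
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="pyspark-s3-parquet-example")
sqlContext = SQLContext(sc)

# Load the Parquet file from S3 into a DataFrame (on EMR, the s3://
# scheme is resolved via EMRFS). Path is a hypothetical placeholder.
df = sqlContext.read.parquet("s3://my-bucket/path/to/sample.parquet")

# Register the DataFrame as a temporary table so it can be queried with SQL.
df.registerTempTable("sample_table")

# Run a SQL query against the temporary table.
sqlContext.sql("SELECT * FROM sample_table LIMIT 10").show()

sc.stop()
```

On an EMR cluster, a script like this would typically be launched from the master node with `spark-submit`, or submitted as an EMR step.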
Alternatives and similar repositories for pyspark-s3-parquet-example
Users interested in pyspark-s3-parquet-example are comparing it to the repositories listed below.
- Sentiment Analysis of a Twitter Topic with Spark Structured Streaming (☆55, updated 6 years ago)
- Repo for all my code on the articles I post on Medium (☆107, updated 3 years ago)
- Scaffold of Apache Airflow executing Docker containers (☆85, updated 3 years ago)
- Udacity Data Pipeline Exercises (☆15, updated 5 years ago)
- AWS Big Data Certification (☆25, updated 11 months ago)
- Code supporting Data Science articles at The Marketing Technologist, Floryn Tech Blog, and Pythom.nl (☆71, updated 2 years ago)
- Basic tutorial of using Apache Airflow (☆36, updated 7 years ago)
- Public source code for the Batch Processing with Apache Beam (Python) online course (☆18, updated 5 years ago)
- This workshop demonstrates two methods of machine learning inference for global production using AWS Lambda and Amazon SageMaker