redapt / pyspark-s3-parquet-example
This repo demonstrates how to load a sample Parquet-formatted file from an AWS S3 bucket. A Python job is submitted to an Apache Spark instance running on AWS EMR; the job uses a SQLContext to create a temporary table from a DataFrame, after which SQL queries can be run against that temporary table.
☆19 · Updated 9 years ago
Alternatives and similar repositories for pyspark-s3-parquet-example
Users interested in pyspark-s3-parquet-example are comparing it to the libraries listed below.
- Airflow workflow management platform Chef cookbook ☆71 · Updated 6 years ago
- Scaffold of Apache Airflow executing Docker containers ☆86 · Updated 2 years ago
- ☆17 · Updated 6 years ago
- The open source version of the Amazon Athena documentation. To submit feedback & requests for changes, submit issues in this repository, … ☆83 · Updated 2 years ago
- Basic tutorial on using Apache Airflow ☆36 · Updated 6 years ago
- Ingest tweets with Kafka; use Spark to track popular hashtags and trendsetters for each hashtag ☆29 · Updated 9 years ago
- Convert JSON files to Parquet using PyArrow ☆97 · Updated last year
- This service is meant to simplify running Google Cloud operations, especially BigQuery tasks. This means you do not have to worry about … ☆45 · Updated 6 years ago
- Sentiment analysis of a Twitter topic with Spark Structured Streaming ☆55 · Updated 6 years ago
- CLI tool to launch Spark jobs on AWS EMR ☆67 · Updated last year
- A boilerplate for PySpark and Flask ☆35 · Updated 7 years ago
- Composable filesystem hooks and operators for Apache Airflow ☆17 · Updated 4 years ago
- Docker Compose files for various Kafka stacks ☆32 · Updated 7 years ago
- 🚨 Simple, self-contained fraud detection system built with Apache Kafka and Python ☆88 · Updated 6 years ago
- Scripts and instructions to facilitate running deep learning tasks on Amazon EMR ☆63 · Updated last year
- Code that goes along with https://humansofdata.atlan.com/2018/06/apache-airflow-disease-outbreaks-india/ ☆24 · Updated 2 years ago
- Docker image to submit Spark applications ☆38 · Updated 7 years ago
- Functional Airflow DAG definitions ☆38 · Updated 8 years ago
- A toolset to streamline running Spark Python jobs on EMR ☆20 · Updated 8 years ago
- A Luigi-powered analytics / warehouse stack ☆88 · Updated 8 years ago
- Repo for all the code from articles posted on Medium ☆107 · Updated 2 years ago
- A curated list of awesome examples, articles, tutorials, and videos for Apache Airflow ☆96 · Updated 4 years ago
- An example PySpark project with pytest ☆16 · Updated 7 years ago
- Docker container for Kafka - Spark Streaming - Cassandra ☆98 · Updated 6 years ago
- Chatlytics is a data query and visualization platform for chat ☆13 · Updated 8 years ago
- pysh-db - The Data Science Toolkit (DSK) ☆13 · Updated 6 years ago
- Code supporting data science articles at The Marketing Technologist, Floryn Tech Blog, and Pythom.nl ☆71 · Updated 2 years ago
- Using Luigi to create a machine learning pipeline using the Rossman sales data from Kaggle ☆33 · Updated 9 years ago
- Common API for all "second gen" AutoML APIs: Auger.AI, Google Cloud AutoML, and Azure AutoML ☆41 · Updated 8 months ago
- Small Docker image with Python machine learning tools (~180MB) https://hub.docker.com/r/frolvlad/alpine-python-machinelearning/ ☆81 · Updated 4 months ago