laughingman7743 / PyAthenaJDBC
PyAthenaJDBC is an Amazon Athena JDBC driver wrapper for the Python DB API 2.0 (PEP 249).
☆95 · Updated last year
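As a PEP 249 (Python DB API 2.0) driver, PyAthenaJDBC exposes the standard `connect()` / `cursor()` / `execute()` / `fetchall()` interface. The sketch below illustrates that shared interface shape. The Athena-specific connection parameters shown in the comments (`s3_staging_dir`, `region_name`) follow the project's documented usage but require AWS credentials and a JVM, so `sqlite3` — another PEP 249 driver from the standard library — stands in to make the sketch runnable anywhere.

```python
# Any PEP 249 driver exposes the same shape:
#   connect() -> Connection
#   Connection.cursor() -> Cursor
#   Cursor.execute() / fetchall()
#
# With PyAthenaJDBC the connection step would look like this
# (parameters as documented by the project; needs AWS access):
#
#   from pyathenajdbc import connect
#   conn = connect(s3_staging_dir="s3://your-bucket/path/",
#                  region_name="us-west-2")
#
# sqlite3 is used below purely as a stand-in PEP 249 driver.
import sqlite3

conn = sqlite3.connect(":memory:")
try:
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (id INTEGER, name TEXT)")
    cur.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
    cur.execute("SELECT id, name FROM t ORDER BY id")
    rows = cur.fetchall()
    print(rows)  # [(1, 'a'), (2, 'b')]
finally:
    conn.close()
```

Because the interface is standardized, code written against one PEP 249 driver (cursors, parameter binding, fetch methods) ports to another with little more than a change to the `connect()` call.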
Alternatives and similar repositories for PyAthenaJDBC
Users interested in PyAthenaJDBC are comparing it to the libraries listed below:
- Export Redshift data and convert to Parquet for use with Redshift Spectrum or other data warehouses. ☆116 · Updated 2 years ago
- Presto-like CLI tool for AWS Athena ☆84 · Updated 2 years ago
- Cloudformation templates for deploying Airflow in ECS ☆40 · Updated 6 years ago
- Amazon Redshift SQLAlchemy Dialect ☆222 · Updated 9 months ago
- Start a cluster in EC2 for dask.distributed ☆106 · Updated 4 years ago
- Low level, multiprocessing based AWS Kinesis producer & consumer library ☆122 · Updated 11 months ago
- DataPipeline for humans. ☆251 · Updated 2 years ago
- Required packages for using pandas in AWS Lambda functions ☆45 · Updated 8 years ago
- PyAthena is a Python DB API 2.0 (PEP 249) client for Amazon Athena. ☆472 · Updated 3 months ago
- Convert JSON files to Parquet using PyArrow ☆97 · Updated last year
- Amazon Kinesis Client Library for Python ☆373 · Updated 3 weeks ago
- SQL on dataframes - pandas and dask ☆64 · Updated 6 years ago
- This code demonstrates the architecture featured on the AWS Big Data blog (https://aws.amazon.com/blogs/big-data/) which creates a concu… ☆75 · Updated 6 years ago
- Airflow plugin to transfer arbitrary files between operators ☆78 · Updated 6 years ago
- Turbine: the bare metals that gets you Airflow ☆378 · Updated 3 years ago
- Example for an airflow plugin ☆49 · Updated 8 years ago
- SQL for many helpful Redshift UDFs, and the scripts for generating and testing those UDFs ☆125 · Updated 6 years ago
- Apache Spark on AWS Lambda ☆151 · Updated 2 years ago
- Amazon Redshift Advanced Monitoring ☆272 · Updated 2 years ago
- CLI tool to launch Spark jobs on AWS EMR ☆67 · Updated last year
- Arbalest is a Python data pipeline orchestration library for Amazon S3 and Amazon Redshift. It automates data import into Redshift and ma… ☆41 · Updated 9 years ago
- Example unit tests for Apache Spark Python scripts using the py.test framework ☆84 · Updated 9 years ago
- Quickly get a kubernetes executor airflow environment provisioned on GKE. Azure Kubernetes Service instructions included also as are inst… ☆36 · Updated 4 years ago
- Utils around luigi. ☆66 · Updated 4 years ago
- REST-like API exposing Airflow data and operations ☆61 · Updated 6 years ago
- Create Parquet files from CSV ☆67 · Updated 7 years ago
- Build the numpy/scipy/scikitlearn packages and strip them down to run in Lambda ☆207 · Updated 6 years ago
- python implementation of the parquet columnar file format. ☆350 · Updated 3 years ago
- Simple multi-threaded Kinesis Poster and Worker Python examples ☆69 · Updated 9 years ago
- A pure Python implementation of Apache Spark's RDD and DStream interfaces. ☆268 · Updated 7 months ago