nerdammer / spark-additions
Utilities for Apache Spark
☆34 · Updated 9 years ago
Alternatives and similar repositories for spark-additions:
Users interested in spark-additions are comparing it to the libraries listed below.
- A framework for creating composable and pluggable data processing pipelines using Apache Spark, and running them on a cluster. ☆47 · Updated 8 years ago
- Big Data Toolkit for the JVM ☆146 · Updated 4 years ago
- Enabling Spark Optimization through Cross-stack Monitoring and Visualization ☆47 · Updated 7 years ago
- A library to expose more of Apache Spark's metrics system ☆146 · Updated 5 years ago
- Bulletproof Apache Spark jobs with fast root-cause analysis of failures. ☆72 · Updated 4 years ago
- ☆92 · Updated 7 years ago
- Scripts for parsing / making sense of YARN logs ☆52 · Updated 8 years ago
- Low-level integration of Spark and Kafka ☆130 · Updated 7 years ago
- Waimak is an open-source framework that makes it easier to create complex data flows in Apache Spark. ☆75 · Updated 11 months ago
- File compaction tool that runs on top of the Spark framework. ☆59 · Updated 5 years ago
- Type-class-based data cleansing library for Apache Spark SQL ☆78 · Updated 5 years ago
- The iterative broadcast join example code. ☆69 · Updated 7 years ago
- Apache Spark and Apache Kafka integration example ☆124 · Updated 7 years ago
- An example of using Avro and Parquet in Spark SQL ☆60 · Updated 9 years ago
- Custom state store providers for Apache Spark ☆92 · Updated 2 months ago
- Hadoop output committers for S3 ☆109 · Updated 4 years ago
- Support for Highcharts in Apache Zeppelin ☆81 · Updated 7 years ago
- Schedoscope is a scheduling framework for pain-free agile development, testing, (re)loading, and monitoring of your datahub, lake, or what… ☆95 · Updated 5 years ago
- ScalaCheck for Spark ☆63 · Updated 7 years ago
- Secondary sort and streaming reduce for Apache Spark ☆78 · Updated last year
- sbt plugin for spark-submit ☆96 · Updated 7 years ago
- Scala + Druid: Scruid. A library that allows you to compose queries in Scala and parse the result back into type-safe classes. ☆115 · Updated 3 years ago
- A library you can include in your Spark job to validate the counters and perform operations on success. Goal is scala/java/python support… ☆109 · Updated 7 years ago
- functionstest ☆33 · Updated 8 years ago
- How to use Parquet in Flink ☆32 · Updated 7 years ago
- A Spark WordCountJob example as a standalone SBT project with Specs2 tests, runnable on Amazon EMR ☆118 · Updated 9 years ago
- Example integration of Kafka, Avro, and Spark Streaming on a live Twitter feed ☆23 · Updated 10 years ago
- Elasticsearch on Spark ☆112 · Updated 10 years ago
- Spark connector for SFTP ☆100 · Updated 2 years ago
- Write your Spark data to Kafka seamlessly ☆174 · Updated 9 months ago
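Several of the entries above (the iterative broadcast join example in particular) revolve around joining a large dataset against a small one that fits in memory. The core idea of a broadcast hash join is to materialize the small side as a hash map shipped to every worker, then stream the large side through it. A minimal sketch of that idea in plain Scala collections, not using Spark's API (all names here are illustrative, not taken from any of the repositories listed):

```scala
object BroadcastJoinSketch {
  // The "small" side is materialized as a hash map (conceptually broadcast
  // to every worker); the "large" side streams through it, emitting a row
  // only when the join key is present in the map (inner-join semantics).
  def broadcastJoin[K, V, W](large: Seq[(K, V)], small: Map[K, W]): Seq[(K, (V, W))] =
    large.flatMap { case (k, v) => small.get(k).map(w => (k, (v, w))) }

  def main(args: Array[String]): Unit = {
    val large = Seq((1, "a"), (2, "b"), (3, "c"))
    val small = Map(1 -> "x", 3 -> "z")
    println(broadcastJoin(large, small)) // rows with keys 1 and 3 survive
  }
}
```

In Spark itself the same effect is achieved by broadcasting the small side (e.g. via a broadcast hint on a DataFrame), which avoids shuffling the large side across the cluster.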