[ https://issues.apache.org/jira/browse/SPARK-5784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14575788#comment-14575788 ]
Ryan Williams commented on SPARK-5784:
--------------------------------------

Hi [~varvind], the verdict here was that this functionality should live outside of the Spark repository. Luckily, it's still very easy to incorporate such external functionality into your Spark applications at runtime:
* Build a standalone JAR yourself that includes the StatsDSink class.
* Put that JAR on your Spark driver's classpath at runtime (cf. [the {{--driver-class-path}} argument|https://spark.apache.org/docs/1.3.1/configuration.html#runtime-environment]).
* Register it in your {{metrics.properties}} file the way you would if it lived in the Spark repository itself.

(Rough, untested sketches of the sink class and the {{metrics.properties}} registration are appended at the end of this message.)

If you additionally publish the standalone class in the [Spark packages|http://spark-packages.org/] index, others will be able to easily find and use it as well. Let me know if that doesn't make sense.

> Add StatsDSink to MetricsSystem
> -------------------------------
>
>              Key: SPARK-5784
>              URL: https://issues.apache.org/jira/browse/SPARK-5784
>          Project: Spark
>       Issue Type: Improvement
>       Components: Spark Core
> Affects Versions: 1.2.1
>         Reporter: Ryan Williams
>         Priority: Minor
>      Attachments: statsd.patch
>
> [StatsD|https://github.com/etsy/statsd/] is a common wrapper for Graphite; it would be useful to support sending metrics to StatsD in addition to [the existing Graphite support|https://github.com/apache/spark/blob/6a1be026cf37e4c8bf39133dfb4a73f7caedcc26/core/src/main/scala/org/apache/spark/metrics/sink/GraphiteSink.scala].
> [readytalk/metrics-statsd|https://github.com/readytalk/metrics-statsd] is a StatsD adapter for the [dropwizard/metrics|https://github.com/dropwizard/metrics] library that Spark uses. The Maven repository at http://dl.bintray.com/readytalk/maven/ serves {{metrics-statsd}}.
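
For the first step above, a rough, untested sketch of what such a standalone sink class might look like, modeled on the existing GraphiteSink and the builder shown in the readytalk/metrics-statsd README. The class itself, its property keys, and its constructor shape are assumptions for illustration, not anything shipped with Spark. Note that Spark's {{Sink}} trait is package-private, so the class typically has to be declared under the {{org.apache.spark.metrics.sink}} package:

{code:scala}
// Hypothetical standalone StatsDSink, compiled into its own JAR (not part of Spark).
// Spark's Sink trait is package-private, so the class lives under Spark's sink package.
package org.apache.spark.metrics.sink

import java.util.Properties
import java.util.concurrent.TimeUnit

import com.codahale.metrics.MetricRegistry
import com.readytalk.metrics.StatsDReporter // from readytalk/metrics-statsd

import org.apache.spark.SecurityManager

private[spark] class StatsDSink(
    val property: Properties,
    val registry: MetricRegistry,
    securityMgr: SecurityManager) extends Sink {

  // Property names ("host", "port", "period") are made up for this sketch;
  // they should match whatever keys you put in metrics.properties.
  private val host = property.getProperty("host", "127.0.0.1")
  private val port = property.getProperty("port", "8125").toInt
  private val pollPeriod = property.getProperty("period", "10").toInt

  // Builder API as shown in the readytalk/metrics-statsd README.
  private val reporter = StatsDReporter.forRegistry(registry)
    .convertRatesTo(TimeUnit.SECONDS)
    .convertDurationsTo(TimeUnit.MILLISECONDS)
    .build(host, port)

  override def start(): Unit = reporter.start(pollPeriod, TimeUnit.SECONDS)
  override def stop(): Unit = reporter.stop()
  override def report(): Unit = reporter.report()
}
{code}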
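
For the registration step, a hypothetical {{metrics.properties}} entry might look like the following; the {{statsd}} sink name and the property keys are arbitrary and just mirror the ones assumed in the sketch above:

{code}
# Hypothetical registration for the sketch above; sink name and keys are arbitrary.
*.sink.statsd.class=org.apache.spark.metrics.sink.StatsDSink
*.sink.statsd.host=statsd.example.com
*.sink.statsd.port=8125
*.sink.statsd.period=10
{code}

You'd then launch with the JAR on the driver classpath (e.g. {{--driver-class-path /path/to/your-statsd-sink.jar}}) and, if the config file isn't at the default {{conf/metrics.properties}} location, point Spark at it via the {{spark.metrics.conf}} property.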