Hi,

I started seeing the following error when testing my Spark Streaming
application locally. Could it be due to a mismatch with old Spark jars
somewhere on the classpath, or is this something else?

Thanks,
Ashish

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/myproject/lib/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/myproject/lib/spark-tools-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jEventHandler).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:79)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:49)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:122)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:586)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
        :
        :
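
For reference, the streaming context is created roughly like this. This is a
simplified sketch rather than the exact code; the app name, batch interval,
and input source are placeholders:

import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingTest {
  def main(args: Array[String]): Unit = {
    // Local master with two threads; batch interval is a placeholder
    val ssc = new StreamingContext("local[2]", "StreamingTest", Seconds(1))

    // Placeholder input source, just to exercise the streaming path
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Going by the trace, the exception is thrown from the new StreamingContext(...)
line itself (via createNewSparkContext), before any DStream operations run.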
