I use Spark 1.1.0-SNAPSHOT, and the test I'm running is in local mode. My test
case uses org.apache.spark.streaming.TestSuiteBase. These are the relevant
dependencies from my sbt build:

val spark = "org.apache.spark" %% "spark-core" % "1.1.0-SNAPSHOT" % "provided" excludeAll(
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.1.0-SNAPSHOT" % "provided" excludeAll(
val sparkCassandra = "com.tuplejump" % "calliope_2.10" % "0.9.0-C2-EA" exclude("org.apache.cassandra", "cassandra-all") exclude("org.apache.cassandra", "cassandra-thrift")
val casAll = "org.apache.cassandra" % "cassandra-all" % "2.0.3" intransitive()
val casThrift = "org.apache.cassandra" % "cassandra-thrift" % "2.0.3" intransitive()
val sparkStreamingFromKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "0.9.1" excludeAll(


-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com]
Sent: January-22-15 11:39 AM
To: Adrian Mocanu
Cc: u...@spark.incubator.apache.org
Subject: Re: Exception: NoSuchMethodError: 
org.apache.spark.streaming.StreamingContext$.toPairDStreamFunctions

NoSuchMethodError almost always means that you have compiled some code against 
one version of a library but are running against another. I wonder if you are 
including different versions of Spark in your project, or running against a 
cluster on an older version?
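A quick way to see which binary actually wins at run time is to ask the JVM where it loaded the streaming classes from; a minimal sketch (any Spark class works, StreamingContext is just the relevant one here):

  // Prints the jar or target/ directory supplying StreamingContext on the test
  // classpath; if it is not the 1.1.0-SNAPSHOT build the code was compiled
  // against, two Spark versions are being mixed.
  println(classOf[org.apache.spark.streaming.StreamingContext]
    .getProtectionDomain.getCodeSource.getLocation)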

On Thu, Jan 22, 2015 at 3:57 PM, Adrian Mocanu <amoc...@verticalscope.com> 
wrote:
> Hi
>
> I get this exception when I run a Spark test case on my local machine:
>
>
>
> An exception or error caused a run to abort:
> org.apache.spark.streaming.StreamingContext$.toPairDStreamFunctions(Lorg/apache/spark/streaming/dstream/DStream;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/streaming/dstream/PairDStreamFunctions;
>
> java.lang.NoSuchMethodError:
> org.apache.spark.streaming.StreamingContext$.toPairDStreamFunctions(Lorg/apache/spark/streaming/dstream/DStream;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/streaming/dstream/PairDStreamFunctions;
>
>
>
> In my test case I have these Spark-related imports:
>
> import org.apache.spark.streaming.StreamingContext._
>
> import org.apache.spark.streaming.TestSuiteBase
>
> import org.apache.spark.streaming.dstream.DStream
>
> import org.apache.spark.streaming.StreamingContext.toPairDStreamFunctions
>
>
>
> -Adrian
>
>
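For reference, a minimal sketch of the kind of call that goes through the missing toPairDStreamFunctions conversion; `words` is a hypothetical DStream[String], not code from the original test:

  import org.apache.spark.streaming.StreamingContext._
  import org.apache.spark.streaming.dstream.DStream

  // reduceByKey only exists on PairDStreamFunctions; the wildcard import above
  // supplies the implicit toPairDStreamFunctions conversion whose signature the
  // NoSuchMethodError complains about.
  def wordCounts(words: DStream[String]): DStream[(String, Int)] =
    words.map(w => (w, 1)).reduceByKey(_ + _)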

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
