[ 
https://issues.apache.org/jira/browse/SPARK-5489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14299001#comment-14299001
 ] 

DeepakVohra commented on SPARK-5489:
------------------------------------

Already did that before posting the previous message. The jar does contain the 
classes, but they are reported as not found when using the Maven dependency. 
The issue goes away with the MLlib 2.11 artifact; the MLlib 2.10 Maven 
dependency appears to have some issue.
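For reference, the usual fix for this class of NoSuchMethodError is to make the MLlib artifact's Scala suffix match the Scala version actually on the classpath. A minimal sketch of the Maven dependency, assuming a Scala 2.11 build (artifact and version shown are illustrative):

```xml
<!-- Sketch: MLlib artifact for Scala 2.11; the _2.11 suffix must match
     the scala-library version used elsewhere in the project. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.11</artifactId>
  <version>1.2.0</version>
</dependency>
```

Mixing a _2.10 Spark artifact with a 2.11 scala-library (or vice versa) produces exactly this kind of linkage error, since scala.runtime.IntRef.create was only introduced in Scala 2.11.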

> KMeans clustering java.lang.NoSuchMethodError: scala.runtime.IntRef.create  
> (I)Lscala/runtime/IntRef;
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5489
>                 URL: https://issues.apache.org/jira/browse/SPARK-5489
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2 
> Maven
>            Reporter: DeepakVohra
>
> The KMeans clustering generates the following error, which also seems to be 
> due to a version mismatch between the Scala version used to compile Spark 
> and the Scala version in the Spark 1.2 Maven dependency. 
> Exception in thread "main" java.lang.NoSuchMethodError: 
> scala.runtime.IntRef.create
> (I)Lscala/runtime/IntRef;
>       at 
> org.apache.spark.mllib.clustering.KMeans.initKMeansParallel(KMeans.scala:282)
>       at 
> org.apache.spark.mllib.clustering.KMeans.runAlgorithm(KMeans.scala:155)
>       at 
> org.apache.spark.mllib.clustering.KMeans.run(KMeans.scala:132)
>       at 
> org.apache.spark.mllib.clustering.KMeans$.train(KMeans.scala:352)
>       at 
> org.apache.spark.mllib.clustering.KMeans$.train(KMeans.scala:362)
>       at 
> org.apache.spark.mllib.clustering.KMeans.train(KMeans.scala)
>       at 
> clusterer.kmeans.KMeansClusterer.main(KMeansClusterer.java:35)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
