[ https://issues.apache.org/jira/browse/SPARK-5489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14297937#comment-14297937 ]

DeepakVohra commented on SPARK-5489:
------------------------------------

Sean,

I made the Scala version the same, but I am still getting the error.
"For the Scala API, Spark 1.2.0 uses Scala 2.10."
http://spark.apache.org/docs/1.2.0/
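
If I am reading the error correctly, it points at exactly this kind of mismatch: scala.runtime.IntRef.create(int) is a factory method that the 2.10.0 scala-library declared below does not provide, so bytecode compiled against a newer Scala runtime fails when the call is first executed. A minimal sketch of what I think is happening (the class name is hypothetical):

        import scala.runtime.IntRef;

        public class IntRefCheck {
                public static void main(String[] args) {
                        // This call compiles against a Scala 2.11 scala-library, which has the
                        // static factory IntRef.create(int). If a 2.10.x scala-library is on the
                        // runtime classpath instead, the JVM throws
                        // java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
                        IntRef counter = IntRef.create(0);
                        System.out.println(counter.elem);
                }
        }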

I also set the Scala version of the Maven dependencies to 2.10:
<dependencies>
        <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>1.2.0</version>
                <exclusions>
                        <exclusion>
                                <groupId>org.scala-lang</groupId>
                                <artifactId>scala-library</artifactId>
                        </exclusion>
                        <exclusion>
                                <groupId>org.scala-lang</groupId>
                                <artifactId>scala-compiler</artifactId>
                        </exclusion>
                </exclusions>
        </dependency>
        <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-mllib_2.11</artifactId>
                <version>1.2.0</version>
                <exclusions>
                        <exclusion>
                                <groupId>org.scala-lang</groupId>
                                <artifactId>scala-library</artifactId>
                        </exclusion>
                        <exclusion>
                                <groupId>org.scala-lang</groupId>
                                <artifactId>scala-compiler</artifactId>
                        </exclusion>
                </exclusions>
        </dependency>
        <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
                <version>2.10.0</version>
        </dependency>
        <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-compiler</artifactId>
                <version>2.10.0</version>
        </dependency>
</dependencies>
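
For reference, this is how I understand a fully 2.10-aligned dependency set should look; the spark-mllib_2.10 coordinate is my assumption of the matching MLlib artifact for Spark 1.2.0:

        <dependencies>
                <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-core_2.10</artifactId>
                        <version>1.2.0</version>
                </dependency>
                <!-- Assumption: spark-mllib_2.10 is the Scala 2.10 build of MLlib for 1.2.0 -->
                <dependency>
                        <groupId>org.apache.spark</groupId>
                        <artifactId>spark-mllib_2.10</artifactId>
                        <version>1.2.0</version>
                </dependency>
                <!-- scala-library comes in transitively from the _2.10 artifacts,
                     so it does not need to be excluded and re-declared here -->
        </dependencies>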

thanks,
Deepak

> KMeans clustering java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5489
>                 URL: https://issues.apache.org/jira/browse/SPARK-5489
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2 
> Maven
>            Reporter: DeepakVohra
>
> The KMeans clustering generates the following error, which also seems to be due
> to a version mismatch between the Scala version used to compile Spark and the
> Scala version in the Spark 1.2 Maven dependency.
> Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
>       at org.apache.spark.mllib.clustering.KMeans.initKMeansParallel(KMeans.scala:282)
>       at org.apache.spark.mllib.clustering.KMeans.runAlgorithm(KMeans.scala:155)
>       at org.apache.spark.mllib.clustering.KMeans.run(KMeans.scala:132)
>       at org.apache.spark.mllib.clustering.KMeans$.train(KMeans.scala:352)
>       at org.apache.spark.mllib.clustering.KMeans$.train(KMeans.scala:362)
>       at org.apache.spark.mllib.clustering.KMeans.train(KMeans.scala)
>       at clusterer.kmeans.KMeansClusterer.main(KMeansClusterer.java:35)


