I am really interested in using Spark from R and have tried to use SparkR,
but always get the same error.

 

This is how I installed:

 

 - I successfully installed Spark version 0.9.0 with Scala 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_45).

   I can run the examples from spark-shell and from Python.

 

 - I installed the R package devtools and then installed SparkR with:

   library(devtools)
   install_github("amplab-extras/SparkR-pkg", subdir="pkg")

 

  This compiled the package successfully.

  

When I try to run the package, e.g.:

  library(SparkR)
  sc <- sparkR.init(master="local")   # so far the program runs fine

  rdd <- parallelize(sc, 1:10)        # this returns the following error

  Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") :
    java.lang.IncompatibleClassChangeError: org/apache/spark/util/InnerClosureFinder

 

No matter how I try to use sc (I have tried all of the examples), I always get
an error.
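
For example, even the most basic calls fail the same way (a sketch of my
session; count and collect here are just the RDD operations used in the SparkR
examples):

  library(SparkR)
  sc <- sparkR.init(master="local")

  rdd <- parallelize(sc, 1:10)
  count(rdd)     # fails with the same IncompatibleClassChangeError
  collect(rdd)   # fails with the same IncompatibleClassChangeError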

 

Any ideas?

 

Jacques.
