These are the Spark and Scala versions used by the ignite-spark 1.9 module by default:

<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.8</version>
  <scope>compile</scope>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
  <scope>compile</scope>
</dependency>
They are identical to yours.

How do you run the example? What's your IDE (IntelliJ IDEA, Eclipse, etc.)?
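
For reference, here is a minimal sketch of how the shared RDD example is typically
driven from a standalone Scala main. The config path ("config/example-shared-rdd.xml")
and the cache name ("sharedRDD") are assumptions, so substitute whatever your example
actually uses:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.ignite.spark.IgniteContext

object SharedRDDCheck {
  def main(args: Array[String]): Unit = {
    // Local Spark context; the Scala library on this classpath must match
    // the 2.11 build that ignite-spark 1.9 and spark-core_2.11 expect.
    val sc = new SparkContext(new SparkConf().setAppName("shared-rdd").setMaster("local[*]"))

    // Path to the Spring XML config is an assumption; point it at your own file.
    val ic = new IgniteContext(sc, "config/example-shared-rdd.xml")

    // Attach the shared cache and fill it with some pairs.
    val sharedRDD = ic.fromCache[Int, Int]("sharedRDD")
    sharedRDD.savePairs(sc.parallelize(1 to 1000).map(i => (i, i)))

    // The step that fails for you: a plain Spark transformation over the IgniteRDD.
    println(sharedRDD.map(_._2 * 2).count())

    sc.stop()
  }
}

If it runs from the IDE but not via spark-submit (or the other way around), that
usually narrows the problem down to whichever classpath is picking up the wrong
scala-library.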

Just in case, try cleaning out your local Maven repository by removing the .m2
folder from your user directory.
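
Also, to see which Scala library the application actually picks up at runtime
(rather than the one declared in the POM), a quick check like the following can
help; the object name is just for illustration:

object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Version string of the scala-library the JVM actually loaded
    // (should report 2.11.x to match the _2.11 artifacts above).
    println(scala.util.Properties.versionString)

    // Where that library was loaded from on disk.
    println(classOf[scala.collection.GenTraversableOnce[_]]
      .getProtectionDomain.getCodeSource.getLocation)
  }
}

Anything other than a 2.11 scala-library showing up here would match the
wrong-version theory from Jörn's reply below.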

—
Denis


> On Mar 23, 2017, at 5:22 AM, Jörn Franke <jornfra...@gmail.com> wrote:
> 
> Looks like it cannot find the Scala library, and/or the wrong Scala version is 
> available to the application.
> 
>> On 23 Mar 2017, at 09:08, Purushotham Muthuluru <pmuthul...@me.com> wrote:
>> 
>> Hi ,
>> 
>> I am using Spark 2.1.0 with Ignite v1.9, and I get the following error when I 
>> try to run an example. I do not have Scala installed on my Mac; I use the one 
>> that is embedded in Spark 2.1.0, which is Scala 2.11.8.
>> 
>> What version of spark is compatible with Ignite v1.9?
>> 
>> I am evaluating Ignite for one of our projects, so a quick response would be 
>> much appreciated.
>> 
>> ERROR
>> ----------
>> [00:49:26] 
>> [00:49:26] Ignite node started OK (id=17984d93)
>> [00:49:26] Topology snapshot [ver=1, servers=1, clients=0, CPUs=8, 
>> heap=0.89GB]
>> >>> Transforming values stored in Ignite Shared RDD...
>> 17/03/23 00:49:27 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 11)
>> java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
>>   at 
>> org.apache.ignite.spark.impl.IgniteQueryIterator.<init>(IgniteQueryIterator.scala:20)
>>   at org.apache.ignite.spark.IgniteRDD.compute(IgniteRDD.scala:67)
>>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>>   at org.apache.spark.scheduler.Task.run(Task.scala:99)
>>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
>>   at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>   at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>   at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.ClassNotFoundException: 
>> scala.collection.GenTraversableOnce$class
>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>   ... 16 more
>> 
>> Thanks,
>> Puru.
>> 
