Hello, Ray. Currently, there are no plans to support Spark 1.6 in Ignite. I doubt it can be done without significant changes to the existing code base.
Anyway, you can create a ticket [1], and I will try to look at what can be done.

[1] https://issues.apache.org/jira/browse/IGNITE

On Mon, 19/03/2018 at 01:27 -0700, Ray wrote:
> I'm trying to save a Spark dataframe to Ignite 2.4 using Apache Spark 1.6,
> but it fails with the following error:
>
> Exception in thread "main" java.util.ServiceConfigurationError:
> org.apache.spark.sql.sources.DataSourceRegister: Provider
> org.apache.ignite.spark.impl.IgniteRelationProvider could not be instantiated
>     at java.util.ServiceLoader.fail(ServiceLoader.java:232)
>     at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
>     at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>     at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>     at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:42)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>     at scala.collection.TraversableLike$class.filter(TraversableLike.scala:263)
>     at scala.collection.AbstractTraversable.filter(Traversable.scala:105)
>     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:59)
>     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
>     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
>     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)
>     at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:244)
>     at IgniteDataFrameWriteExample$.main(IgniteDataFrameWriteExample.scala:40)
>     at IgniteDataFrameWriteExample.main(IgniteDataFrameWriteExample.scala)
> Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     at java.lang.Class.getDeclaredConstructors0(Native Method)
>     at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
>     at java.lang.Class.getConstructor0(Class.java:3075)
>     at java.lang.Class.newInstance(Class.java:412)
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
>     ... 16 more
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 33 more
>
> But it works fine under Spark 2.2.
> So I'm wondering: will the Spark dataframe feature support Spark 1.6 in the future?
> I can't upgrade to Spark 2.2 because Cloudera won't upgrade.
>
> --
> Sent from: http://apache-ignite-users.70518.x6.nabble.com/
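For context on the error quoted above: the root cause is `java.lang.ClassNotFoundException: org.apache.spark.internal.Logging`. That class exists only in Spark 2.x; in Spark 1.6 the equivalent trait lived at `org.apache.spark.Logging`, so an `ignite-spark` module compiled against Spark 2.x cannot be loaded by `ServiceLoader` under a 1.6 runtime. A minimal, hypothetical JVM probe (not part of the original thread; the class name `SparkLoggingProbe` is invented for illustration) that checks which location your classpath actually provides:

```java
// Hypothetical diagnostic, assuming you run it with the same classpath as
// your Spark job. It reports which Spark Logging class can be loaded.
public class SparkLoggingProbe {

    // Returns true if the named class is loadable on the current classpath.
    static boolean classExists(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Spark 2.x location (what ignite-spark 2.4 is compiled against):
        System.out.println("org.apache.spark.internal.Logging present: "
                + classExists("org.apache.spark.internal.Logging"));
        // Spark 1.x location (what a 1.6 cluster actually ships):
        System.out.println("org.apache.spark.Logging present: "
                + classExists("org.apache.spark.Logging"));
    }
}
```

On a Spark 2.x classpath the first line prints `true`; on a Spark 1.6 classpath only the second does, which matches the `NoClassDefFoundError` in the trace.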