There is a DataStax Spark connector library jar that you probably have
on your CLASSPATH locally, but not on the cluster. If you know where it is,
you can either install it on each node in a location that is on its
CLASSPATH, or pass the jar file with the "--jars" option when you submit the
job. Note that the latter may not be an ideal solution if the connector has
other dependencies that also need to be shipped.
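
For example, a spark-submit invocation along these lines should work (the
class name, master URL, and file paths below are placeholders; substitute
your own, and note that "--jars" accepts a comma-separated list):

    spark-submit \
      --class com.example.MyCassandraJob \
      --master spark://your-master-host:7077 \
      --jars /path/to/spark-cassandra-connector.jar \
      /path/to/your-application.jar

If the connector's transitive dependencies become a hassle, two common
alternatives are letting spark-submit resolve them with
"--packages com.datastax.spark:spark-cassandra-connector_2.10:<version>"
(match the Scala and connector versions to your build), or building your
application as an assembly ("fat") jar so everything ships in one artifact.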

dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Wed, Oct 14, 2015 at 5:05 PM, Renato Perini <renato.per...@gmail.com>
wrote:

> Hello.
> I have developed a Spark job that uses a Jersey client (the 1.9 version
> included with Spark) to make some service calls during data computations.
> Data is read from and written to an Apache Cassandra 2.2.1 database.
> When I run the job in local mode, everything works nicely, but when I
> execute the job in cluster mode (Spark standalone) I receive the exception
> below.
> I have no clue where this exception occurs. Any idea / advice on what
> I can check?
>
> [Stage 38:==>                                                     (8 + 2) / 200]
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:278)
>     at
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
>     at
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
>     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
>     at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>     at
> org.apache.spark.ThrowableSerializationWrapper.readObject(TaskEndReason.scala:167)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>     at
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
>     at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
>     at
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
>     at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
>     at
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
>     at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>     at
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
>     at
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
>     at
> org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply$mcV$sp(TaskResultGetter.scala:108)
>     at
> org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply(TaskResultGetter.scala:105)
>     at
> org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply(TaskResultGetter.scala:105)
>     at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>     at
> org.apache.spark.scheduler.TaskResultGetter$$anon$3.run(TaskResultGetter.scala:105)
>     at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 4.0 in stage 38.0 (TID
> 1431, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 9.0 in stage 38.0 (TID
> 1436, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 12.0 in stage 38.0 (TID
> 1439, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 9.1 in stage 38.0 (TID
> 1441, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 12.1 in stage 38.0 (TID
> 1442, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 4.1 in stage 38.0 (TID
> 1443, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 9.2 in stage 38.0 (TID
> 1444, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 12.2 in stage 38.0 (TID
> 1445, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 4.2 in stage 38.0 (TID
> 1446, 172.31.18.215): UnknownReason
> [Stage 38:===>                                                   (11 + 3) / 200]
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 9.3 in stage 38.0 (TID
> 1447, 172.31.18.215): UnknownReason
> 15/10/14 15:54:07 ERROR TaskSetManager: Task 9 in stage 38.0 failed 4
> times; aborting job
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 12.3 in stage 38.0 (TID
> 1448, 172.31.18.215): UnknownReason
> Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 9 in stage 38.0 failed 4 times, most recent
> failure: Los
> Driver stacktrace:
>     at org.apache.spark.scheduler.DAGScheduler.org
> $apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
>     at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
>     at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
>     at
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>     at
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
>     at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>     at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>     at scala.Option.foreach(Option.scala:236)
>     at
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
>     at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
>     at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
>     at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
>     at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>     at
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
>     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822)
>     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835)
>     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1912)
>     at
> com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:37)
>     at
> com.datastax.spark.connector.japi.RDDJavaFunctions.saveToCassandra(RDDJavaFunctions.java:61)
>     at
> com.datastax.spark.connector.japi.RDDAndDStreamCommonJavaFunctions$WriterBuilder.saveToCassandra(RDDAndDStreamCommonJavaFunctions.java:443)
>     at
> com.objectway.dwx.analytics.main.AuditControllerIPGeolocate.main(AuditControllerIPGeolocate.java:174)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>     at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/10/14 15:54:07 WARN ThrowableSerializationWrapper: Task exception could
> not be deserialized
> java.lang.ClassNotFoundException:
> com.datastax.spark.connector.types.TypeConversionException
>     ... (same stack trace as above)
> 15/10/14 15:54:07 ERROR TaskResultGetter: Could not deserialize
> TaskEndReason: ClassNotFound with classloader
> org.apache.spark.util.MutableURLClassLoader@4
> 15/10/14 15:54:07 WARN TaskSetManager: Lost task 4.3 in stage 38.0 (TID
> 1449, 172.31.18.215): UnknownReason
>
>
> Any idea what is going wrong?
>
