[ https://issues.apache.org/jira/browse/SPARK-5887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-5887.
------------------------------------
    Resolution: Invalid

The DataStax connector is not part of the Apache Spark distribution; it is 
maintained by DataStax directly, so please reach out to them for support. 
Thanks!
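
For reference, the connector is typically shipped with the application, either 
by passing its JARs to spark-submit via --jars (as in the report below) or by 
building them into an assembly JAR. A minimal driver-side sketch, assuming 
connector 1.2.0-alpha2, a local Cassandra node, and placeholder keyspace/table 
names:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._  // adds cassandraTable to SparkContext

object ConnectorSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("connector-smoke-test")
      // Cassandra contact point (placeholder; point at your cluster).
      .set("spark.cassandra.connection.host", "127.0.0.1")
      // Ship the connector JARs to executors; equivalent to --jars on spark-submit.
      .setJars(Seq(
        "spark-cassandra-connector_2.10-1.2.0-alpha2.jar",
        "spark-cassandra-connector-java_2.10-1.2.0-alpha2.jar"))
    val sc = new SparkContext(conf)
    // "ks" and "tbl" are placeholder names for illustration only.
    println("row count: " + sc.cassandraTable("ks", "tbl").count())
    sc.stop()
  }
}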

> Class not found exception  
> com.datastax.spark.connector.rdd.partitioner.CassandraPartition
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5887
>                 URL: https://issues.apache.org/jira/browse/SPARK-5887
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Spark 1.2.1
> Spark Cassandra Connector 1.2.0 Alpha2
>            Reporter: Vijay Pawnarkar
>
> I am getting the following ClassNotFoundException when using Spark 1.2.1 with 
> spark-cassandra-connector_2.10-1.2.0-alpha2. When the job is submitted to 
> Spark, it successfully adds the required connector JAR files to the worker's 
> classpath; the corresponding log entries are included in the description 
> below. 
> From the log statements and the Spark 1.2.1 codebase, it looks like the JARs 
> get added to urlClassLoader via Executor.scala's updateDependencies method. 
> However, when it is time to execute the task, the executor is not able to 
> resolve the class name. (A diagnostic sketch for checking class visibility 
> from inside a task follows the log excerpt below.)
> ----------------------------
> [task-result-getter-0] WARN org.apache.spark.scheduler.TaskSetManager - Lost 
> task 0.0 in stage 0.0 (TID 0, 127.0.0.1): java.lang.ClassNotFoundException: 
> com.datastax.spark.connector.rdd.partitioner.CassandraPartition
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:274)
> at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:59)
> at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
> at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> ------------------------------
> Log entries indicating the JAR files were added to the worker classpath:
> 15/02/17 16:56:48 INFO Executor: Fetching 
> http://127.0.0.1:64265/jars/spark-cassandra-connector-java_2.10-1.2.0-alpha2.jar
>  with timestamp 1424210185005
> 15/02/17 16:56:48 INFO Utils: Fetching 
> http://127.0.0.1:64265/jars/spark-cassandra-connector-java_2.10-1.2.0-alpha2.jar
>  to 
> C:\Users\sparkus\AppData\Local\Temp\spark-10f5e149-5460-4899-9c8f-b19b19bdaf55\spark-fba24b2b-5847-4b04-848c-90677d12ff99\spark-35f5ed4b-041d-40d8-8854-b243787de188\fetchFileTemp4665176275367448514.tmp
> 15/02/17 16:56:48 DEBUG Utils: fetchFile not using security
> 15/02/17 16:56:48 INFO Utils: Copying 
> C:\Users\sparkus\AppData\Local\Temp\spark-10f5e149-5460-4899-9c8f-b19b19bdaf55\spark-fba24b2b-5847-4b04-848c-90677d12ff99\spark-35f5ed4b-041d-40d8-8854-b243787de188\16215993091424210185005_cache
>  to 
> C:\localapps\spark-1.2.1-bin-hadoop2.4\work\app-20150217165625-0006\0\.\spark-cassandra-connector-java_2.10-1.2.0-alpha2.jar
> 15/02/17 16:56:48 INFO Executor: Adding 
> file:/C:/localapps/spark-1.2.1-bin-hadoop2.4/work/app-20150217165625-0006/0/./spark-cassandra-connector-java_2.10-1.2.0-alpha2.jar
>  to class loader
> 15/02/17 16:56:50 INFO Executor: Fetching 
> http://127.0.0.1:64265/jars/spark-cassandra-connector_2.10-1.2.0-alpha2.jar 
> with timestamp 1424210185012
> 15/02/17 16:56:50 INFO Utils: Fetching 
> http://127.0.0.1:64265/jars/spark-cassandra-connector_2.10-1.2.0-alpha2.jar 
> to 
> C:\Users\sparkus\AppData\Local\Temp\spark-10f5e149-5460-4899-9c8f-b19b19bdaf55\spark-fba24b2b-5847-4b04-848c-90677d12ff99\spark-78373f0b-053b-4c43-bd7c-da733e58ab0d\fetchFileTemp3822867177146190341.tmp
> 15/02/17 16:56:50 DEBUG Utils: fetchFile not using security
> 15/02/17 16:56:50 INFO Utils: Copying 
> C:\Users\sparkus\AppData\Local\Temp\spark-10f5e149-5460-4899-9c8f-b19b19bdaf55\spark-fba24b2b-5847-4b04-848c-90677d12ff99\spark-78373f0b-053b-4c43-bd7c-da733e58ab0d\16318572381424210185012_cache
>  to 
> C:\localapps\spark-1.2.1-bin-hadoop2.4\work\app-20150217165625-0006\0\.\spark-cassandra-connector_2.10-1.2.0-alpha2.jar
> 15/02/17 16:56:50 INFO Executor: Adding 
> file:/C:/localapps/spark-1.2.1-bin-hadoop2.4/work/app-20150217165625-0006/0/./spark-cassandra-connector_2.10-1.2.0-alpha2.jar
>  to class loader
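> As a quick check (an illustrative sketch only; the parallelism, host lookup, 
> and variable names are arbitrary), the following can be run from the driver 
> to report whether each task's context class loader can resolve the connector 
> class:
> ----------------------------
> import java.net.InetAddress
> val target = "com.datastax.spark.connector.rdd.partitioner.CassandraPartition"
> sc.parallelize(1 to 4, 4).map { _ =>
>   // The executor sets the task thread's context class loader to the loader
>   // that received the JARs added via updateDependencies.
>   val loader = Thread.currentThread().getContextClassLoader
>   val visible =
>     try { loader.loadClass(target); "visible" }
>     catch { case _: ClassNotFoundException => "NOT visible" }
>   s"${InetAddress.getLocalHost.getHostName} [$loader]: $target is $visible"
> }.collect().foreach(println)
> ------------------------------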


