[ https://issues.apache.org/jira/browse/SPARK-7513?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537736#comment-14537736 ]
Sean Owen commented on SPARK-7513:
----------------------------------

No, it clearly shows this is the problem:
{{java.io.NotSerializableException: com.datastax.driver.core.UDTValue}}

> Spark connecting to Cassandra throws NotSerializableException
> --------------------------------------------------------------
>
>                 Key: SPARK-7513
>                 URL: https://issues.apache.org/jira/browse/SPARK-7513
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>            Reporter: madheswaran
>
> Connecting to Cassandra version 2.1.2:
>
> UDT:
> CREATE TYPE fieldmap (
>   key text,
>   value text
> );
>
> Column family:
> CREATE TABLE quantity (
>   device uuid,
>   date timestamp,
>   inserttime timestamp,
>   customtag text,
>   data text,
>   devicetime timestamp,
>   extension list<frozen<fieldmap>>,
>   systemtag text,
>   PRIMARY KEY ((device, date), inserttime)
> ) WITH CLUSTERING ORDER BY (inserttime ASC);
>
> java.io.NotSerializableException: com.datastax.driver.core.UDTValue
> Serialization stack:
>   at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:38)
>   at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
>   at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:80)
>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>   at java.lang.Thread.run(Thread.java:745)
>
> 15/05/10 09:57:39 ERROR TaskSetManager: Task 0.0 in stage 1.0 (TID 1) had a not serializable result: com.datastax.driver.core.UDTValue
> Serialization stack:

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
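The exception quoted above occurs because {{com.datastax.driver.core.UDTValue}} does not implement {{java.io.Serializable}}, so Spark's default JavaSerializer fails when a task tries to ship such a value back to the driver. A minimal sketch of the mechanism (here {{FakeUdtValue}} is a hypothetical stand-in for the driver class, not the real API): a non-Serializable object fails Java serialization, while a plain Serializable POJO carrying the same fields succeeds. The usual workaround is to map each UDT value to such a POJO (or to plain Strings) inside the task, before any collect/shuffle forces serialization.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.io.UncheckedIOException;

public class SerializationDemo {

    // Hypothetical stand-in for com.datastax.driver.core.UDTValue:
    // note it does NOT implement Serializable.
    static class FakeUdtValue {
        final String key, value;
        FakeUdtValue(String key, String value) { this.key = key; this.value = value; }
    }

    // Serializable POJO to map UDT values into before returning rows from a task.
    static class FieldMap implements Serializable {
        final String key, value;
        FieldMap(String key, String value) { this.key = key; this.value = value; }
    }

    // True iff the object survives Java serialization (what JavaSerializer does).
    static boolean canJavaSerialize(Object o) {
        try {
            ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream());
            out.writeObject(o);
            out.close();
            return true;
        } catch (NotSerializableException e) {
            return false;  // same failure mode as the stack trace in this issue
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(canJavaSerialize(new FakeUdtValue("k", "v"))); // prints false
        System.out.println(canJavaSerialize(new FieldMap("k", "v")));     // prints true
    }
}
```

In Spark terms: transform the RDD with something like {{rows.map(row -> new FieldMap(...))}} so that only Serializable objects cross the executor/driver boundary; alternatively, switch to Kryo serialization, which does not require {{Serializable}}.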