I did not encounter this with my Avro records using Spark 1.1.0 (see https://github.com/medale/spark-mail/blob/master/analytics/src/main/scala/com/uebercomputing/analytics/basic/UniqueSenderCounter.scala).

I do use the default Java serialization, but all the fields in my Avro object are Serializable (no bytes/ByteBuffer). Does your Avro schema use bytes? If so, it seems those are wrapped in a ByteBuffer, which is not Serializable. A quick search turned up a fix here:

https://groups.google.com/forum/#!topic/spark-users/6HQPuxsCe0c
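To illustrate what I mean (this is just my own sketch, not the fix from that thread — the class and method names are made up for the example): a heap ByteBuffer does not implement Serializable, so default Java serialization fails on it, while a plain byte[] copy of the same data serializes fine.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.nio.ByteBuffer;

public class ByteBufferWorkaround {

    // Returns true if the value survives default Java serialization.
    static boolean isJavaSerializable(Object value) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(value);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // Copy a ByteBuffer's contents into a plain byte[], which Java
    // serialization does handle.
    static byte[] toByteArray(ByteBuffer buf) {
        ByteBuffer copy = buf.duplicate(); // leave the original position untouched
        byte[] arr = new byte[copy.remaining()];
        copy.get(arr);
        return arr;
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.wrap("payload".getBytes());
        System.out.println(isJavaSerializable(buf));              // false
        System.out.println(isJavaSerializable(toByteArray(buf))); // true
    }
}
```

So one workaround is to pull the bytes out of the Avro record into byte arrays (or a Serializable wrapper) before the record crosses a shuffle boundary; the other common route is switching Spark to Kryo serialization.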

Hope this helps,
Markus

On 12/17/2014 08:14 PM, touchdown wrote:
Yeah, I have the same problem with 1.1.0, but not 1.0.0.

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-avro-mapred-AvroKey-using-spark-with-avro-tp15165p20752.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



