Hi Madhvi,
Thanks for posting this. I'm not super familiar, but my hunch is that
Spark requires objects that it works with to implement the Java
Serializable interface.
Accumulo deals with Key (and Value) through Hadoop's Writable interface
(technically WritableComparable, but that still stems from Writable),
which does not extend Serializable.
Hi Josh,
We solved it by using the Kryo serializer library to serialize the Key class.
Thanks,
Vaibhav
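For reference, a minimal sketch of the Kryo approach described above, using Spark's built-in Kryo support. The app name is a placeholder; the `spark.serializer` setting and `registerKryoClasses` are standard Spark configuration, and `Key`/`Value` are Accumulo's classes from the error message:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.accumulo.core.data.{Key, Value}

// Switch Spark from Java serialization to Kryo, and register Accumulo's
// Key and Value classes so Kryo serializes them efficiently. Kryo does
// not require the classes to implement java.io.Serializable, which is
// what makes the "object not serializable" error go away.
val conf = new SparkConf()
  .setAppName("accumulo-spark-example") // placeholder name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(classOf[Key], classOf[Value]))

val sc = new SparkContext(conf)
```

Registering the classes is optional with Kryo but avoids writing full class names into every serialized record.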
On 28-Apr-2015 11:14 pm, Josh Elser josh.el...@gmail.com wrote:
Hi Madhvi,
Thanks for posting this. I'm not super familiar, but my hunch is that
Spark requires objects that it works with to implement the Java
Serializable interface. [...]
Hi,
While connecting to Accumulo through Spark by making a Spark RDD, I am
getting the following error:
object not serializable (class: org.apache.accumulo.core.data.Key)
This is due to Accumulo's Key class, which does not implement the
Serializable interface. How can it be solved, and Accumulo [...]