Hi Josh,

We solved it by using the Kryo serializer library to serialize the Key class.

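In case it helps anyone else hitting this, here is a minimal sketch of the kind of configuration we used (assumes Spark 1.2+ for registerKryoClasses; the app name and setup details are illustrative, not our exact code):

import org.apache.spark.{SparkConf, SparkContext}

// Switch Spark over to the Kryo serializer and register Accumulo's Key and
// Value classes so that shuffles and caching can serialize them even though
// they do not implement java.io.Serializable.
val conf = new SparkConf()
  .setAppName("accumulo-spark-example") // hypothetical app name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(
    classOf[org.apacheache.accumulo.core.data.Key],
    classOf[org.apache.accumulo.core.data.Value]))

val sc = new SparkContext(conf)

(Correction to the snippet above: the Key class is org.apache.accumulo.core.data.Key.) Kryo's default field serializer handles these classes because they have no-arg constructors, so no custom serializer was needed in our case.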
Thanks,
Vaibhav
On 28-Apr-2015 11:14 pm, "Josh Elser" <josh.el...@gmail.com> wrote:

> Hi Madhvi,
>
> Thanks for posting this. I'm not super familiar, but my hunch is that
> Spark requires objects that it works with to implement the Java
> Serializable interface.
>
> Accumulo deals with Key (and Value) through Hadoop's Writable interface
> (technically WritableComparable, but still stems from Writable). I'm not
> sure if there's a way that you can inform Spark to use the Writable
> interface methods instead of Serializable.
>
> If there isn't a way to do this, I don't see any reason why we couldn't
> make Key (and Value) also implement Serializable for this use case. Please
> open an issue on JIRA for this -- we can track the investigation there.
>
> Thanks!
>
> - Josh
>
> Madhvi wrote:
>
>> Hi,
>>
>> While connecting to Accumulo through Spark by creating a Spark RDD, I am
>> getting the following error:
>> object not serializable (class: org.apache.accumulo.core.data.Key)
>>
>> This is due to Accumulo's Key class, which does not implement the
>> Serializable interface. How can this be solved so that Accumulo can be
>> used with Spark?
>>
>> Thanks
>> Madhvi
>>
>
