On Tuesday 28 April 2015 01:39 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:

// `detail` (the app name) and `arguments` (a Map[String, String] of
// command-line options) are defined elsewhere in the driver code.
val conf = new SparkConf()
  .setAppName(detail)
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer.mb", arguments.get("buffersize").get)
  .set("spark.kryoserializer.buffer.max.mb", arguments.get("maxbuffersize").get)
  .set("spark.driver.maxResultSize", arguments.get("maxResultSize").get)
  .registerKryoClasses(Array(classOf[org.apache.accumulo.core.data.Key]))


Can you try this?


On Tue, Apr 28, 2015 at 11:11 AM, madhvi <madhvi.gu...@orkash.com> wrote:

    Hi,

    While connecting to Accumulo from Spark by creating a Spark RDD, I am
    getting the following error:
     object not serializable (class: org.apache.accumulo.core.data.Key)

    This is caused by Accumulo's Key class, which does not implement the
    Serializable interface. How can this be solved so that Accumulo can
    still be used with Spark?

    Thanks
    Madhvi

--
Deepak

Hi Deepak,

The snippet you provided is Scala, but I am working in Java. I am trying the same thing in Java, but could you please explain in detail the parameters you mentioned there, such as 'arguments'?

Thanks
Madhvi
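
For reference, SparkConf is the same class when called from Java, so the Scala snippet above translates almost line for line. A minimal Java sketch follows; the literal buffer sizes, the app name, and the class name AccumuloKryoExample are placeholders rather than values from the original mails (the Scala version reads them from a map of command-line options):

    import org.apache.accumulo.core.data.Key;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class AccumuloKryoExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("accumulo-spark") // placeholder app name
                // Use Kryo instead of the default Java serialization.
                .set("spark.serializer",
                     "org.apache.spark.serializer.KryoSerializer")
                // Placeholder sizes; tune these to your data.
                .set("spark.kryoserializer.buffer.mb", "64")
                .set("spark.kryoserializer.buffer.max.mb", "512")
                .set("spark.driver.maxResultSize", "1g")
                // Register Accumulo's Key so Kryo serializes it without
                // requiring java.io.Serializable.
                .registerKryoClasses(new Class<?>[] { Key.class });

            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... create the Accumulo-backed RDD as before; Key instances
            // are now handled by Kryo when shuffled or collected ...
            sc.stop();
        }
    }

The one Java-specific difference is that registerKryoClasses takes a Class<?>[] array rather than a Scala Array[Class[_]]; the configuration keys themselves are identical.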
