Thank you, Deepak. It worked.
Madhvi
On Tuesday 28 April 2015 01:39 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName(detail)
  // Use Kryo instead of the default Java serialization
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer.mb", arguments.get("buffersize").get)
  .set("spark.kryoserializer.buffer.max.mb", arguments.get("maxbuffersize").get)
  .set("spark.driver.maxResultSize", arguments.get("maxResultSize").get)
  // Register Accumulo's Key class so Kryo can serialize it
  .registerKryoClasses(Array(classOf[org.apache.accumulo.core.data.Key]))


Can you try this?
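
For completeness, a minimal self-contained sketch of the same configuration; the app name, buffer sizes, and result-size cap below are hypothetical placeholders for the values your arguments map supplies:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.accumulo.core.data.{Key, Value}

val conf = new SparkConf()
  .setAppName("accumulo-kryo")                      // hypothetical app name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer.mb", "64")      // placeholder buffer size
  .set("spark.kryoserializer.buffer.max.mb", "512") // placeholder max buffer size
  .set("spark.driver.maxResultSize", "2g")          // placeholder result-size cap
  // Register every Accumulo class that ends up in your RDDs; Value is
  // included here on the assumption that you read (Key, Value) pairs.
  .registerKryoClasses(Array(classOf[Key], classOf[Value]))

val sc = new SparkContext(conf)

Kryo does not require registered classes to implement java.io.Serializable, so RDDs containing Key can then be shuffled and collected.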


On Tue, Apr 28, 2015 at 11:11 AM, madhvi <madhvi.gu...@orkash.com> wrote:

    Hi,

    While connecting to Accumulo from Spark by creating a Spark RDD, I am
    getting the following error:
     object not serializable (class: org.apache.accumulo.core.data.Key)

    This is caused by Accumulo's Key class, which does not implement the
    Serializable interface. How can this be solved so that Accumulo can
    still be used with Spark?
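
    For example, a hypothetical minimal reproduction: any job step that has
    to serialize Key instances with Spark's default Java serializer fails
    with exactly this error, because Key is not java.io.Serializable.

        import org.apache.hadoop.io.Text
        import org.apache.spark.{SparkConf, SparkContext}
        import org.apache.accumulo.core.data.Key

        val sc = new SparkContext(new SparkConf().setAppName("key-repro")) // hypothetical app name
        // Parallelizing Key objects forces Spark to serialize them when
        // tasks are shipped, which raises the error above.
        val keys = sc.parallelize(Seq(new Key(new Text("row1")),
                                      new Key(new Text("row2"))))
        keys.collect()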

    Thanks
    Madhvi


--
Deepak

