My hunch is that you changed spark.serializer to Kryo but left
spark.closureSerializer unmodified, so closures are still being serialized
with the default Java serializer. Kryo doesn't currently work as a closure
serializer, but there's an open pull request to fix this:
https://github.com/apache/spark/pull/6361
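
For reference, here's a minimal sketch of what I mean (the object name, app
name, and file path below are made up for illustration):

import org.apache.spark.{SparkConf, SparkContext}

// Switching spark.serializer to Kryo only changes how *data* (shuffled
// records, cached RDDs) is serialized. Closures shipped to executors go
// through spark.closureSerializer, a separate setting that still defaults
// to the Java serializer, which is why JavaSerializationStream shows up
// in your stack trace.
val conf = new SparkConf()
  .setAppName("kryo-closure-demo") // hypothetical app name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
val sc = new SparkContext(conf)

// Until closures can go through Kryo, the usual workaround is to make
// whatever the closure captures implement java.io.Serializable:
object LineProcessor extends Serializable {
  def process(line: String): String = line.toUpperCase
}

sc.textFile("lines.txt").map(LineProcessor.process).count() // made-up path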

On Mon, Jun 22, 2015 at 5:42 AM, Sean Barzilay <sesnbarzi...@gmail.com>
wrote:

> My program is written in Scala. I am creating a jar and submitting it
> using spark-submit.
> My code is on a computer in an internal network with no internet, so I
> can't send it.
>
> On Mon, Jun 22, 2015, 3:19 PM Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> How are you submitting the application? Could you paste the code that you
>> are running?
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Jun 22, 2015 at 5:37 PM, Sean Barzilay <sesnbarzi...@gmail.com>
>> wrote:
>>
>>> I am trying to run a function on every line of a Parquet file. The
>>> function is in an object. When I run the program, I get an exception that
>>> the object is not serializable. I read around online and found that I
>>> should use the Kryo serializer. I changed the setting in the Spark conf and
>>> registered the object with the Kryo serializer. When I run the program I
>>> still get the same exception (from the stack trace: "at
>>> org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)").
>>> For some reason, the program is still trying to serialize using the
>>> default Java serializer. I am working with Spark 1.4.
>>>
>>
>>
