On Fri, Apr 3, 2015 at 3:53 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:

My Spark Job failed with:

15/04/03 03:15:36 INFO scheduler.DAGScheduler: Job 0 failed: saveAsNewAPIHadoopFile at AbstractInputHelper.scala:103, took 2.480175 s
15/04/03 03:15:36 ERROR yarn.ApplicationMaster: User class threw exception: Job aborted due to stage failure: Task 0.0 in stage 2.0 (TID

On 03-Apr-2015, at 5:06 pm, Akhil Das ak...@sigmoidanalytics.com wrote:

This thread might give you some insights:
http://mail-archives.apache.org/mod_mbox/incubator-spark-user/201311.mbox/%3CCA+WVT8WXbEHac=N0GWxj-s9gqOkgG0VRL5B=ovjwexqm8ev...@mail.gmail.com%3E

Thanks
Best Regards

On 03-Apr-2015, at 5:36 pm, Deepak Jain deepuj...@gmail.com wrote:

I was able to write a record that extends SpecificRecord (Avro); that class was not auto-generated. Do we need to do something extra for auto-generated classes?

Sent from my iPhone

On Fri, Apr 3, 2015 at 5:37 PM, Deepak Jain deepuj...@gmail.com wrote:

I meant that I did not have to use Kryo before. Why will Kryo help fix this issue now?

Sent from my iPhone

Because it is throwing serialization exceptions, and Kryo is a serializer that can serialize your objects.

Thanks
Best Regards
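The Kryo fix being discussed is usually wired in through the job's SparkConf. A minimal sketch, assuming an Avro-generated class — the app name and `MyAvroRecord` below are placeholders, not names from this thread:

```scala
import org.apache.spark.SparkConf

// Switch Spark's object serializer from Java serialization to Kryo
// and register the Avro-generated record classes up front.
val conf = new SparkConf()
  .setAppName("avro-output-job") // placeholder name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // MyAvroRecord stands in for the auto-generated SpecificRecord subclass.
  .registerKryoClasses(Array(classOf[MyAvroRecord]))
```

Setting `spark.kryo.registrationRequired` to `true` as well makes Spark fail fast with the name of any class that still isn't registered, which helps track down exactly which record type is falling back to Java serialization.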
You'll definitely want to use a Kryo-based serializer for Avro. We have a Kryo-based serializer that wraps Avro's efficient serializer here.
Frank Austin Nothaft
fnoth...@berkeley.edu
fnoth...@eecs.berkeley.edu
202-340-0466
On Apr 3, 2015, at 5:41 AM, Akhil Das ak...@sigmoidanalytics.com wrote:
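A Kryo serializer that wraps Avro's binary encoding, along the lines Frank describes, could be sketched like this. This is an illustrative sketch only — the implementation he refers to may differ, and the class name is made up:

```scala
import com.esotericsoftware.kryo.{Kryo, Serializer}
import com.esotericsoftware.kryo.io.{Input, Output}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}
import org.apache.avro.specific.{SpecificDatumReader, SpecificDatumWriter, SpecificRecord}

// Delegates (de)serialization of one Avro SpecificRecord class to Avro's
// compact binary format instead of Kryo's default field-by-field walk.
class AvroKryoSerializer[T <: SpecificRecord](clazz: Class[T]) extends Serializer[T] {
  private val writer = new SpecificDatumWriter[T](clazz)
  private val reader = new SpecificDatumReader[T](clazz)

  override def write(kryo: Kryo, output: Output, record: T): Unit = {
    // Kryo's Output is a java.io.OutputStream; the direct (unbuffered)
    // encoder writes exactly one record's bytes into it.
    val encoder = EncoderFactory.get().directBinaryEncoder(output, null)
    writer.write(record, encoder)
    encoder.flush()
  }

  override def read(kryo: Kryo, input: Input, `type`: Class[T]): T = {
    // Kryo's Input is a java.io.InputStream; the direct decoder avoids
    // read-ahead buffering that would consume bytes past this record.
    val decoder = DecoderFactory.get().directBinaryDecoder(input, null)
    reader.read(null.asInstanceOf[T], decoder)
  }
}
```

It would then be registered per generated class, e.g. `kryo.register(classOf[MyAvroRecord], new AvroKryoSerializer(classOf[MyAvroRecord]))` inside a custom `KryoRegistrator` named by `spark.kryo.registrator`, so Avro records never hit Java serialization at all.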