Hi Akhil,

I've done this for the classes which are in my scope. But what should I do
with classes that are out of my scope?
For example, org.apache.hadoop.io.Text.

Also, I'm using several 3rd-party libraries like "jeval".
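
In case it helps, this is roughly the direction I'm experimenting with (only
a sketch; the paths are placeholders, and I'm assuming jeval's
net.sourceforge.jeval.Evaluator and its evaluate(String) method, which I may
be misusing):

import net.sourceforge.jeval.Evaluator
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapred.TextInputFormat

// Convert each Text to a String straight away, so the non-serializable
// Text objects are never captured by a closure or shipped around.
val lines = sc.hadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///data/input")
  .map { case (_, value) => value.toString }

// Create the jeval Evaluator inside mapPartitions, i.e. on the executor
// (once per partition), so Spark never has to serialize it at all.
val results = lines.mapPartitions { iter =>
  val evaluator = new Evaluator()
  iter.map(line => evaluator.evaluate(line))
}

Would something like this work, or is there a better way?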

~Sarath


On Fri, Sep 5, 2014 at 7:40 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> You can bring those classes out of the library and make them Serializable
> (i.e. implement Serializable). It is not the right way of doing it, though
> it solved a few of my similar problems.
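>
> If the class isn't final you can sometimes get the same effect from Scala
> without copying any source (a sketch of the same hack; SerializableText is
> just a name I made up):
>
> import org.apache.hadoop.io.Text
>
> // Subclass the library type and mix in Serializable so the closure
> // serializer accepts it. Caveat: Java serialization only writes the
> // fields of Serializable classes, so any state held inside Text itself
> // is lost on the way to the executor.
> class SerializableText extends Text with Serializable
>
> For final classes you really do have to copy the source out of the jar and
> add Serializable there.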
>
> Thanks
> Best Regards
>
>
> On Fri, Sep 5, 2014 at 7:36 PM, Sarath Chandra <
> sarathchandra.jos...@algofusiontech.com> wrote:
>
>> Hi,
>>
>> I'm trying to migrate a map-reduce program to work with Spark; I migrated
>> the program from Java to Scala. The map-reduce program basically loads an
>> HDFS file and, for each line in the file, applies several transformation
>> functions available in various external libraries.
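>>
>> The shape of the code is roughly this (simplified; ExternalTransform is
>> just a stand-in I'm defining here for the actual library classes):
>>
>> // Stand-in for a 3rd-party class that does not implement
>> // java.io.Serializable.
>> class ExternalTransform { def transform(line: String): String = line }
>>
>> val transformer = new ExternalTransform()  // created on the driver
>>
>> sc.textFile("hdfs:///input/file")
>>   .map(line => transformer.transform(line))  // captures 'transformer'
>>   .saveAsTextFile("hdfs:///output")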
>>
>> When I execute this over Spark, it throws "Task not serializable"
>> exceptions for each and every class used from these external libraries. I
>> added serialization to a few classes which are in my scope, but there are
>> several other classes which are out of my scope, like
>> org.apache.hadoop.io.Text.
>>
>> How do I overcome these exceptions?
>>
>> ~Sarath.
>>
>
>
