nvm, figured it out. I compiled my client jar against Spark 2.0.2 while the
Spark deployed on my machines was 2.0.1. Communication problems between the
dev team and the ops team :)
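
For anyone hitting the same thing: the usual fix is to pin the client build
to the exact Spark version running on the cluster, and mark the Spark
dependencies "provided" so the executors' own jars are used at runtime
instead of whatever got bundled into the client jar. A minimal build.sbt
sketch, assuming an sbt build (artifact names are the standard Spark ones):

    // build.sbt -- pin Spark to the cluster's version (2.0.1 here)
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      // "provided" keeps these off the assembly jar; the cluster supplies them
      "org.apache.spark" %% "spark-core" % "2.0.1" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.0.1" % "provided"
    )

You can confirm what the cluster is actually running with
`spark-submit --version` before cutting a build.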

On Fri, Jan 20, 2017 at 3:03 PM, kant kodali <kanth...@gmail.com> wrote:

> Is this because of a versioning issue? Can't wait for the JDK 9 module
> system. I am not sure if Spark plans to leverage it?
>
> On Fri, Jan 20, 2017 at 1:30 PM, kant kodali <kanth...@gmail.com> wrote:
>
>> I get the following exception. I am using Spark 2.0.1 and Scala 2.11.8.
>>
>> org.apache.spark.SparkException: Job aborted due to stage failure: Task
>> 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage
>> 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException:
>> org.apache.spark.executor.TaskMetrics; local class incompatible: stream
>> classdesc serialVersionUID = -2231953621568687904, local class
>> serialVersionUID = -6966587383730940799
>>
>
>
