On Thu, Jan 28, 2016 at 1:38 PM, Jason Plurad wrote:
I've searched through the mailing list archive. It seems that if you try to
run, for example, a Spark 1.5.2 program against a Spark 1.5.1 standalone
server, you will run into an exception like this:
WARN org.apache.spark.scheduler.TaskSetManager - Lost task 0.0 in stage 0.0 (TID 0, 192.168.14.10
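One common way to avoid this kind of mismatch (a sketch, assuming an sbt build and a 1.5.1 standalone cluster as in the example above) is to pin the driver application's Spark dependency to the exact version the cluster runs, and mark it "provided" so the cluster's own jars are used at runtime instead of a bundled copy:

```scala
// build.sbt fragment (illustrative): keep the compile-time Spark version
// identical to the standalone master/workers (1.5.1 here), and use the
// "provided" scope so spark-submit supplies the runtime jars from the cluster.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" % "provided"
```

With the versions aligned, the serialized task classes the driver ships match what the executors expect, which is the usual cause of the TaskSetManager lost-task warnings shown above.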