Could you tell me exactly what you did to change the version of Spark?

Can you fire up a spark-shell, run the following line, and see what happens:

sc.parallelize(1 to 10000).collect()
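In case it helps, this kind of failure is usually a mismatch between the Spark version in the project's build file and the one running on the cluster. A minimal sketch of what the build file should look like (assuming an sbt build; the version strings here are only illustrative and must match your cluster):

```scala
// build.sbt sketch -- the Spark version here must match the cluster's
scalaVersion := "2.10.4"

// "provided" keeps the cluster's own Spark jars from clashing with any
// copies bundled into your application jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" % "provided"
```

After changing it, rebuild the application jar from scratch (e.g. sbt clean package) so no stale 1.1.0 classes are left over.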


Thanks
Best Regards

On Mon, Mar 16, 2015 at 11:13 PM, Eason Hu <eas...@gmail.com> wrote:

> Hi Akhil,
>
> Yes, I did change both versions on the project and the cluster.  Any clues?
>
> Even the sample code from Spark website failed to work.
>
> Thanks,
> Eason
>
> On Sun, Mar 15, 2015 at 11:56 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Did you change both the versions? The one in your build file of your
>> project and the spark version of your cluster?
>>
>> Thanks
>> Best Regards
>>
>> On Sat, Mar 14, 2015 at 6:47 AM, EH <eas...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I've been using Spark 1.1.0 for a while, and now would like to upgrade to
>>> Spark 1.1.1 or above.  However, it throws the following errors:
>>>
>>> 18:05:31.522 [sparkDriver-akka.actor.default-dispatcher-3] ERROR
>>> TaskSchedulerImpl - Lost executor 37 on hcompute001: remote Akka client
>>> disassociated
>>> 18:05:31.530 [sparkDriver-akka.actor.default-dispatcher-3] WARN
>>> TaskSetManager - Lost task 0.0 in stage 1.0 (TID 0, hcompute001):
>>> ExecutorLostFailure (executor lost)
>>> 18:05:31.567 [sparkDriver-akka.actor.default-dispatcher-2] ERROR
>>> TaskSchedulerImpl - Lost executor 3 on hcompute001: remote Akka client
>>> disassociated
>>> 18:05:31.568 [sparkDriver-akka.actor.default-dispatcher-2] WARN
>>> TaskSetManager - Lost task 1.0 in stage 1.0 (TID 1, hcompute001):
>>> ExecutorLostFailure (executor lost)
>>> 18:05:31.988 [sparkDriver-akka.actor.default-dispatcher-23] ERROR
>>> TaskSchedulerImpl - Lost executor 24 on hcompute001: remote Akka client
>>> disassociated
>>>
>>> Do you know what may have gone wrong?  I didn't change any code, just
>>> the version of Spark.
>>>
>>> Thank you all,
>>> Eason
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Upgrade-from-Spark-1-1-0-to-1-1-1-Issues-tp22045.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
