So I installed Spark 1.3.1 built with Hadoop 2.6 on each of the slaves; I just grabbed the pre-built package from the Spark website…

I placed those pre-built Spark installs on each slave at /opt/spark.

My spark properties seem to be getting picked up on my side fine…

[inline screenshot: my Spark properties]
The framework registers in Mesos and shows up just fine, and it doesn't matter whether I turn the executor URI on or off; I always get the same error…
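For context, this is roughly what the Mesos-related bits of my conf/spark-defaults.conf look like (the master hostname and tarball URL below are placeholders, not my real values):

```
# conf/spark-defaults.conf on the driver -- hostnames/URLs are placeholders
spark.master          mesos://mesos-master.example.com:5050
# Option A: let each Mesos slave fetch and unpack a Spark tarball itself
spark.executor.uri    http://repo.example.com/spark-1.3.1-bin-hadoop2.6.tgz
# Option B: drop the URI and rely on the same install path (/opt/spark)
#           already existing on every slave
```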

org.apache.spark.SparkException: Job aborted due to stage failure: Task 6 in stage 0.0 failed 4 times, most recent failure: Lost task 6.3 in stage 0.0 (TID 23, 10.253.1.117): ExecutorLostFailure (executor 20150424-104711-1375862026-5050-20113-S1 lost)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1204)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1193)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1192)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

These boxes are totally open to one another, so there shouldn't be any firewall issues. Everything shows up in Mesos and Spark just fine, but actually running anything totally blows up.

There is nothing in stderr or stdout; the executor downloads the package and untars it, but doesn't seem to do much after that. Any insights?
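In case it helps, these are the kinds of commands I'm running on a slave to dig through the executor sandbox (the work_dir below is a common Mesos default and may well differ on your install; the glob layout is slave-id/framework-id/executor-id):

```shell
# On the affected slave; /tmp/mesos is the stock work_dir, yours may differ
ls -t /tmp/mesos/slaves/*/frameworks/*/executors/*/runs/latest/
cat   /tmp/mesos/slaves/*/frameworks/*/executors/*/runs/latest/stderr
```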

Steve


On Apr 24, 2015, at 5:50 PM, Yang Lei <genia...@gmail.com> wrote:

SPARK_PUBLIC_DNS, SPARK_LOCAL_IP, SPARK_LOCAL_HOSTNAME
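[A minimal sketch of where these go, assuming conf/spark-env.sh on the driver; the IP and hostnames below are placeholders:]

```
# conf/spark-env.sh on the driver -- values are placeholders
export SPARK_LOCAL_IP=10.0.0.5                # address the driver binds to
export SPARK_PUBLIC_DNS=driver.example.com    # hostname advertised to executors
export SPARK_LOCAL_HOSTNAME=driver.example.com
```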

