news20.binary's feature dimension is 1.35M, so the serialized task
size exceeds the default limit of 10M. You need to set
spark.akka.frameSize to, e.g., 20. Due to a bug (SPARK-1112), this
parameter is not passed to the executors automatically, which causes
Spark to freeze. This was fixed in the latest master and in
v1.0.1-rc2. If you rebuild Spark, remember to sync the assembly jar
to the workers. -Xiangrui
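
For reference, a minimal sketch of setting this option programmatically
before the SparkContext is created (the app name, data path, and
iteration count below are placeholders, not from this thread):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.classification.SVMWithSGD
    import org.apache.spark.mllib.util.MLUtils

    // Raise the Akka frame size (in MB) above the default 10 so that
    // tasks carrying the 1.35M-dimensional weight vector fit in one frame.
    val conf = new SparkConf()
      .setAppName("news20-svm")          // placeholder app name
      .set("spark.akka.frameSize", "20")
    val sc = new SparkContext(conf)

    // Placeholder path; news20.binary is in LIBSVM format.
    val data = MLUtils.loadLibSVMFile(sc, "hdfs:///data/news20.binary")
    val model = SVMWithSGD.train(data, 100)  // 100 iterations, illustrative

Setting the value on the SparkConf before the context is created makes
it visible to the driver; on builds affected by SPARK-1112 it still
will not propagate to the executors, hence the upgrade advice above.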

On Thu, Jul 10, 2014 at 7:56 AM, AlexanderRiggers
<alexander.rigg...@gmail.com> wrote:
> Tried the newest branch, but it still gets stuck on the same task: (kill) runJob
> at SlidingRDD.scala:74
