You should consider upgrading your Spark from 1.3.0 to a newer release.
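
For example, if you build with sbt, the upgrade is mostly a dependency bump
(1.5.0 is shown only as an illustration; pick the latest stable release and
keep all Spark artifacts on the same version):

    // build.sbt (sketch) -- keep every Spark artifact on the same version
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"            % "1.5.0" % "provided",
      "org.apache.spark" %% "spark-streaming"       % "1.5.0" % "provided",
      "org.apache.spark" %% "spark-streaming-kafka" % "1.5.0"
    )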

Thanks
Best Regards

On Mon, Sep 14, 2015 at 2:28 PM, Priya Ch <learnings.chitt...@gmail.com>
wrote:

> Hi All,
>
>  I came across an old conversation on this issue (
> https://issues.apache.org/jira/browse/SPARK-5594 ). Has it been fixed? I
> tried several values for spark.cleaner.ttl (0s, -1s, 2000s, ...) and none
> of them worked. I also tried setting spark.streaming.unpersist to true.
> What is the possible solution for this? Is this a bug in Spark 1.3.0?
> Would switching the cluster manager from YARN to Standalone or Mesos make
> any difference?
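>
> For reference, this is roughly how I set those properties in the driver
> (only the relevant lines; the values are the ones I tried):
>
>     import org.apache.spark.SparkConf
>
>     val conf = new SparkConf()
>       .set("spark.cleaner.ttl", "2000")           // also tried "0" and "-1"
>       .set("spark.streaming.unpersist", "true")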
>
> Could someone please share their views on this?
>
> On Sat, Sep 12, 2015 at 11:04 PM, Priya Ch <learnings.chitt...@gmail.com>
> wrote:
>
>> Hello All,
>>
>>  When I push messages into Kafka and read them in a streaming
>> application, I see the exception below. I am running the application on
>> YARN and am not broadcasting anything myself within the application; the
>> job simply reads each message, parses it, populates the fields of a
>> class, and prints the DStream (using DStream.print).
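>>
>>  A stripped-down sketch of the job (the topic, ZooKeeper address, and the
>> Record class are simplified stand-ins for the real ones):
>>
>>     import org.apache.spark.SparkConf
>>     import org.apache.spark.streaming.{Seconds, StreamingContext}
>>     import org.apache.spark.streaming.kafka.KafkaUtils
>>
>>     case class Record(id: String, payload: String)  // simplified message class
>>
>>     val conf = new SparkConf().setAppName("KafkaPrintJob")
>>     val ssc  = new StreamingContext(conf, Seconds(10))
>>
>>     // receiver-based Kafka stream of (key, message) pairs
>>     val messages = KafkaUtils.createStream(
>>       ssc, "zk-host:2181", "consumer-group", Map("my-topic" -> 1))
>>
>>     // parse each message and populate the fields of the class
>>     val records = messages.map { case (_, msg) =>
>>       val fields = msg.split(",")
>>       Record(fields(0), fields(1))
>>     }
>>
>>     records.print()            // DStream.print
>>     ssc.start()
>>     ssc.awaitTermination()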
>>
>>  I have no clue whether this is a cluster issue, a Spark version issue,
>> or a node issue. The strange part is that sometimes the messages are
>> processed fine, but at other times I see the exception below:
>>
>> java.io.IOException: org.apache.spark.SparkException: Failed to get
>> broadcast_5_piece0 of broadcast_5
>>         at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1155)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:164)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:87)
>>         at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
>>         at
>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
>>         at org.apache.spark.scheduler.Task.run(Task.scala:64)
>>         at
>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
>>         at
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>         at
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.spark.SparkException: Failed to get
>> broadcast_5_piece0 of broadcast_5
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1$$anonfun$2.apply(TorrentBroadcast.scala:137)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1$$anonfun$2.apply(TorrentBroadcast.scala:137)
>>         at scala.Option.getOrElse(Option.scala:120)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply$mcVI$sp(TorrentBroadcast.scala:136)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:119)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:119)
>>         at scala.collection.immutable.List.foreach(List.scala:318)
>>         at org.apache.spark.broadcast.TorrentBroadcast.org
>> $apache$spark$broadcast$TorrentBroadcast$$readBlocks(TorrentBroadcast.scala:119)
>>         at
>> org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:174)
>>         at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1152)
>>
>>
>> I would be glad if someone could shed some light on this.
>>
>> Thanks,
>> Padma Ch
>>
>>
>
