[ https://issues.apache.org/jira/browse/SPARK-14505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15233863#comment-15233863 ]
The sea commented on SPARK-14505:
---------------------------------

I know that only one SparkContext is supported in the same JVM. But when running the code below, line 4 throws the expected exception and yet the failed constructor still replaces SparkEnv, so line 10 then fails as well. Is this not a problem?

1.  val sc = new SparkContext("spark://master:7077", "app")
2.  println(sc.range(1, 10).reduce(_ + _))
3.  try {
4.    val sc2 = new SparkContext("local", "app") // throws here, but it still changes SparkEnv!
5.    println(sc2.range(1, 10).reduce(_ + _))
6.  } catch {
7.    case e: Exception =>
8.      e.printStackTrace()
9.  }
10. println(sc.range(1, 10).reduce(_ + _))
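As a workaround in application code (a minimal sketch only, it does not fix the underlying problem that a failed constructor mutates SparkEnv), the second context can be obtained with SparkContext.getOrCreate, available since Spark 1.5, so the already-running context is reused and SparkEnv is never replaced:

import org.apache.spark.{SparkConf, SparkContext}

object ReuseContextSketch {
  def main(args: Array[String]): Unit = {
    // First (and only) context for this JVM.
    val sc = SparkContext.getOrCreate(
      new SparkConf().setMaster("spark://master:7077").setAppName("app"))
    println(sc.range(1, 10).reduce(_ + _))

    // Instead of `new SparkContext("local", "app")`, which throws and still
    // swaps SparkEnv, getOrCreate returns the active context (the new conf
    // is ignored), so nothing is broken.
    val sc2 = SparkContext.getOrCreate(
      new SparkConf().setMaster("local").setAppName("app"))
    println(sc2.range(1, 10).reduce(_ + _))

    // The original context can still run tasks.
    println(sc.range(1, 10).reduce(_ + _))
    sc.stop()
  }
}

ReuseContextSketch is just an illustrative name, and the master URLs are the ones from the snippet above.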
> Creating two SparkContext objects in the same JVM, the first one cannot run any tasks
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-14505
>                 URL: https://issues.apache.org/jira/browse/SPARK-14505
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.1
>            Reporter: The sea
>            Priority: Minor
>
> Execute the code below in spark-shell:
>
>   import org.apache.spark.SparkContext
>   val sc = new SparkContext("local", "app")
>   sc.range(1, 10).reduce(_ + _)
>
> The exception is:
>
> 16/04/09 15:40:01 WARN scheduler.TaskSetManager: Lost task 1.0 in stage 1.0 (TID 3, 192.168.172.131): java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
>     at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1222)
>     at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:165)
>     at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
>     at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
>     at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:88)
>     at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
>     at org.apache.spark.scheduler.Task.run(Task.scala:89)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1$$anonfun$2.apply(TorrentBroadcast.scala:138)
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1$$anonfun$2.apply(TorrentBroadcast.scala:138)
>     at scala.Option.getOrElse(Option.scala:120)
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply$mcVI$sp(TorrentBroadcast.scala:137)
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:120)
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:120)
>     at scala.collection.immutable.List.foreach(List.scala:318)
>     at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$readBlocks(TorrentBroadcast.scala:120)
>     at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:175)
>     at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1219)
>     ... 11 more