[ https://issues.apache.org/jira/browse/SPARK-4810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14245660#comment-14245660 ]

Josh Rosen commented on SPARK-4810:
-----------------------------------

Can you update this ticket with additional information to help us debug?  Which 
Spark version are you using?  Which deployment mode?
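
For context, a minimal sketch of the kind of job shape described in the report: a broadcast variable read inside tasks whose results are returned with collect(). The names and data here are hypothetical (this is not the reporter's code at StatVideoService.scala:62); it only illustrates where broadcast pieces like the broadcast_4_piece0 in the log get fetched.

    import org.apache.spark.{SparkConf, SparkContext}

    object BroadcastCollectSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("broadcast-collect-sketch")
        val sc = new SparkContext(conf)

        // Hypothetical lookup table shipped to executors as a Spark broadcast;
        // TorrentBroadcast stores it in pieces named broadcast_<id>_piece<n>.
        val lookup = sc.broadcast(Map("a" -> 1, "b" -> 2))

        // Tasks that read lookup.value need the broadcast pieces on the executor;
        // a "Failed to get broadcast_N_pieceM" error means an executor could not
        // fetch such a piece while deserializing or running the task.
        val result = sc.parallelize(Seq("a", "b", "c"))
          .map(k => k -> lookup.value.getOrElse(k, 0))
          .collect()

        result.foreach(println)
        sc.stop()
      }
    }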

> Failed to run collect
> ---------------------
>
>                 Key: SPARK-4810
>                 URL: https://issues.apache.org/jira/browse/SPARK-4810
>             Project: Spark
>          Issue Type: Question
>            Reporter: newjunwei
>
> My application failed as shown below. I want to know the possible reason; could
> insufficient memory cause this? There is no problem when running in local mode
> or when running on a smaller dataset.
> 2014-12-09 21:51:47,830 WARN org.apache.spark.Logging$class.logWarning(Logging.scala:71) - Lost task 60.1 in stage 1.1 (TID 566, server-21): java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_4_piece0 of broadcast_4
>         org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:930)
>         org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:155)
>         sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
>         sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         java.lang.reflect.Method.invoke(Method.java:597)
>         java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
>         java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
>         java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
>         java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
>         java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
>         java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
>         java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
>         org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
>         org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:160)
>         java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>         java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>         java.lang.Thread.run(Thread.java:662)
> 2014-12-09 21:51:49,460 INFO org.apache.spark.Logging$class.logInfo(Logging.scala:59) - Starting task 60.2 in stage 1.1 (TID 603, server-11, PROCESS_LOCAL, 1295 bytes)
> 2014-12-09 21:51:49,461 INFO org.apache.spark.Logging$class.logInfo(Logging.scala:59) - Lost task 9.3 in stage 1.1 (TID 579) on executor server-11: java.io.IOException (org.apache.spark.SparkException: Failed to get broadcast_4_piece0 of broadcast_4) [duplicate 1]
> 2014-12-09 21:51:49,487 ERROR org.apache.spark.Logging$class.logError(Logging.scala:75) - Task 9 in stage 1.1 failed 4 times; aborting job
> 2014-12-09 21:51:49,494 INFO org.apache.spark.Logging$class.logInfo(Logging.scala:59) - Cancelling stage 1
> 2014-12-09 21:51:49,498 INFO org.apache.spark.Logging$class.logInfo(Logging.scala:59) - Stage 1 was cancelled
> 2014-12-09 21:51:49,511 INFO org.apache.spark.Logging$class.logInfo(Logging.scala:59) - Failed to run collect at StatVideoService.scala:62


