[ https://issues.apache.org/jira/browse/SPARK-17816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15627622#comment-15627622 ]

Jonathan Alvarado commented on SPARK-17816:
-------------------------------------------

The title of this bug says that "accumulators" are failing with 
ConcurrentModificationException. However, the stack trace shows the issue 
arising from event-log reporting, which I understand to be related to event 
logging for the UI. I'm hitting this error during my job, and I need 
accumulators to work correctly for my Spark job to operate properly. Can I 
assume that I can disregard this error, at the cost of losing some logging for 
the UI? Or are the accumulators themselves not working correctly?
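For context, the failure mode visible in the trace can be sketched outside
Spark: the listener-bus thread iterates a `java.util.List` of accumulator
values (via `JsonProtocol.accumValueToJson`'s `toList`) while another thread
appends to it, and `ArrayList`'s fail-fast iterator throws. The sketch below
is illustrative, not Spark's actual code; the class and method names
(`CmeSketch`, `reproduces`, `snapshot`) are hypothetical, and the "snapshot"
mitigation is only in the spirit of the eventual fix.

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.Iterator;
import java.util.List;

public class CmeSketch {

    // Returns true if mutating the list mid-iteration triggers the
    // exception, mimicking a task thread appending accumulator values
    // while the event-log serializer is iterating them.
    static boolean reproduces() {
        List<Long> accValues = new ArrayList<>(List.of(1L, 2L));
        Iterator<Long> it = accValues.iterator();
        it.next();
        accValues.add(3L);  // structural modification while an iterator is live
        try {
            it.next();      // fail-fast check fires here
            return false;
        } catch (ConcurrentModificationException e) {
            return true;
        }
    }

    // Mitigation sketch: copy the live list first, then iterate the
    // private copy, so concurrent appends cannot disturb serialization.
    static List<Long> snapshot(List<Long> live) {
        return new ArrayList<>(live);
    }

    public static void main(String[] args) {
        System.out.println(reproduces());               // true
        System.out.println(snapshot(List.of(1L, 2L)));  // [1, 2]
    }
}
```

Note the repro is single-threaded: any structural modification between
`iterator()` and `next()` trips the same check, which is why the error is a
race in Spark but deterministic here.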


> JSON serialization of accumulators is failing with 
> ConcurrentModificationException
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-17816
>                 URL: https://issues.apache.org/jira/browse/SPARK-17816
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.1, 2.1.0
>            Reporter: Ergin Seyfe
>            Assignee: Ergin Seyfe
>             Fix For: 2.0.2, 2.1.0
>
>
> This is the stack trace showing the {{ConcurrentModificationException}}:
> {code}
> java.util.ConcurrentModificationException
> at java.util.ArrayList$Itr.checkForComodification(ArrayList.java:901)
> at java.util.ArrayList$Itr.next(ArrayList.java:851)
> at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
> at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:183)
> at scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:45)
> at scala.collection.TraversableLike$class.to(TraversableLike.scala:590)
> at scala.collection.AbstractTraversable.to(Traversable.scala:104)
> at scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:294)
> at scala.collection.AbstractTraversable.toList(Traversable.scala:104)
> at org.apache.spark.util.JsonProtocol$.accumValueToJson(JsonProtocol.scala:314)
> at org.apache.spark.util.JsonProtocol$$anonfun$accumulableInfoToJson$5.apply(JsonProtocol.scala:291)
> at org.apache.spark.util.JsonProtocol$$anonfun$accumulableInfoToJson$5.apply(JsonProtocol.scala:291)
> at scala.Option.map(Option.scala:146)
> at org.apache.spark.util.JsonProtocol$.accumulableInfoToJson(JsonProtocol.scala:291)
> at org.apache.spark.util.JsonProtocol$$anonfun$taskInfoToJson$12.apply(JsonProtocol.scala:283)
> at org.apache.spark.util.JsonProtocol$$anonfun$taskInfoToJson$12.apply(JsonProtocol.scala:283)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
> at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
> at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
> at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> at org.apache.spark.util.JsonProtocol$.taskInfoToJson(JsonProtocol.scala:283)
> at org.apache.spark.util.JsonProtocol$.taskEndToJson(JsonProtocol.scala:145)
> at org.apache.spark.util.JsonProtocol$.sparkEventToJson(JsonProtocol.scala:76)
> at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:137)
> at org.apache.spark.scheduler.EventLoggingListener.onTaskEnd(EventLoggingListener.scala:157)
> at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:45)
> at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:35)
> at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:35)
> at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
> at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:35)
> at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:81)
> at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
> at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:66)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:65)
> at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1244)
> at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:64)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
