[ https://issues.apache.org/jira/browse/SPARK-6980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14512575#comment-14512575 ]
Harsh Gupta commented on SPARK-6980:
------------------------------------

[~imranr] [~bryanc] Hi. I tried a simple example of an actor producer and consumer, set the timeout very low, and was able to see the exception. I am not yet clear on how the util methods in SparkConf would get a NamedDuration, although the wrapper approach sounds fine. I will do some more tweaks and post here (though I won't be very active this week, since I need to get my primary laptop fixed).

> Akka timeout exceptions indicate which conf controls them
> ---------------------------------------------------------
>
>                 Key: SPARK-6980
>                 URL: https://issues.apache.org/jira/browse/SPARK-6980
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Imran Rashid
>            Assignee: Harsh Gupta
>            Priority: Minor
>              Labels: starter
>         Attachments: Spark-6980-Test.scala
>
> If you hit one of the akka timeouts, you just get an exception like
> {code}
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
> {code}
> The exception doesn't indicate how to change the timeout, though there is usually (always?) a corresponding setting in {{SparkConf}}. It would be nice if the exception included the relevant setting.
> I think this should be pretty easy to do -- we just need to create something like a {{NamedTimeout}}. It would have its own {{await}} method, catch the akka timeout, and throw its own exception.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
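The {{NamedTimeout}} idea described in the issue could be sketched roughly as below. This is only a minimal illustration using the Scala standard library's {{Await}} (no Akka dependency): a duration is paired with the conf key that controls it, and its own {{await}} catches the underlying {{TimeoutException}} and rethrows with the controlling setting named. The class name, field names, and the conf key in the usage example are hypothetical, not Spark's actual API.

```scala
import java.util.concurrent.TimeoutException
import scala.concurrent.{Await, Awaitable, Promise}
import scala.concurrent.duration._

// Hypothetical sketch: a timeout that knows which SparkConf key controls it.
case class NamedTimeout(duration: FiniteDuration, confKey: String) {
  // Wraps Await.result; on timeout, rethrows with the conf key appended so
  // the user knows which setting to increase.
  def await[T](awaitable: Awaitable[T]): T =
    try {
      Await.result(awaitable, duration)
    } catch {
      case te: TimeoutException =>
        throw new TimeoutException(
          s"${te.getMessage}. This timeout is controlled by $confKey")
    }
}

object NamedTimeoutDemo {
  def main(args: Array[String]): Unit = {
    // Illustrative conf key only; a never-completed future forces a timeout.
    val timeout = NamedTimeout(50.millis, "spark.example.askTimeout")
    val pending = Promise[Int]()
    try {
      timeout.await(pending.future)
    } catch {
      case e: TimeoutException => println(e.getMessage)
    }
  }
}
```

The message then reads along the lines of "Futures timed out after [50 milliseconds]. This timeout is controlled by spark.example.askTimeout", which points the user straight at the setting to change.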