[ https://issues.apache.org/jira/browse/SPARK-6980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14529435#comment-14529435 ]
Harsh Gupta commented on SPARK-6980:
------------------------------------

[~irashid] [~bryanc] Guys, I am having trouble importing my Spark project into IntelliJ IDEA. I know it sounds silly, because there is a Confluence page that explains how to set up the project, and there are lots of resolutions for this on other platforms. I tried many of them but each ended in some new problem. Can you please help me here, as it is a blocker for me to move forward? FYI, I am currently stuck on a HiveShim error.

> Akka timeout exceptions indicate which conf controls them
> ---------------------------------------------------------
>
>                 Key: SPARK-6980
>                 URL: https://issues.apache.org/jira/browse/SPARK-6980
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Imran Rashid
>            Assignee: Harsh Gupta
>            Priority: Minor
>              Labels: starter
>         Attachments: Spark-6980-Test.scala
>
>
> If you hit one of the akka timeouts, you just get an exception like
> {code}
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
> {code}
> The exception doesn't indicate how to change the timeout, though there is usually (always?) a corresponding setting in {{SparkConf}}. It would be nice if the exception included the relevant setting.
> I think this should be pretty easy to do -- we just need to create something like a {{NamedTimeout}}. It would have its own {{await}} method, which catches the akka timeout and throws its own exception. We should change {{RpcUtils.askTimeout}} and {{RpcUtils.lookupTimeout}} to always return a {{NamedTimeout}}, so we can be sure that anytime we have a timeout, we get a better exception.
> Given the latest refactoring to the rpc layer, this needs to be done in both {{AkkaUtils}} and {{AkkaRpcEndpoint}}.
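
A minimal sketch of what such a {{NamedTimeout}} could look like (the class name comes from the issue description above; {{awaitResult}}, the {{confKey}} field, and the example conf key {{spark.rpc.askTimeout}} are illustrative, not a settled API):

{code}
import java.util.concurrent.TimeoutException

import scala.concurrent.{Await, Awaitable}
import scala.concurrent.duration.FiniteDuration

// Illustrative sketch only: a timeout paired with the SparkConf key that controls it,
// so an expired future can tell the user which setting to increase.
case class NamedTimeout(duration: FiniteDuration, confKey: String) {

  // Await the result like Await.result, but rethrow a TimeoutException whose
  // message names the controlling configuration property.
  def awaitResult[T](awaitable: Awaitable[T]): T = {
    try {
      Await.result(awaitable, duration)
    } catch {
      case te: TimeoutException =>
        val e = new TimeoutException(
          s"Futures timed out after [$duration]. " +
            s"This timeout is controlled by $confKey")
        e.initCause(te)
        throw e
    }
  }
}
{code}

With something along these lines, {{RpcUtils.askTimeout}} and {{RpcUtils.lookupTimeout}} could return e.g. {{NamedTimeout(duration, "spark.rpc.askTimeout")}}, so every timeout surfaced to the user names the conf that controls it.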