[ https://issues.apache.org/jira/browse/SPARK-5243?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
yuhao yang updated SPARK-5243:
------------------------------
    Description: 
Spark will hang when spark-submit is called under the following conditions:
1. the cluster has only one worker.
2. driver memory + executor memory > worker memory.
3. deploy-mode = cluster.

This usually happens to beginners during development. There should be some exit mechanism, or at least a warning message, in the output of spark-submit. I would like to hear your opinions on whether a fix is needed and what the better fix options are.

  was:
Spark will hang when spark-submit is called under the following conditions:
1. the cluster has only one worker.
2. driver memory + executor memory > worker memory.
3. deploy-mode = cluster.

This usually happens to beginners during development. There should be some exit mechanism, or at least a warning message, in the output of spark-submit. I am preparing a PR for this case, and I would like to hear your opinions on whether a fix is needed and what the better fix options are.


> Spark will hang if (driver memory + executor memory) exceeds limit on a
> 1-worker cluster
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-5243
>                 URL: https://issues.apache.org/jira/browse/SPARK-5243
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 1.2.0
>        Environment: CentOS; others should be similar
>            Reporter: yuhao yang
>            Priority: Minor
>
> Spark will hang when spark-submit is called under the following conditions:
> 1. the cluster has only one worker.
> 2. driver memory + executor memory > worker memory.
> 3. deploy-mode = cluster.
>
> This usually happens to beginners during development. There should be some
> exit mechanism, or at least a warning message, in the output of spark-submit.
> I would like to hear your opinions on whether a fix is needed and what the
> better fix options are.
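For illustration, a minimal sketch of a spark-submit call that reproduces the hang, assuming a single standalone worker registered with 4g of memory; the master URL, memory sizes, class name, and jar are placeholders:

    # Hypothetical repro: one worker with 4g of total memory.
    # The 2g driver fits, but the remaining 2g can never satisfy the
    # 3g executor request, so the application waits forever.
    spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --driver-memory 2g \
      --executor-memory 3g \
      --class org.example.MyApp \
      myapp.jar

With such a request the application sits in the WAITING state on the standalone Master, and nothing in the spark-submit output indicates that the resources can never be granted.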