Hi, all:
According to https://github.com/apache/spark/pull/2732, when a Spark job fails
or exits nonzero in yarn-cluster mode, spark-submit should return the job's
corresponding exit code. But when I tried this on spark-1.1.1 in yarn-cluster
mode, spark-submit returned zero regardless.
Here is my spark
In client mode, spark-submit gets the correct return code from the
Spark job. But in yarn-cluster mode, it does not.
From: lin_q...@outlook.com
To: u...@spark.incubator.apache.org
Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in
yarn-cluster mode
Date: Fri, 5
I tried another test code:

    def main(args: Array[String]) {
      if (args.length != 1) {
        Util.printLog(ERROR, "Args error - arg1: BASE_DIR")
        exit(101)
      }
      val currentFile = args(0).toString
      val DB = "test_spark"
      val tableName = "src"
      val sparkConf = new
What's the status of this application in the yarn web UI?
Best Regards,
Shixiong Zhu
2014-12-05 17:22 GMT+08:00 LinQili lin_q...@outlook.com:
I tried another test code:
def main(args: Array[String]) {
if (args.length != 1) {
Util.printLog(ERROR, "Args error - arg1: BASE_DIR")
There are two exit calls in this code. If the args are wrong, spark-submit
gets the return code 101; but if the args are correct, spark-submit
cannot get the second return code 100. What's the difference between these
two exits? I was so confused.
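From the caller's side, the two-exit pattern itself is unambiguous: only whichever exit() actually runs determines the status, and a local caller sees it either way. A hypothetical shell stand-in (no Spark) for the arg-check-then-work shape of the code above:

```shell
# Stand-in driver: exit 101 on bad args, exit 100 after the "work".
driver='if [ "$#" -ne 1 ]; then exit 101; fi; exit 100'
sh -c "$driver" demo
echo "no args  -> $?"    # 101: the arg check fired first
sh -c "$driver" demo /base/dir
echo "with arg -> $?"    # 100: the final exit ran
```

In yarn-cluster mode both exits happen inside the ApplicationMaster's JVM on a remote node, so whether either code reaches spark-submit depends on how the AM reports the final application status back to YARN, not on the exit call itself.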
I'm also confused. When I tried your code,