liupengcheng created SPARK-26501:
------------------------------------

             Summary: Fix overrided exitFn in SparkSubmitSuite
                 Key: SPARK-26501
                 URL: https://issues.apache.org/jira/browse/SPARK-26501
             Project: Spark
          Issue Type: Bug
          Components: Deploy, Spark Core
    Affects Versions: 2.4.0, 2.3.2
            Reporter: liupengcheng
When I run SparkSubmitSuite of Spark 2.3.2 in the IntelliJ IDE, I found that some tests cannot pass when run one by one, although they pass when the whole SparkSubmitSuite is run.

Failed test when run separately:
{code:java}
test("SPARK_CONF_DIR overrides spark-defaults.conf") {
  forConfDir(Map("spark.executor.memory" -> "2.3g")) { path =>
    val unusedJar = TestUtils.createJarWithClasses(Seq.empty)
    val args = Seq(
      "--class", SimpleApplicationTest.getClass.getName.stripSuffix("$"),
      "--name", "testApp",
      "--master", "local",
      unusedJar.toString)
    val appArgs = new SparkSubmitArguments(args, Map("SPARK_CONF_DIR" -> path))
    assert(appArgs.defaultPropertiesFile != null)
    assert(appArgs.defaultPropertiesFile.startsWith(path))
    assert(appArgs.propertiesFile == null)
    appArgs.executorMemory should be ("2.3g")
  }
}
{code}

Failure reason:
{code:java}
Error: Executor Memory cores must be a positive number
Run with --help for usage help or --verbose for debug output
{code}

After carefully checking the code, I found that the exitFn of SparkSubmit is overridden by earlier tests through the testPrematureExit call. Although this particular test was fixed by SPARK-22941, the unrestored override of exitFn might cause other problems in the future.
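To illustrate the kind of cross-test interference described above, here is a minimal, self-contained Scala sketch. It is not actual Spark code: FakeSubmit, withExitFnRestored, and the error messages are hypothetical stand-ins for a shared, mutable exit hook that one test overrides and never restores, plus a save-and-restore helper as one possible mitigation.
{code:java}
// Hypothetical stand-in for an object exposing a mutable exit hook
// (analogous in shape to an exitFn var, but not Spark's real code).
object FakeSubmit {
  var exitFn: Int => Unit = (code: Int) => sys.exit(code)

  def fail(msg: String): Unit = {
    Console.err.println(s"Error: $msg")
    exitFn(1)
  }
}

object ExitFnDemo {
  // Pattern used by premature-exit style tests: swap the hook to observe the exit.
  // The override is never undone, so later tests silently inherit it.
  def testPrematureExitLikePattern(): Unit = {
    var exited = false
    FakeSubmit.exitFn = _ => exited = true // overridden, never restored
    FakeSubmit.fail("missing application class")
    assert(exited)
  }

  // One possible remedy: always restore the previous hook in a finally block.
  def withExitFnRestored[T](replacement: Int => Unit)(body: => T): T = {
    val saved = FakeSubmit.exitFn
    FakeSubmit.exitFn = replacement
    try body finally FakeSubmit.exitFn = saved
  }

  def main(args: Array[String]): Unit = {
    testPrematureExitLikePattern()

    // Because the override was left in place, this "failure" no longer exits
    // the JVM, which is the cross-test interference described above.
    FakeSubmit.fail("Executor Memory cores must be a positive number")
    println("still running: exitFn was left overridden by the previous test")

    // With the helper, the hook is restored after the block completes.
    withExitFnRestored(_ => println("intercepted exit")) {
      FakeSubmit.fail("intercepted failure")
    }
  }
}
{code}
The point of the sketch is only the save-and-restore discipline: whatever test replaces the hook should put the original back (for example in a finally block or a test teardown), so that tests behave the same whether run individually or as part of the whole suite.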