Github user steveloughran commented on a diff in the pull request: https://github.com/apache/spark/pull/18111#discussion_r133751724

--- Diff: core/src/main/scala/org/apache/spark/internal/io/HadoopMapReduceCommitProtocol.scala ---

```
@@ -73,7 +73,10 @@ class HadoopMapReduceCommitProtocol(jobId: String, path: String)
     val stagingDir: String = committer match {
       // For FileOutputCommitter it has its own staging path called "work path".
-      case f: FileOutputCommitter => Option(f.getWorkPath.toString).getOrElse(path)
+      case f: FileOutputCommitter =>
+        val workPath = f.getWorkPath
+        require(workPath != null, s"Committer has no workpath $f")
+        Option(workPath.toString).getOrElse(path)
```

--- End diff --

`workPath.toString()` triggers an NPE when `workPath` is null, which was the reason for the stack trace. Now, what is this code trying to do? Find a directory to put stuff in. If the committer is a subclass of `FileOutputCommitter`, then its `getWorkPath` method can be called, which *should* return a non-null path. If the requirement is "return the work path, or fall back to `path` if it is null", maybe the code should really be

```scala
Option(f.getWorkPath).getOrElse(path).toString
```

That would return the work path if non-null, fall back to the `path` variable, and then call `toString` on the returned object. I'm not sure off the top of my head whether Scala's type inference likes that, though. It may be less elegant but more reliable to have

```scala
val workPath = f.getWorkPath
if (workPath != null) workPath.toString else path
```

plus maybe a bonus paranoid check for `workPath` being `""`. I think that would actually get closer to the original goal of the code: always return a path, even if the committer doesn't have one.
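To make the null-fallback behaviour concrete, here is a minimal, dependency-free sketch of the two variants discussed above. `FakePath` is a hypothetical stand-in for `org.apache.hadoop.fs.Path` (only its `toString` matters here), and `stagingDir` is an illustrative helper, not Spark's actual method:

```scala
object StagingDirSketch {
  // Hypothetical stand-in for org.apache.hadoop.fs.Path.
  final case class FakePath(p: String) { override def toString: String = p }

  // Variant 1: Option-based. Option(null) is None, so getOrElse supplies the
  // fallback; toString is only called on a non-null value, avoiding the NPE.
  def stagingDirViaOption(workPath: FakePath, fallback: String): String =
    Option(workPath).map(_.toString).getOrElse(fallback)

  // Variant 2: explicit null check, as in the second snippet above.
  def stagingDirViaIf(workPath: FakePath, fallback: String): String =
    if (workPath != null) workPath.toString else fallback

  def main(args: Array[String]): Unit = {
    // A null work path falls back to `fallback` instead of throwing an NPE.
    assert(stagingDirViaOption(null, "/out") == "/out")
    assert(stagingDirViaIf(null, "/out") == "/out")
    // A non-null work path wins in both variants.
    assert(stagingDirViaOption(FakePath("/work"), "/out") == "/work")
    assert(stagingDirViaIf(FakePath("/work"), "/out") == "/work")
    println("ok")
  }
}
```

Note that `Option(f.getWorkPath).getOrElse(path).toString` typechecks but infers a widened element type (the common supertype of `Path` and `String`), whereas mapping with `_.toString` before the `getOrElse` keeps everything as `String`.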