Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15377#discussion_r84241076
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2432,6 +2432,18 @@ private[spark] object Utils extends Logging {
       }
     }
     
    +private[util] object CallerContext {
    +  val callerContextSupported: Boolean = {
    +    SparkHadoopUtil
    --- End diff --
    
    Let's find a different way to wrap this; it's non-standard and makes the 
condition hard to read. Maybe break out `conf` as a local `val` to make it simpler.
    
    I see why you're using Try, though I wonder if we are suppressing 
potentially important errors this way. For example, a ClassNotFoundException 
obviously means 'not supported', and that's normal. A fatal error should 
probably just propagate. But a non-fatal error could be logged as a warning, 
since it would be unexpected; it would still mean the caller context isn't 
supported rather than causing a failure.
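
    A rough sketch of the error-handling shape suggested above, as a 
standalone object (the object name and the reflective class-name check are 
illustrative, not the actual patch): ClassNotFoundException means "not 
supported", fatal errors propagate, and other non-fatal errors are logged 
but still treated as "not supported".

    ```scala
    import scala.util.control.NonFatal

    // Hypothetical sketch; names are illustrative, not the real patch.
    object CallerContextSketch {
      val callerContextSupported: Boolean = {
        try {
          // Placeholder for the reflective check against Hadoop's
          // caller-context class; the class name here is illustrative.
          Class.forName("org.apache.hadoop.ipc.CallerContext")
          true
        } catch {
          case _: ClassNotFoundException =>
            // Expected on Hadoop versions without the class: not supported.
            false
          case NonFatal(e) =>
            // Unexpected, so surface it, but still just means "unsupported".
            println(s"Unexpected error checking caller context support: $e")
            false
          // Fatal errors (e.g. OutOfMemoryError) fall through and propagate.
        }
      }
    }
    ```

    Matching ClassNotFoundException before the NonFatal extractor keeps the 
expected case silent while anything else non-fatal gets logged.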

