Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19309#discussion_r141712571
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2687,6 +2688,59 @@ private[spark] object Utils extends Logging {
       def stringToSeq(str: String): Seq[String] = {
         str.split(",").map(_.trim()).filter(_.nonEmpty)
       }
    +
    +  /**
    +   * Create instances of extension classes.
    +   *
    +   * The classes in the given list must:
    +   * - Be sub-classes of the given base class.
    +   * - Provide either a no-arg constructor, or a 1-arg constructor that takes a SparkConf.
    +   *
    +   * The constructors are allowed to throw "UnsupportedOperationException" if the extension
    +   * does not want to be registered; this allows the implementations to check the Spark
    +   * configuration (or other state) and decide they do not need to be added. A log message
    +   * is printed in that case. Other exceptions are bubbled up.
    +   */
    +  def loadExtensions[T](extClass: Class[T], classes: Seq[String], conf: SparkConf): Seq[T] = {
    +    classes.flatMap { name =>
    +      try {
    +        val klass = classForName(name)
    +        require(extClass.isAssignableFrom(klass),
    +          s"$name is not a subclass of ${extClass.getName()}.")
    +
    +        val ext = Try(klass.getConstructor(classOf[SparkConf])) match {
    +          case Success(ctor) =>
    +            ctor.newInstance(conf)
    +
    +          case Failure(_) =>
    +            klass.getConstructor().newInstance()
    +        }
    +
    +        Some(ext.asInstanceOf[T])
    +      } catch {
    +        case _: NoSuchMethodException =>
    +          throw new SparkException(
    +            s"$name did not have a zero-argument constructor or a" +
    +              " single-argument constructor that accepts SparkConf. Note: 
if the class is" +
    +              " defined inside of another Scala class, then its 
constructors may accept an" +
    +              " implicit parameter that references the enclosing class; in 
this case, you must" +
    +              " define the class as a top-level class in order to prevent 
this extra" +
    +              " parameter from breaking Spark's ability to find a valid 
constructor.")
    +
    +        case e: InvocationTargetException =>
    +          e.getCause() match {
    +            case _: UnsupportedOperationException =>
    --- End diff --
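
    To make the contract in the doc comment above concrete, here is a minimal sketch of an extension class that opts out via its constructor. The class name, package, and config key are hypothetical illustrations, not part of this patch:

        import org.apache.spark.SparkConf

        // Hypothetical extension using the 1-arg SparkConf constructor path.
        class MyExtension(conf: SparkConf) {
          // Throwing UnsupportedOperationException from the constructor asks
          // loadExtensions to skip this extension rather than fail the caller.
          if (!conf.getBoolean("spark.myExtension.enabled", defaultValue = true)) {
            throw new UnsupportedOperationException("MyExtension is disabled.")
          }
        }

        // Loading it through the new helper:
        // val exts = Utils.loadExtensions(classOf[MyExtension],
        //   Seq("com.example.MyExtension"), conf)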
    
    I am a bit worried that catching UnsupportedOperationException is too general ... what if it's thrown by something else deep inside the extension class? The user won't get to see what happened.
    
    OTOH, I can't think of anything better to catch (it doesn't seem worth a special exception type). Maybe log the entire exception at debug level?
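
    For concreteness, a sketch of what that suggestion could look like in the catch clause; logDebug/logInfo come from the Logging trait that Utils already extends, and the message wording is illustrative:

        case e: InvocationTargetException =>
          e.getCause() match {
            case uoe: UnsupportedOperationException =>
              // Keep the quiet opt-out behavior, but log the full stack trace
              // at debug level so the user can see what was actually thrown
              // deep inside the extension class.
              logDebug(s"Extension $name not being initialized.", uoe)
              logInfo(s"Extension $name not being initialized.")
              None
            case null => throw e
            case cause => throw cause
          }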

