Github user RussellSpitzer commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21990#discussion_r225805019
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
    @@ -1136,4 +1121,27 @@ object SparkSession extends Logging {
           SparkSession.clearDefaultSession()
         }
       }
    +
    +  /**
    +   * Initialize extensions if the user has defined a configurator class in their SparkConf.
    +   * This class will be applied to the extensions passed into this function.
    +   */
    +  private[sql] def applyExtensionsFromConf(conf: SparkConf, extensions: SparkSessionExtensions) {
    --- End diff --
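
    For reference, a minimal sketch of what a helper like this could look
    like, assuming the configurator class name is read from the
    `spark.sql.extensions` setting and implements
    `SparkSessionExtensions => Unit` (these details are assumptions for
    illustration, not taken from this diff):

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSessionExtensions

        // Sketch only: would live inside object SparkSession. If the conf names a
        // configurator class, instantiate it and let it register its rules on the
        // extensions instance that was passed in (mutating it in place).
        private[sql] def applyExtensionsFromConf(
            conf: SparkConf,
            extensions: SparkSessionExtensions): Unit = {
          conf.getOption("spark.sql.extensions").foreach { className =>
            val configurator = Class.forName(className)
              .getConstructor()
              .newInstance()
              .asInstanceOf[SparkSessionExtensions => Unit]
            configurator(extensions)  // side effect: extensions is modified in place
          }
        }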
    
    I am always a little nervous about having functions return objects that
    they take in as parameters and then modify. It gives me the impression
    that they are stateless when they actually mutate their argument. If you
    think that this is clearer, I can make the change.
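
    To illustrate the concern (a sketch, not code from this PR): returning
    the argument makes the call site read like a value-producing builder,
    even though the same instance is mutated either way.

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSessionExtensions

        object MutatingStyle {
          // Returns Unit, so the side effect on `extensions` is the obvious outcome.
          def applyExtensionsFromConf(conf: SparkConf, extensions: SparkSessionExtensions): Unit = {
            // ... register rules from conf onto `extensions` ...
          }
        }

        object ReturningStyle {
          // Hands the argument back, which can read as if a fresh, independent
          // result were produced, even though the caller's instance was modified
          // in place.
          def applyExtensionsFromConf(conf: SparkConf, extensions: SparkSessionExtensions): SparkSessionExtensions = {
            // ... register rules from conf onto `extensions` ...
            extensions
          }
        }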


---
