Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18982#discussion_r176828963
  
    --- Diff: python/pyspark/ml/wrapper.py ---
    @@ -118,11 +118,18 @@ def _transfer_params_to_java(self):
             """
             Transforms the embedded params to the companion Java object.
             """
    -        paramMap = self.extractParamMap()
    +        pair_defaults = []
             for param in self.params:
    -            if param in paramMap:
    -                pair = self._make_java_param_pair(param, paramMap[param])
    +            if self.isSet(param):
    +                pair = self._make_java_param_pair(param, self._paramMap[param])
                     self._java_obj.set(pair)
    +            if self.hasDefault(param):
    +                pair = self._make_java_param_pair(param, self._defaultParamMap[param])
    +                pair_defaults.append(pair)
    +        if len(pair_defaults) > 0:
    +            sc = SparkContext._active_spark_context
    +            pair_defaults_seq = sc._jvm.PythonUtils.toSeq(pair_defaults)
    +            self._java_obj.setDefault(pair_defaults_seq)
    --- End diff ---
    
    I think this is reasonable; a few extra lines to avoid potential unwanted user surprise are worth it.
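
    For context, here is a minimal sketch of the set-vs-default distinction this change preserves (a hypothetical example, not from this PR; LogisticRegression is just an illustrative estimator and a local SparkSession is assumed). Without batching defaults through setDefault, pushing them across with set would make them look explicitly set on the Java object:

        from pyspark.sql import SparkSession
        from pyspark.ml.classification import LogisticRegression

        # Assumes a local Spark install; a session is only needed so the
        # Java companion object behind the estimator can be created.
        spark = SparkSession.builder.master("local[1]").getOrCreate()

        lr = LogisticRegression(regParam=0.1)

        # regParam was supplied by the user, so it is explicitly set.
        print(lr.isSet(lr.regParam))        # True

        # maxIter only carries a default value; it is not "set".
        print(lr.isSet(lr.maxIter))         # False
        print(lr.hasDefault(lr.maxIter))    # True
        print(lr.getOrDefault(lr.maxIter))  # 100 (the default)

        spark.stop()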

