Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18982#discussion_r176316173
  
    --- Diff: python/pyspark/ml/wrapper.py ---
    @@ -118,11 +118,18 @@ def _transfer_params_to_java(self):
             """
             Transforms the embedded params to the companion Java object.
             """
    -        paramMap = self.extractParamMap()
    +        pair_defaults = []
             for param in self.params:
    -            if param in paramMap:
    -                pair = self._make_java_param_pair(param, paramMap[param])
    +            if self.isSet(param):
    +                pair = self._make_java_param_pair(param, self._paramMap[param])
                     self._java_obj.set(pair)
    +            if self.hasDefault(param):
    +                pair = self._make_java_param_pair(param, self._defaultParamMap[param])
    +                pair_defaults.append(pair)
    +        if len(pair_defaults) > 0:
    +            sc = SparkContext._active_spark_context
    +            pair_defaults_seq = sc._jvm.PythonUtils.toSeq(pair_defaults)
    +            self._java_obj.setDefault(pair_defaults_seq)
    --- End diff ---
    
    If the default params are the same on the Java side and the Python side, do we 
still need to set the default params on the Java object? Aren't they already set 
if they are default params?
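
    For context, a minimal sketch of the `isSet`/`hasDefault` distinction the new 
code relies on. It assumes an active SparkSession, since `JavaParams` wrappers 
instantiate their Java companion object at construction; `Binarizer` is just an 
illustrative choice of stage:

    ```python
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import Binarizer

    spark = SparkSession.builder.master("local[1]").getOrCreate()

    b = Binarizer(inputCol="feature", outputCol="binarized")

    # threshold carries a default (0.0) but was never explicitly set
    print(b.hasDefault(b.threshold))  # True
    print(b.isSet(b.threshold))       # False

    # inputCol was explicitly set via the constructor keyword
    print(b.isSet(b.inputCol))        # True

    # extractParamMap() merges defaults with explicitly set values, so the
    # old code, which iterated over it and called set(pair) for every entry,
    # made the Java object treat defaults as if they were explicitly set.
    print(b.extractParamMap())
    ```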

