This is an automated email from the ASF dual-hosted git repository.

weichenxu123 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


    from a180e02  [SPARK-32852][SQL][DOC][FOLLOWUP] Revise the documentation of spark.sql.hive.metastore.jars
     add 689c294  [SPARK-32907][ML][PYTHON] Adaptively blockify instances - AFT,LiR,LoR

No new revisions were added by this update.

Summary of changes:
 .../apache/spark/ml/classification/LinearSVC.scala |   6 +-
 .../ml/classification/LogisticRegression.scala     | 114 +++++------------
 .../spark/ml/optim/aggregator/AFTAggregator.scala  |  42 +++----
 .../ml/optim/aggregator/HingeAggregator.scala      |   6 +-
 .../ml/optim/aggregator/HuberAggregator.scala      |  11 +-
 .../optim/aggregator/LeastSquaresAggregator.scala  |  11 +-
 .../ml/optim/aggregator/LogisticAggregator.scala   |  19 +--
 .../ml/param/shared/SharedParamsCodeGen.scala      |   4 +-
 .../spark/ml/param/shared/sharedParams.scala       |   4 +-
 .../ml/regression/AFTSurvivalRegression.scala      | 130 +++++++------------
 .../spark/ml/regression/LinearRegression.scala     | 138 ++++++---------------
 .../mllib/classification/LogisticRegression.scala  |   4 +-
 .../classification/LogisticRegressionSuite.scala   |   8 +-
 .../ml/regression/AFTSurvivalRegressionSuite.scala |   4 +-
 .../ml/regression/LinearRegressionSuite.scala      |   4 +-
 python/pyspark/ml/classification.py                |  22 ++--
 python/pyspark/ml/classification.pyi               |   8 +-
 python/pyspark/ml/param/_shared_params_code_gen.py |   4 +-
 python/pyspark/ml/param/shared.py                  |   4 +-
 python/pyspark/ml/regression.py                    |  46 +++----
 python/pyspark/ml/regression.pyi                   |  18 +--
 21 files changed, 237 insertions(+), 370 deletions(-)
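
A minimal sketch, not part of the commit message itself: based on the commit subject and the touched shared-params and pyspark files, the adaptive blockification for AFT/LiR/LoR appears to be exposed through a maxBlockSizeInMB parameter on the affected estimators (LinearRegression shown below); the parameter name and the meaning of 0.0 as "auto-select the block size" are assumptions here, not taken from this email.

```python
# Hedged sketch: assumes a maxBlockSizeInMB param exists on pyspark.ml
# LinearRegression after this change; 0.0 is assumed to mean "choose
# the block size adaptively".
from pyspark.sql import SparkSession
from pyspark.ml.linalg import Vectors
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("blockify-sketch").getOrCreate()

df = spark.createDataFrame(
    [(1.0, Vectors.dense(0.0, 1.1)), (0.0, Vectors.dense(2.0, 1.0))],
    ["label", "features"],
)

lr = LinearRegression(maxBlockSizeInMB=0.0)  # assumed: 0.0 = adaptive block size
model = lr.fit(df)
print(model.coefficients)
```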


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
