[ https://issues.apache.org/jira/browse/SPARK-38139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17489521#comment-17489521 ]
zhengruifeng commented on SPARK-38139:
--------------------------------------

I think it is ok to adjust the tol in this case.

> ml.recommendation.ALS doctests failures
> ---------------------------------------
>
>                 Key: SPARK-38139
>                 URL: https://issues.apache.org/jira/browse/SPARK-38139
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, PySpark
>    Affects Versions: 3.3.0
>            Reporter: Maciej Szymkiewicz
>            Priority: Major
>
> In my dev setups, the ml.recommendation.ALS doctest consistently converges to a value
> lower than expected and fails with:
> {code:python}
> File "/path/to/spark/python/pyspark/ml/recommendation.py", line 322, in __main__.ALS
> Failed example:
>     predictions[0]
> Expected:
>     Row(user=0, item=2, newPrediction=0.69291...)
> Got:
>     Row(user=0, item=2, newPrediction=0.6929099559783936)
> {code}
> I can correct for that locally, but it creates some noise, so if anyone else
> experiences this, we could drop a digit from the expected results:
> {code}
> diff --git a/python/pyspark/ml/recommendation.py b/python/pyspark/ml/recommendation.py
> index f0628fb922..b8e2a6097d 100644
> --- a/python/pyspark/ml/recommendation.py
> +++ b/python/pyspark/ml/recommendation.py
> @@ -320,7 +320,7 @@ class ALS(JavaEstimator, _ALSParams, JavaMLWritable, JavaMLReadable):
>          >>> test = spark.createDataFrame([(0, 2), (1, 0), (2, 0)], ["user", "item"])
>          >>> predictions = sorted(model.transform(test).collect(), key=lambda r: r[0])
>          >>> predictions[0]
> -        Row(user=0, item=2, newPrediction=0.69291...)
> +        Row(user=0, item=2, newPrediction=0.6929...)
>          >>> predictions[1]
>          Row(user=1, item=0, newPrediction=3.47356...)
>          >>> predictions[2]
> {code}
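For context on why the original expectation fails even though the two values agree to about five decimal places: doctest's ELLIPSIS matching is purely textual, so the literal characters around {{...}} must appear in the output. {{0.69291...}} demands the prefix {{0.69291}}, but the observed value prints as {{0.6929099559783936}}, which starts with {{0.69290}}. A minimal, self-contained sketch of that behavior (the {{check}} helper is illustrative only, not part of the Spark test suite):

{code:python}
import doctest

def check(expected: str, got: str) -> bool:
    """Mimic how a doctest compares output under the ELLIPSIS flag:
    the literal text surrounding '...' must match exactly."""
    checker = doctest.OutputChecker()
    return checker.check_output(expected + "\n", got + "\n", doctest.ELLIPSIS)

# "0.69291..." requires the literal prefix "0.69291"; the observed
# value begins with "0.69290", so the current doctest fails.
print(check("Row(user=0, item=2, newPrediction=0.69291...)",
            "Row(user=0, item=2, newPrediction=0.6929099559783936)"))  # False

# Dropping one digit relaxes the required prefix to "0.6929", which matches.
print(check("Row(user=0, item=2, newPrediction=0.6929...)",
            "Row(user=0, item=2, newPrediction=0.6929099559783936)"))  # True
{code}

Relaxing the printed precision this way keeps the doctest stable in environments where ALS converges to a slightly different value.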