Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
Thanks all.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r115114733
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,143 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
`javaunidoc` build results:
1. Doc for `ALS.train` looks fine.
2. No doc for `ALS.OutBlock` is generated; possibly because it's a type def?
3. No doc for `ALS.InBlock
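The missing docs for `OutBlock` are plausible if the Java docs are produced via genjavadoc, which has no Java construct to map a Scala type alias onto. A minimal self-contained sketch (the alias below is illustrative, not the exact ALS definition):

```scala
// Sketch of why a doc comment on a type alias may not survive into
// Javadoc-style output: the alias is erased at compile time, so tools
// emitting Java API docs have no member to attach the comment to.
object AliasDocExample {
  /** Out-link blocks as arrays of index arrays (illustrative alias only). */
  type OutBlock = Array[Array[Int]]

  def main(args: Array[String]): Unit = {
    // At runtime the alias is just its underlying type.
    val block: OutBlock = Array(Array(1, 2), Array(3))
    println(block.map(_.length).sum)
  }
}
```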
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r115112172
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,143 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
@srowen Great idea. Will do and report back.
---
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
All comments have been addressed.
---
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
Great. Let me finish adding that one change @sethah requested, and I'll
update the PR sometime today.
---
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114065812
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,127 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114044512
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,127 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114039105
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -1026,7 +1161,24 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114038722
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,127 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114038705
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,127 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114038426
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -910,26 +944,127 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r114038185
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -791,32 +813,43 @@ object ALS extends DefaultParamsReadable[ALS
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17793
How do I fix the "fails to generate documentation" error?
---
Github user danielyli commented on a diff in the pull request:
https://github.com/apache/spark/pull/17793#discussion_r113894646
--- Diff: mllib/src/main/scala/org/apache/spark/ml/recommendation/ALS.scala
---
@@ -791,32 +813,43 @@ object ALS extends DefaultParamsReadable[ALS
GitHub user danielyli opened a pull request:
https://github.com/apache/spark/pull/17793
[SPARK-20484][MLLIB] Add documentation to ALS code
## What changes were proposed in this pull request?
This PR adds documentation to the ALS code.
## How was this patch tested?
Github user danielyli closed the pull request at:
https://github.com/apache/spark/pull/17767
---
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17767
Closing this PR as per
[SPARK-20468](https://issues.apache.org/jira/browse/SPARK-20468?focusedCommentId=15984365&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15984365)
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/17767
Hi @hhbyyh,
Thanks for the pointer. I've created a Jira ticket and renamed this issue
to reflect it.
The individual commits in this PR are written to be self-contained
GitHub user danielyli opened a pull request:
https://github.com/apache/spark/pull/17767
Als refactor
## What changes were proposed in this pull request?
This is a non-feature-changing refactoring of the ALS code (specifically,
the `org.apache.spark.ml.recommendation
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
I'm simply making an argument for a specific use case, though you're right,
it's used for more than just pattern matching.
---
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
Hey,
Checking in again on this PR. Can we please support `withFilter` for pair
RDDs? For-expressions are a central sugar in Scala syntax, and without them
developers are hampered
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
@rxin, is it possible for Spark to support extractors in for expressions
with pair RDDs?
---
Github user danielyli commented on the issue:
https://github.com/apache/spark/pull/15899
Hello,
I found this issue after encountering the error `'withFilter' method does
not yet exist on RDD[(Int, Double)], using 'filter' method instead` in my code.
I'm writing
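The warning quoted above comes from Scala's for-expression desugaring: a pattern binding with a guard compiles to `withFilter` followed by `map`, and a container lacking `withFilter` (as `RDD` did) makes the compiler fall back to `filter`. A plain-collections sketch, with `List` standing in for `RDD[(Int, Double)]`:

```scala
// Sketch of the desugaring behind the quoted warning (plain Scala
// collections; no Spark needed).
object PairForExpr {
  def main(args: Array[String]): Unit = {
    val xs = List((1, 0.5), (2, 2.0), (3, 3.0))
    // Sugared form: a pair pattern plus a guard...
    val sugared = for ((k, v) <- xs if v > 1.0) yield k * v
    // ...compiles roughly to withFilter followed by map, which is why
    // the compiler looks for a `withFilter` method on the receiver.
    val desugared = xs
      .withFilter { case (_, v) => v > 1.0 }
      .map { case (k, v) => k * v }
    assert(sugared == desugared)
    println(sugared) // List(4.0, 9.0)
  }
}
```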