Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16214#discussion_r91765571
  
    --- Diff: docs/sparkr.md ---
    @@ -472,21 +472,17 @@ should fit in a single machine. If that is not the case they can do something like
     `dapply`
     
     <div data-lang="r"  markdown="1">
    -{% highlight r %}
    -# Perform distributed training of multiple models with spark.lapply. Here, we pass
    -# a read-only list of arguments which specifies family the generalized linear model should be.
    -families <- c("gaussian", "poisson")
    -train <- function(family) {
    -  model <- glm(Sepal.Length ~ Sepal.Width + Species, iris, family = family)
    -  summary(model)
    -}
    -# Return a list of model's summaries
    -model.summaries <- spark.lapply(families, train)
    +{% include_example lapply r/ml/ml.R %}
    +</div>
     
    -# Print the summary of each model
    -print(model.summaries)
    +##### spark.lapply with third-party packages
     
    -{% endhighlight %}
    +Many of the SparkR jobs distributed by `spark.lapply` need supports from third-party packages. Rather than installing all necessary packages to all executors in advance,
    +we could install them during the SparkR interactive session or script. Users can add package files or directories by `spark.addFile` firstly,
    +download them to every executor node, and install them.
    --- End diff ---
    
    this kind of sounds like the user will need to separately "download them to every executor node" - perhaps instead say "by `spark.addFile` first, which automatically downloads them to every executor node, and then install them"?

