[ 
https://issues.apache.org/jira/browse/SPARK-31571?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-31571:
------------------------------------

    Assignee: Apache Spark

> don't use stop(paste to build R errors
> --------------------------------------
>
>                 Key: SPARK-31571
>                 URL: https://issues.apache.org/jira/browse/SPARK-31571
>             Project: Spark
>          Issue Type: Documentation
>          Components: R
>    Affects Versions: 2.4.5
>            Reporter: Michael Chirico
>            Assignee: Apache Spark
>            Priority: Minor
>
> I notice, for example, this:
> stop(paste0("Arrow optimization does not support 'dapplyCollect' yet. Please disable ",
>             "Arrow optimization or use 'collect' and 'dapply' APIs instead."))
> paste0 is totally unnecessary here -- stop itself takes ... (varargs) and
> concatenates them with no separator, i.e., the above is equivalent to:
> stop("Arrow optimization does not support 'dapplyCollect' yet. Please disable 
> ",
>       "Arrow optimization or use 'collect' and 'dapply' APIs instead.")
> More generally, this hurts portability: it makes user-contributed
> translations harder, because the standard tooling for extracting messages
> (namely tools::update_pkg_po(".")) would fail to capture these messages as
> candidates for translation.
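> A minimal sketch of the extraction step this breaks (run from the package
> source directory; illustrative, not a Spark-specific recipe):
>
>     # scans the R sources and refreshes the translation templates with the
>     # literal message strings found in calls such as stop(), warning(),
>     # message(), and gettextf(); strings assembled inside paste0() are not
>     # picked up as a single translatable message
>     tools::update_pkg_po(".")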
> In fact, it's completely preferable IMO to keep the entire stop() message as
> a single string -- I've found that breaking the string across multiple lines
> makes translation across different languages with different grammars quite
> difficult. I understand there are lint style constraints, however, so I
> wouldn't press on that for now.
> If formatting is needed, I recommend using stop(gettextf(...)) instead.
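> As a sketch of that pattern (illustrative only; the variable name is made
> up, not from the Spark codebase):
>
>     func <- "dapplyCollect"
>     stop(gettextf("Arrow optimization does not support '%s' yet. Please disable Arrow optimization or use 'collect' and 'dapply' APIs instead.", func))
>
> gettextf() keeps the whole message as one format string, so it remains a
> single translation unit while still allowing values to be interpolated.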


