Re: [R-pkg-devel] Cascade effect of non-available packages?

2024-11-03 Thread Dirk Eddelbuettel


Tiago,

Looking at https://www.stats.ox.ac.uk/pub/bdr/noSuggests/pliman.out
we see it errors after trying '* checking examples ...':

   * checking examples ... ERROR
   Running examples in ‘pliman-Ex.R’ failed
   The error most likely occurred in:
   
   > ### Name: as_image
   > ### Title: Create an 'Image' object
   > ### Aliases: as_image
   > 
   > ### ** Examples
   > 
   > img <-
   + as_image(rnorm(150 * 150 * 3),
   +  dim = c(150, 150, 3),
   +  colormode = 'Color')
   Error in loadNamespace(x) : there is no package called ‘EBImage’
   Calls: as_image ... loadNamespace -> withRestarts -> withOneRestart -> 
doWithOneRestart
   Execution halted
   * checking PDF version of manual ... [12s/12s] OK

Looking at as_image.Rd in 
https://github.com/NEPEM-UFSC/pliman/blob/24a1781073f9b1a3141002f0985b0542b1f7178d/man/as_image.Rd
we see that you do have an \donttest{} there but that alone does not protect
you. Instead of

   \examples{
   \donttest{
   library(pliman)
   img <-
     as_image(rnorm(150 * 150 * 3),
              dim = c(150, 150, 3),
              colormode = 'Color')
   plot(img)
   }

I would do something like

  if (interactive() && requireNamespace("EBImage", quietly=TRUE)) {
      library(pliman)
      img <- as_image(rnorm(150 * 150 * 3), dim = c(150, 150, 3), colormode = 'Color')
      plot(img)
  }

where you could of course put that condition into a helper function.
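
For instance, a small (untested) helper along these lines could live in the
package and be reused across example sections; the name 'run_pliman_examples'
is just made up here:

  run_pliman_examples <- function() {
      interactive() && requireNamespace("EBImage", quietly = TRUE)
  }

  ## and then in each \examples{} section
  if (run_pliman_examples()) {
      library(pliman)
      img <- as_image(rnorm(150 * 150 * 3), dim = c(150, 150, 3), colormode = 'Color')
      plot(img)
  }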

Hope this helps,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Cascade effect of non-available packages?

2024-11-03 Thread Dirk Eddelbuettel


On 3 November 2024 at 11:02, Tiago Olivoto wrote:
| Today, I noticed that several stable packages, such as Rcpp and sf, were
| unavailable during the check process for this submission. This wasn’t an
| issue in the previously published version of pliman. Could this be a
| temporary problem with the package availability on CRAN?

It could be. (Auto-)build systems for 21k packages and 3 architectures can be
fragile, and we are in a time of transition (R 4.4.2 just came about) so this
could have been spurious.  If in doubt, check the status of a given package
at its CRAN page. But I looked at the log links you provided, and I saw no
build failure over 'missing Rcpp and/or sf' there.  Can you point us to an
error?

Moreover, reading
  
https://win-builder.r-project.org/incoming_pretest/pliman_3.0.0_20241102_181842/specialChecks/noSuggests/summary.txt
I do not see your aforementioned 'check_ebi()' function at work. The
suggested package is not present; you are asked to not fail but simply skip,
yet the package does not seem to do so. I think you need to fix that, and
that the CRAN messaging is fairly clear and standard here. I may of course
have missed something, in which case 'my bad'.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R-extension requirement about third-party random number generators (RNG)

2024-09-27 Thread Dirk Eddelbuettel


On 27 September 2024 at 16:58, John Clarke wrote:
| [...] -- the RNG state management strategy appears almost 'magical' to me
| especially inside RCPP. It is possible, I just don't understand how to use it.

See Section 6.3 of WRE: the pair of GetRNGstate() and PutRNGstate() is all
there is in terms of an interface, and all we call (old-school macros ensure
it ends up in the generated glue code Rcpp produces for you). Pretty much
everything else around RNGs is (quite) opaque (as Section 6.3 states) and
Rcpp does not go there.
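
To see the pair at work from the Rcpp side, a quick sketch (assuming Rcpp is
installed; the function name is made up): the wrapper cppFunction() generates
instantiates an RNGScope, which calls GetRNGstate() and PutRNGstate() around
the call, so the C++ draws should line up with runif() for the same seed.

   Rcpp::cppFunction("Rcpp::NumericVector draws(int n) { return Rcpp::runif(n); }")
   set.seed(42); draws(3)
   set.seed(42); runif(3)    # should give the same three numbers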

Cheers,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R-extension requirement about third-party random number generators (RNG)

2024-09-27 Thread Dirk Eddelbuettel


Hi John,

I think you are reading the text too literally. The intent of WRE is to
ensure that standard use of an RNG in an extension package uses the RNGs that
come with R (which include an updated Mersenne Twister algorithm) so that
users are not "surprised".  It explicitly mentions the problem of multiple
seeding.  An example of how this can be done is in RcppArmadillo: a long time
ago we worked out a scheme where in the R use case the RNG is 'dropped in' so
that when Armadillo code uses randu(5) you get what runif(5) in R would give
you (given the same seed from R).

On the other hand, when you know what you are doing and properly (locally)
instantiate another PRNG for local use, you can do so. For quick checks I
often use a query at GitHub in the 'cran' organisation mirroring the CRAN
repos. For example, the following shows where std::mt19937 is used in C++ to
deploy the Mersenne Twister. So as you can see, when done carefully it is in
fact allowed.

   https://github.com/search?q=org%3Acran%20mt19937&type=code
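
To sketch what such careful, local use can look like via Rcpp (an untested
sketch, names made up; the engine is seeded explicitly by the caller and the
R RNG state is never touched):

   Rcpp::cppFunction('
       std::vector<double> localDraws(int n, int seed) {
           std::mt19937 eng(seed);                            // local engine, caller-supplied seed
           std::uniform_real_distribution<double> unif(0.0, 1.0);
           std::vector<double> res(n);
           for (auto & x : res) x = unif(eng);                // does not touch the R RNG state
           return res;
       }', includes = c("#include <random>", "#include <vector>"))
   localDraws(3, 42)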

Hope this helps, and allow me to mention that there is also the rcpp-devel
list for more Rcpp-specific questions.

Cheers, Dirk


-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-21 Thread Dirk Eddelbuettel


On 21 August 2024 at 15:47, Kurt Hornik wrote:
| > Kurt Hornik writes:
| 
| Committed now.

That is just *lovely*:

   > aut <- tools::CRAN_authors_db()
   > dim(aut)
   [1] 47433     7
   > head(aut)
         given   family                     email            orcid     role comment       package
   1    Martin    Bladt    martinbl...@math.ku.dk                  aut, cre         AalenJohansen
   2 Christian   Furrer         fur...@math.ku.dk                       aut         AalenJohansen
   3    Sercan  Kahveci sercan.kahv...@plus.ac.at                  aut, cre              AATtools
   4    Andrew    Pilny        andy.pi...@uky.edu  -0001-6603-5490 aut, cre          abasequence
   5   Sigbert   Klinke      sigb...@hu-berlin.de                  aut, cre           abbreviate
   6  Csillery  Katalin   kati.csill...@gmail.com                       aut                  abc
   > 
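
That also makes the lookup which started this thread a one-liner, e.g.
(assuming the column names shown above):

   > subset(aut, family == "Eddelbuettel", select = c(given, family, orcid, package))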

Can we possibly get this into r-patched and the next r-release?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-21 Thread Dirk Eddelbuettel


On 21 August 2024 at 07:43, Dirk Eddelbuettel wrote:
| 
| On 20 August 2024 at 15:47, Kurt Hornik wrote:
| | >>>>> Kurt Hornik writes:
| | 
| | The variant attached drops the URL and does unique.
| 
| Nice. Alas, some of us default to r-release as the daily driver and then
| 
|   Error in unname(tools:::.ORCID_iD_canonicalize(o)) : 
| object '.ORCID_iD_canonicalize' not found
|   > 
| 
| Will play with my 'RD' which I keep approximately 'weekly-current'. Quick
| rebuild first.

As simple as adding

  .ORCID_iD_canonicalize <- function(x) sub(tools:::.ORCID_iD_variants_regexp, "\\3", x)

and making the call (or maybe making it a lambda anyway ...)

  oid = unname(.ORCID_iD_canonicalize(o)))

After adding

  a <- sort_by(a, ~ a$family + a$given)

the first 48 out of a (currently) total of 6465 are empty for family.

  > sum(a$family == "")
  [1] 48
  > 

Rest is great!

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-21 Thread Dirk Eddelbuettel


On 20 August 2024 at 15:47, Kurt Hornik wrote:
| > Kurt Hornik writes:
| 
| The variant attached drops the URL and does unique.

Nice. Alas, some of us default to r-release as the daily driver and then

  Error in unname(tools:::.ORCID_iD_canonicalize(o)) : 
object '.ORCID_iD_canonicalize' not found
  > 

Will play with my 'RD' which I keep approximately 'weekly-current'. Quick
rebuild first.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-20 Thread Dirk Eddelbuettel


On 20 August 2024 at 07:57, Dirk Eddelbuettel wrote:
| 
| Hi Kurt,
| 
| On 20 August 2024 at 14:29, Kurt Hornik wrote:
| | I think for now you could use something like what I attach below.
| | 
| | Not ideal: I had not too long ago started adding orcidtools.R to tools,
| | which e.g. has .persons_from_metadata(), but that works on the unpacked
| | sources and not the CRAN package db.  Need to think about that ...
| 
| We need something like that too as I fat-fingered the string 'ORCID'. See
| fortune::fortunes("Dirk can type").
| 
| Will try the function below later. Many thanks for sending it along.

Very nice. Resisted my common impulse to make it a data.table for easy
sorting via keys etc.  After running your code the line

   head(with(a, sort_by(a, ~ family + given)), 100)

shows that we need a bit more QA: some person entries are not properly split
between 'family' and 'given', some use the URL form of the ORCID, and there
are repeats.  Excluding those is next.
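
A first pass at that exclusion could be as simple as the following (untested)
lines on the data.frame `a` built by the code quoted below:

   a$oid <- sub("^https?://orcid\\.org/", "", a$oid)    # normalise the URL form
   a <- unique(a)                                        # drop exact repeats
   a <- a[nzchar(a$given) & nzchar(a$family), ]          # drop entries that did not split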

Dirk
 
| Dirk
| 
| | 
| | Best
| | -k
| | 
| | 
| | x <- tools::CRAN_package_db()
| | a <- lapply(x[["Authors@R"]],
| |             function(a) {
| |                 if (!is.na(a)) {
| |                     a <- tryCatch(utils:::.read_authors_at_R_field(a),
| |                                   error = identity)
| |                     if (inherits(a, "person"))
| |                         return(a)
| |                 }
| |                 NULL
| |             })
| | a <- do.call(c, a)
| | a <- lapply(a,
| |             function(e) {
| |                 if (is.null(o <- e$comment["ORCID"]) || is.na(o))
| |                     return(NULL)
| |                 cbind(given = paste(e$given, collapse = " "),
| |                       family = paste(e$family, collapse = " "),
| |                       oid = unname(o))
| |             })
| | a <- as.data.frame(do.call(rbind, a))
| | 
| | 
| | > Salut Thierry,
| | 
| | > On 20 August 2024 at 13:43, Thierry Onkelinx wrote:
| | > | Happy to help. I'm working on a new version of the checklist package. I 
could
| | > | export the function if that makes it easier for you.
| | 
| | > Would be happy to help / iterate. Can you take a stab at making the
| | > per-column split more robust so that we can bulk-process all non-NA 
entries
| | > of the returned db?
| | 
| | > Best, Dirk
| | 
| | > -- 
| | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| 
| -- 
| dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-20 Thread Dirk Eddelbuettel


On 20 August 2024 at 15:13, Chris Evans wrote:
| As I think that should be
| 
| fortunes::fortune("Dirk can type")
| 
| rather than
| 
| fortune::fortunes("Dirk can type")

Yes, thank you. I also failed to run that post through CI and testing before
sending.  Doing too many things at once...

Dirk
 
| I think that has become both recursive and demonstrating excellent 
| test-retest stability!  Oh boy do I know that issue!
| 
| Chris
| 
| On 20/08/2024 14:57, Dirk Eddelbuettel wrote:
| > Hi Kurt,
| >
| > On 20 August 2024 at 14:29, Kurt Hornik wrote:
| > | I think for now you could use something like what I attach below.
| > |
| > | Not ideal: I had not too long ago started adding orcidtools.R to tools,
| > | which e.g. has .persons_from_metadata(), but that works on the unpacked
| > | sources and not the CRAN package db.  Need to think about that ...
| >
| > We need something like that too as I fat-fingered the string 'ORCID'. See
| > fortune::fortunes("Dirk can type").
| >
| > Will try the function below later. Many thanks for sending it along.
| >
| > Dirk
| >
| > |
| > | Best
| > | -k
| > |
| > | 
| > | x <- tools::CRAN_package_db()
| > | a <- lapply(x[["Authors@R"]],
| > |             function(a) {
| > |                 if (!is.na(a)) {
| > |                     a <- tryCatch(utils:::.read_authors_at_R_field(a),
| > |                                   error = identity)
| > |                     if (inherits(a, "person"))
| > |                         return(a)
| > |                 }
| > |                 NULL
| > |             })
| > | a <- do.call(c, a)
| > | a <- lapply(a,
| > |             function(e) {
| > |                 if (is.null(o <- e$comment["ORCID"]) || is.na(o))
| > |                     return(NULL)
| > |                 cbind(given = paste(e$given, collapse = " "),
| > |                       family = paste(e$family, collapse = " "),
| > |                       oid = unname(o))
| > |             })
| > | a <- as.data.frame(do.call(rbind, a))
| > | 
| > |
| > | > Salut Thierry,
| > |
| > | > On 20 August 2024 at 13:43, Thierry Onkelinx wrote:
| > | > | Happy to help. I'm working on a new version of the checklist package. 
I could
| > | > | export the function if that makes it easier for you.
| > |
| > | > Would be happy to help / iterate. Can you take a stab at making the
| > | > per-column split more robust so that we can bulk-process all non-NA 
entries
| > | > of the returned db?
| > |
| > | > Best, Dirk
| > |
| > | > --
| > | > dirk.eddelbuettel.com | @eddelbuettel |e...@debian.org
| >
| -- 
| Chris Evans (he/him)
| Visiting Professor, UDLA, Quito, Ecuador & Honorary Professor, 
| University of Roehampton, London, UK.
| Work web site: https://www.psyctc.org/psyctc/
| CORE site: http://www.coresystemtrust.org.uk/
| Personal site: https://www.psyctc.org/pelerinage2016/
| Emeetings (Thursdays): 
| https://www.psyctc.org/psyctc/booking-meetings-with-me/
| (Beware: French time, generally an hour ahead of UK)
| <https://ombook.psyctc.org/book>
|   [[alternative HTML version deleted]]
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-20 Thread Dirk Eddelbuettel


Hi Kurt,

On 20 August 2024 at 14:29, Kurt Hornik wrote:
| I think for now you could use something like what I attach below.
| 
| Not ideal: I had not too long ago started adding orcidtools.R to tools,
| which e.g. has .persons_from_metadata(), but that works on the unpacked
| sources and not the CRAN package db.  Need to think about that ...

We need something like that too as I fat-fingered the string 'ORCID'. See
fortune::fortunes("Dirk can type").

Will try the function below later. Many thanks for sending it along.

Dirk

| 
| Best
| -k
| 
| 
| x <- tools::CRAN_package_db()
| a <- lapply(x[["Authors@R"]],
|             function(a) {
|                 if (!is.na(a)) {
|                     a <- tryCatch(utils:::.read_authors_at_R_field(a),
|                                   error = identity)
|                     if (inherits(a, "person"))
|                         return(a)
|                 }
|                 NULL
|             })
| a <- do.call(c, a)
| a <- lapply(a,
|             function(e) {
|                 if (is.null(o <- e$comment["ORCID"]) || is.na(o))
|                     return(NULL)
|                 cbind(given = paste(e$given, collapse = " "),
|                       family = paste(e$family, collapse = " "),
|                       oid = unname(o))
|             })
| a <- as.data.frame(do.call(rbind, a))
| 
| 
| > Salut Thierry,
| 
| > On 20 August 2024 at 13:43, Thierry Onkelinx wrote:
| > | Happy to help. I'm working on a new version of the checklist package. I 
could
| > | export the function if that makes it easier for you.
| 
| > Would be happy to help / iterate. Can you take a stab at making the
| > per-column split more robust so that we can bulk-process all non-NA entries
| > of the returned db?
| 
| > Best, Dirk
| 
| > -- 
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-20 Thread Dirk Eddelbuettel


Salut Thierry,

On 20 August 2024 at 13:43, Thierry Onkelinx wrote:
| Happy to help. I'm working on a new version of the checklist package. I could
| export the function if that makes it easier for you.

Would be happy to help / iterate. Can you take a stab at making the
per-column split more robust so that we can bulk-process all non-NA entries
of the returned db?

Best, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-19 Thread Dirk Eddelbuettel


On 19 August 2024 at 15:15, Thierry Onkelinx wrote:
| Maybe checklist:::author2df() might be useful. It is an unexported function
| from my checklist package. It converts a person() object to a dataframe.
| 
https://github.com/inbo/checklist/blob/5649985b58693acb88337873ae14a7d5bc018d96
| /R/store_authors.R#L38
| 
| df <- tools::CRAN_package_db()
| lapply(
|   df$`Authors@R`[df$Package  %in% c("git2rdata", "qrcode")],
|   function(x) {
|     parse(text = x) |>
|       eval() |>
|       vapply(checklist:::author2df, vector(mode = "list", 1)) |>
|       do.call(what = rbind)
|   }
| )
| 
| 
| [[1]]
|     given        family                       email           orcid affiliation usage
| 1 Thierry      Onkelinx    thierry.onkel...@inbo.be -0001-8804-4216                 1
| 2  Floris Vanderhaeghe floris.vanderhae...@inbo.be -0002-6378-6229                 1
| 3   Peter        Desmet        peter.des...@inbo.be -0002-8442-8025                 1
| 4     Els      Lommelen        els.lomme...@inbo.be -0002-3481-5684                 1
| 
| [[2]]
|     given   family                 email           orcid affiliation usage
| 1 Thierry Onkelinx qrc...@muscardinus.be -0001-8804-4216                 1
| 2  Victor      Teh   victor...@gmail.com

That's a very nice start, thank you. (Will also look more closely at
checklist.)  It needs an `na.omit()` or the like, and even with that `rbind`
barked a few entries in (i = 19 if you select the full vector right now).

But definitely something to play with and possibly build upon. Thanks!  (And
the IDs of Floris and you were two of the ones I 'manually' added to a
DESCRIPTION file ;-)

Best,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] ORCID ID finder via tools::CRAN_package_db() ?

2024-08-19 Thread Dirk Eddelbuettel


Has anybody written a quick helper function that extracts the Authors@R field
from tools::CRAN_package_db() and 'stems' it into 'Name, Firstname, ORCID'
which one could use to look up ORCID IDs at CRAN? The lookup at orcid.org
sometimes gives us 'private entries' that make it harder / impossible to
confirm a match. Having a normalised matrix or data.frame (or ...) would also
make it easier to generate Authors@R.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Build process generated non-portable files

2024-08-17 Thread Dirk Eddelbuettel


On 17 August 2024 at 17:22, Ivan Krylov via R-package-devel wrote:
| On Fri, 16 Aug 2024 18:53:55 +,
| anj5x...@nilly.addy.io writes:
| 
| > In the past other packages have failed to build and not only on the
| > intel container see
| > "https://github.com/SpeakEasy-2/speakeasyR/actions/runs/10202337528/job/28226219457"
| > where several containers failed at the setup-deps step. There is
| > overlap in which package fails (i.e. protGenerics and sparseArray
| > fail in multiple containers but succeed in others while in one
| > container ExperimentHub fails). It seems the only packages failing
| > are from Bioconductor. Assume this is a bioconductor or pak issue.
| 
| Could also be an rhub issue, although unlike the igraph problem below,
| I have no idea where to start diagnosing it.
| 
| > > > igraph::sample_pref(10)  
| > > Error in dyn.load(file, DLLpath = DLLpath, ...) :
| > >  unable to load shared object
| > > '/root/R/x86_64-pc-linux-gnu-library/4.5/igraph/libs/igraph.so':
| > > libopenblasp.so.0: cannot open shared object file: No such file or
| > > directory  
| > 
| > I.e. the same error with building targets. I can raise an issue on
| > rigraph as well.
| 
| This is a problem with the binary package used by rhub. If you
| reinstall the source package from CRAN instead of
| https://github.com/r-hub/repos and
| https://github.com/cran/igraph/releases/, it will work, but take much
| more time compiling the package:
| 
| options(repos = getOption('repos')['CRAN'])
| install.packages('igraph')

The r2u binaries can offer help here for running on Ubuntu. They are a
'superset' of the same p3m binaries but aim to (and generally manage to)
provide working binaries. I just validated via a Docker container running it:

edd@rob:~$ docker run --rm -ti rocker/r2u:jammy bash
root@64a8b23a9bc7:/# install.r igraph# install.packages() works too
[ ... log of installation of 14 binaries omitted here ... ]
root@64a8b23a9bc7:/# R

R version 4.4.1 (2024-06-14) -- "Race for Your Life"
Copyright (C) 2024 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(igraph)

Attaching package: ‘igraph’

The following objects are masked from ‘package:stats’:

decompose, spectrum

The following object is masked from ‘package:base’:

union

> 


Best, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] rhub (version 2) baulking at building a vignette.

2024-08-02 Thread Dirk Eddelbuettel


On 2 August 2024 at 16:12, Ivan Krylov via R-package-devel wrote:
| On Fri,  2 Aug 2024 11:10:59 +,
| Rolf Turner  writes:
| 
| > The advice was to the effect that if the vignette was actually just a
| > *.tex file, one could (after putting in some preparatory code, in the
| > form of comments, before the documentclass command) pre-process the
| > file (with pdflatex) and the build would use the *.pdf file that was
| > created, without any processing being required.
| 
| I think that you are looking for the "asis" vignette engine from the

Or the zero-dependency approach by Mark van der Loo of embedding a whole
(pre-made) pdf in a five-line Rnw Sweave file, described in
  https://www.r-bloggers.com/2019/01/add-a-static-pdf-vignette-to-an-r-package/

So there is no real pdf processing requirement (besides the minimal include),
and hence no random failure because a .sty file or font one uses is suddenly
missing on another machine.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R CMD BATCH plot output

2024-07-29 Thread Dirk Eddelbuettel


On 29 July 2024 at 13:02, Ivan Krylov wrote:
| On Sun, 28 Jul 2024 15:27:33 -0500
| Dirk Eddelbuettel  wrote:
| 
| > If we cannot (or do not want to) modify the given main.R, I would
| > suggest something along the lines of
| > 
| >   Rscript -e 'pdf(myfilenamevar); source("main.R"); dev.off()' | tee 2024-07-28.log
| 
| Perhaps even options(device = \(file = myfilenamevar, ...) pdf(file =
| file, ...)) so that every plot would get the same treatment, though
| that requires re-implementing the dev.new() logic to guess an available
| file name. You can even misuse R_PROFILE_USER to inject the code into
| the R CMD BATCH session:
| 
| # myplots.R:
| local({
|   cmdline <- commandArgs(FALSE)
|   srcfile <- cmdline[[which(cmdline == '-f')[[1]] + 1]]
|   plotfile <- sub('(\\.R)?$', '', srcfile)
|   options(device = \(file, ...) {
|   i <- 1
|   if (missing(file)) repeat {
|   file <- sprintf('%s_%03d.pdf', plotfile, i)
|   if (!file.exists(file)) break
|   i <- i+1
|   }
|   pdf(file = file, ...)
|   })
| })
| 
| # example.R:
| plot(1:100 / 10, sin(1:100 / 10))
| dev.off()
| plot(1:100 / 10, cos(1:100 / 10))
| 
| R_PROFILE_USER=myplots.R R CMD BATCH example.R
| # produces example.Rout, example_001.pdf, example_002.pdf

Impressive. A very creative way to inject a modification into the (to my
taste overly restrictive) setup provided by R CMD BATCH.

Personally I would much rather script something cleaner with r or Rscript but
we all have our preferences.

Dirk


-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R CMD BATCH plot output

2024-07-28 Thread Dirk Eddelbuettel


On 28 July 2024 at 15:44, Duncan Murdoch wrote:
| On 2024-07-28 1:48 p.m., Josiah Parry wrote:
| > However, if plots are generated in the process, the plots are stored in
| > Rplots.pdf.
| > 
| > Is there a way via command line arguments to change the name of the pdf
| > output.
| > 
| > There might be multiple runs of this script and it would be ideal to store
| > the plot output independently.
| 
| That's the default filename if you open a PDF device.  Open it

It's the default (and fallback) device and filename when you fail to specify
something else. I.e. 'Rscript -e "plot(1:10)"' will create it too.

My preferred alternative is to wrap the call to plot() with an actual device
creation (something like 'if (!interactive()) pdf(my_filename, 8, 6)') and a
corresponding 'if (!interactive()) dev.off()' to ensure the file is finalised
and closed.
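
Spelled out, that pattern inside main.R is just (a sketch; 'my_filename' is
whatever you want the output called):

   if (!interactive()) pdf(my_filename, 8, 6)
   plot(1:10)
   if (!interactive()) dev.off()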

If we cannot (or do not want to) modify the given main.R, I would suggest
something along the lines of

  Rscript -e 'pdf(myfilenamevar); source("main.R"); dev.off()' | tee 2024-07-28.log

and Bob's your uncle now in terms of how you spec the filename.

In short, I would let go of `R CMD BATCH` if it does not do what you want.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to get arbitrary precise inputs from R for an Rcpp package?

2024-07-18 Thread Dirk Eddelbuettel


Hi Khue,

On 19 July 2024 at 06:29, Khue Tran wrote:
| I am currently trying to get precise inputs by taking strings instead of
| numbers then writing a function to decompose the string into a rational
| with the denominator in the form of 10^(-n) where n is the number of
| decimal places. I am not sure if this is the only way or if there is a
| better method out there that I do not know of, so if you can think of a
| general way to get precise inputs from users, it will be greatly
| appreciated!

That is one possible way. The constraint really is that the .Call() interface
we use for all [1] extensions to R only knows SEXP types which map to a
small set of known types: double, int, string, bool, ...  The type used by
the Boost library you are using is not among them, so you have to add code to
map back and forth. Rcpp makes that easier; it is still far from automatic.

R has packages such as Rmpfr interfacing GNU MPFR, which is based on GMP.
Maybe that is good enough?  Also note that Rcpp has a dedicated (low volume
and friendly) mailing list which may be better suited for questions such as
this one.

Cheers, Dirk

[1] A slight generalisation. There are others but they are less common / not
recommended.

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] No email with confirmation link on resubmission of package

2024-06-07 Thread Dirk Eddelbuettel


On 6 June 2024 at 04:47, Paul Kabaila wrote:
| When I resubmitted, I didn't realise that I needed to change the version 
number.

As this comes up every now and then: This is still a _soft_ requirement. CRAN
does not 'cache' what versions you used in uploads. I have often reiterated
with the same number.

And a small addition: if you like the command-line (in a responsive way)
better than a browser, the package `ciw` on CRAN and the example script ciw.r
(in `littler`, only at github yet) make 'spying' at incoming quite easy and
convenient. Right now, showing the default view as well as some options:

edd@rob:~$ ciw.r
     Folder                     Name                Time   Size           Age
  1: pretest        ppgm_1.0.1.tar.gz 2024-06-07 15:55:00   2.7M    0.25 hours
  2: pretest   AirExposure_1.0.tar.gz 2024-06-07 15:47:00   5.3M    0.38 hours
  3: recheck         cna_3.6.0.tar.gz 2024-06-07 15:36:00   915K    0.57 hours
  4: inspect     admiral_1.1.0.tar.gz 2024-06-07 14:27:00   1.5M    1.72 hours
  5: waiting  TreeTools_1.11.1.tar.gz 2024-06-07 07:37:00   938K    8.55 hours
  6: pending   robusTest_1.1.0.tar.gz 2024-06-06 23:27:00   2.8M   16.72 hours
  7: waiting       boot_1.3-31.tar.gz 2024-06-06 22:17:00   228K   17.88 hours
  8: pending        kit_0.0.18.tar.gz 2024-06-06 20:01:00    89K   20.15 hours
  9: inspect     geodata_0.6-2.tar.gz 2024-06-06 12:00:00    55K   28.17 hours
 10: pending      duckdb_1.0.0.tar.gz 2024-06-06 08:15:00   4.0M   31.92 hours
 11: waiting       hdsvm_1.0.1.tar.gz 2024-06-06 04:08:00    35K   36.03 hours
 12: waiting        hdqr_1.0.1.tar.gz 2024-06-06 03:49:00    35K   36.35 hours
 13: waiting       cofid_1.0.1.tar.gz 2024-06-03 21:53:00   1.2M   90.28 hours
 14: waiting      Matrix_1.7-0.tar.gz 2024-03-20 17:41:00   2.3M 1893.48 hours
edd@rob:~$ ciw.r --help
Usage: ciw.r [-h] [-x] [-a] [-m] [-i] [-t] [-p] [-w] [-r] [-s] [-n] [-u] [-l rows] [-z] [ARG...]

-m --mega   use 'mega' mode of all folders (see --usage)
-i --inspectvisit 'inspect' folder
-t --pretestvisit 'pretest' folder
-p --pendingvisit 'pending' folder
-w --waitingvisit 'waiting' folder
-r --recheckvisit 'recheck' folder
-a --archivevisit 'archive' folder
-n --newbiesvisit 'newbies' folder
-u --publishvisit 'publish' folder
-s --skipsort   skip sorting of aggregate results by age
-l --lines rows print top 'rows' of the result object [default: 50]
-z --ping   run the connectivity check first
-h --help   show this help text
-x --usage  show help and short example usage 
edd@rob:~$

Calling `ciw::ciw()` from your R prompt of course also works (if that is your 
jam).

Dirk


| 
| I changed the version number of my package from 1.2.0 to 1.2.1 and the 
resubmission worked fine.
| 
| Paul Kabaila
| 
| From: R-package-devel  On Behalf Of 
Ben Bolker
| Sent: Thursday, June 6, 2024 8:08 AM
| To: r-package-devel@r-project.org
| Subject: Re: [R-pkg-devel] No email with confirmation link on resubmission of 
package
| 
| Check your spam folder(s) ?
| 
| Try the web form just in
| case something is wonky with devtools::submit_cran()?
| 
| 
| 
| On 2024-06-05 4:19 a.m., Paul Kabaila wrote:
| > (1) Today, I submitted my R package to CRAN using
| > devtools::submit_cran()
| > which resulted in an email from
| > CRAN Package Submission Form <cransub...@xmbombadil.wu.ac.at>
| > with a confirmation link, which I clicked on.
| >
| > (2) I was then sent an email from
| > <lig...@statistik.tu-dortmund.de>
| > which notified that my package "does not pass the incoming checks 
automatically",
| > with 0 errors, 0 warnings and 2 notes.
| >
| >
| > (3) I then fixed the problem for one of the notes and explained the reason 
for the
| > other note and put this information into the file
| > cran-comments.md
| > Then I resubmitted my R package to CRAN using
| > devtools::submit_cran()
| >
| > However, this time I did not receive an email with a confirmation link.
| >
| > Has my package been resubmitted or do I need to resubmit it in some 
different way?
| >
| > Paul Kabaila
| >
| >
| > La Trobe University | TEQSA PRV12132 - Australian University | CRICOS 
Provider 00115M
| >
| > [[alternative HTML version deleted]]
| >
| > __
| > R-package-devel@r-project.org mailing 
list
| > 
https://stat.ethz.ch/mailman/listinfo/r-package-devel

Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-26 Thread Dirk Eddelbuettel


On 26 May 2024 at 13:31, Kurt Hornik wrote:
| >>>>> Dirk Eddelbuettel writes:
| 
| > Kurt,
| 
| > Could you do me a favour and run on that clang18-using machine in question
| > the following one-liner (provided your session has access to a .libPaths()
| > including Rcpp) and, in the case of success, the resulting function?
| 
| I can: 
| 
| R> Rcpp::cppFunction("int ompconfigtest() { return omp_get_num_threads(); }", 
includes="#include <omp.h>", plugin="openmp")
| Warning in Rcpp::cppFunction("int ompconfigtest() { return 
omp_get_num_threads(); }",  :
|   partial argument match of 'plugin' to 'plugins'
| R>  ompconfigtest()
| [1] 1

Hmpf.

| The system has OpenMP, but R was configured not to use it.

Again, it would be lovely if we could _query_ that.

| In general, packages should leave the decision to use OpenMP to the
| *user*, who can use their own Makevars files to override the R system
| Makevars SHLIB_OPENMP_* settings as desired.

Didn't help here, did it?  Or maybe Rcpp was too eager with 

   .plugins[["openmp"]] <- function() {
       list(env = list(PKG_CXXFLAGS="-fopenmp",
                       PKG_LIBS="-fopenmp"))
   }

and we'd need something like this (untested)

   .plugins[["openmp"]] <- function() {
       list(env = list(PKG_CXXFLAGS=Sys.getenv("SHLIB_OPENMP_CXXFLAGS", ""),
                       PKG_LIBS=Sys.getenv("SHLIB_OPENMP_CXXFLAGS", "")))
   }

Actually, that does not work either as it does not come into the R session. It only goes to `make`.

So I got nothing. Sorry.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] gcc14 checks on fedora

2024-05-24 Thread Dirk Eddelbuettel


On 24 May 2024 at 20:01, Brad Eck wrote:
| I received a note that my package -- epanet2toolkit -- was showing a
| warning in the fedora-gcc results on CRAN.
| https://www.stats.ox.ac.uk/pub/bdr/gcc12/epanet2toolkit.out
| 
| I'd like to reproduce the warning and fix it.  Usually I'd do that with
| Docker.  Is anyone aware of a docker image or dockerfile with this setup?

By habit I usually try to see if what I need is already in Debian unstable
(or experimental if I must). So I just fired up 'docker run --rm -ti
rocker/r-base bash' and that one (which I stand behind) does have
gcc-14. Twice, in fact, as 14.1.0 in unstable and 14-20240330 in testing.

There are also some efforts to provide containers of the (generally not
"exported") setup(s) at CRAN which may by now have this.  The last time I
looked for it, it either wasn't quite there yet or fell short for another
reason.

Good luck,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-24 Thread Dirk Eddelbuettel


Kurt,

Could you do me a favour and run on that clang18-using machine in question
the following one-liner (provided your session has access to a .libPaths()
including Rcpp) and, in the case of success, the resulting function?

  > Rcpp::cppFunction("int ompconfigtest() { return omp_get_num_threads(); }", 
includes="#include <omp.h>", plugin="openmp")
  > ompconfigtest()
  [1] 1
  > 

On a "normal" development machine such as mine here it works.

Presumably it will fail at your end because -fopenmp will not be / cannot be
substituted in from the openmp plugin defined by Rcpp:

  ## built-in OpenMP plugin
  .plugins[["openmp"]] <- function() {
      list(env = list(PKG_CXXFLAGS="-fopenmp",
                      PKG_LIBS="-fopenmp"))
  }

but it would be nice to know if that does indeed fail.

Thanks,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-23 Thread Dirk Eddelbuettel


On 23 May 2024 at 20:02, Ivan Krylov wrote:
| On Wed, 22 May 2024 09:18:13 -0500
| Dirk Eddelbuettel  wrote:
| 
| > Testing via 'nm' as you show is possible but not exactly 'portable'.
| > So any suggestions as to what to condition on here?
| 
| (My apologies if you already got an answer from Kurt. I think we're not
| seeing his mails to the list.)
| 
| Perhaps take the configure test a bit further and try to dyn.load() the
| resulting shared object? To be extra sure, call the function that uses
| the OpenMP features? (Some weird systems may have lazy binding enabled,
| making dyn.load() succeed but crashing the process on invocation of a
| missing function.)
| 
| On GNU/Linux, the linker will happily leave undefined symbols in when
| creating a shared library (unlike on, say, Windows, where extern void
| foo(void); foo(); is a link-time error unless an object file or an
| import library providing foo() is also present). When loading such a
| library, the operation fails unless the missing symbols are already
| present in the address space of the process (e.g. from a different
| shared library).
| 
| A fresh process of R built without OpenMP support will neither link in
| the OpenMP runtime while running SHLIB nor have the OpenMP runtime
| loaded and so should successfully fail the test.
| 
| I also wouldn't call the entry point "main" just in case some future
| compiler considers this a violation of the rules™ [*] and breaks the
| code. extern "C" void configtest(int*) would be compatible with .C()
| without having to talk to R's memory manager:
| 
| # The configure script:
| cat > test-omp.cpp <<EOF
| #include <omp.h>
| extern "C" void configtest(int * arg) {
|   *arg = omp_get_num_threads();
| }
| EOF
| # Without the following you're relying on the GNU/Linux-like behaviour
| # w.r.t. undefined symbols (see WRE 1.2.1.1):
| cat > Makevars <<EOF
| [...]
| 
| [*] https://en.cppreference.com/w/cpp/language/main_function

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-23 Thread Dirk Eddelbuettel


On 22 May 2024 at 14:03, Duncan Murdoch wrote:
| On 2024-05-22 10:18 a.m., Dirk Eddelbuettel wrote:
| > 
| > On 22 May 2024 at 13:54, Nixon, Michelle Pistner wrote:
| > | Thank you both for your responses and help! Kurt-- your message makes a 
lot of
| > | sense. I'll try to debug soon and will reach out if I have more questions.
| > 
| > Interesting.
| > 
| > Kurt, is there a recommended way to test for this (rare, I may add) case of
| > 'OpenMP present but usage verboten by R' ?  I do not see an option for 'R 
CMD
| > config' jumping out, and there is no pkg-config file either.
| > 
| > Testing via 'nm' as you show is possible but not exactly 'portable'.  So any
| > suggestions as to what to condition on here?  Michelle did AFAICT the Right
| > Thing (TM) by 'borrowing' from the fairly mature check in RcppArmadillo.
| 
| Not that I know much about writing configure tests, but won't 
| src/include/config.h in the R build dir have "HAVE_OPENMP" defined if
   ^^
| OpenMP is supported?

I think what Michelle and I are after is for an _installed_ version of R to
tell us if we can, or cannot, use OpenMP.  Maybe capabilities() needs an
extension?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-22 Thread Dirk Eddelbuettel


On 22 May 2024 at 13:54, Nixon, Michelle Pistner wrote:
| Thank you both for your responses and help! Kurt-- your message makes a lot of
| sense. I'll try to debug soon and will reach out if I have more questions.

Interesting.

Kurt, is there a recommended way to test for this (rare, I may add) case of
'OpenMP present but usage verboten by R' ?  I do not see an option for 'R CMD
config' jumping out, and there is no pkg-config file either.

Testing via 'nm' as you show is possible but not exactly 'portable'.  So any
suggestions as to what to condition on here?  Michelle did AFAICT the Right
Thing (TM) by 'borrowing' from the fairly mature check in RcppArmadillo.

Dirk

| 
| Thanks,
| Michelle
| 
━━━
| From: Kurt Hornik 
| Sent: Wednesday, May 22, 2024 3:57 AM
| To: Dirk Eddelbuettel 
| Cc: Nixon, Michelle Pistner ; r-package-devel@r-project.org
| ; Kurt Hornik 
| Subject: Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang
| with OpenMP
|  
| 
| >>>>> Dirk Eddelbuettel writes:
| 
| Friends, the Debian pretest check system uses LLVM 18 and has the
| corresponding OpenMP headers and libraries installed, but R is
| configured not to use these.  However, the configure test in fido does
| 
| cat <<EOF > test-omp.cpp
| #include <omp.h>
| int main() {
|   return omp_get_num_threads();
| }
| EOF
| 
| ## Execute R CMD SHLIB.
| "${R_HOME}/bin/R" CMD SHLIB test-omp.cpp >/dev/null 2>&1
| if test x"$?" = x"0"; then
| AC_MSG_RESULT([yes])
| openmp_already_works="yes"
| else
| AC_MSG_RESULT([no])
| fi
| 
| which does not work as intended: R CMD SHLIB will happily link but the
| .so will end up with an unresolved symbol:
| 
| $ nm test-omp.so | grep omp_
|  U omp_get_num_threads
| 
| Hth
| -k
| 
| > Hi Michelle,
| 
| > On 21 May 2024 at 13:46, Nixon, Michelle Pistner wrote:
| > | Hi all,
| > |
| > | I'm running into build issues for my package (fido:
| > | https://github.com/jsilve24/fido) on the r-devel-linux-x86_64-debian-clang
| > | system on CRAN (full check log here:
| > | https://win-builder.r-project.org/incoming_pretest/fido_1.1.0_20240515_211644/Debian/00install.out).
| > | fido relies on several of the Rcpp packages, and I think the error is due to
| > | how OpenMP is set up in our package. The error in question states:
| > |
| > | "Error: package or namespace load failed for 'fido' in dyn.load(file,
| > | DLLpath = DLLpath, ...):
| > |  unable to load shared object '/home/hornik/tmp/R.check/r-devel-clang/Work/
| > | build/Packages/00LOCK-fido/00new/fido/libs/fido.so':
| > |   /home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/
| > | 00new/fido/libs/fido.so: undefined symbol: omp_get_thread_num"
| > |
| > | I've had a hard time recreating the error, as I can successfully get the
| > | package to build on other systems (GitHub action results here:
| > | https://github.com/jsilve24/fido/actions) including a
| > | system using the same version of R/clang as the failing CRAN check. Looking at
| > | the logs between the two, the major difference is the lack of -fopenmp in the
| > | compiling function on the CRAN version (which is there on the r-hub check
| > | version with the same specifications):
| > |
| > | (From the CRAN version) clang++-18 -std=gnu++17 -shared -L/home/hornik/tmp/
| > | R-d-clang-18/lib -Wl,-O1 -o fido.so ConjugateLinearModel.o
| > | MaltipooCollapsed_LGH.o MaltipooCollapsed_Optim.o MatrixAlgebra.o
| > | PibbleCollapsed_LGH.o PibbleCollapsed_Opti

Re: [R-pkg-devel] handling documentation build tools

2024-05-21 Thread Dirk Eddelbuettel


As lyx is not listed in 'Writing R Extensions', the one (authoritative) manual
describing how to build packages for R, I would not assume it to be present
on every CRAN machine building packages. Also note that several users recently
had to ask here how to deal with less common fonts or style files for
(pdf)latex.

So I would recommend 'localising' the pdf creation to your own machine, and
to ship the resulting pdf. You can have pre-made pdfs as the core of a vignette,
a trick I quite like to make package building simpler and more robust.  See
https://www.r-bloggers.com/2019/01/add-a-static-pdf-vignette-to-an-r-package/
for details.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-21 Thread Dirk Eddelbuettel

Hi Michelle,

On 21 May 2024 at 13:46, Nixon, Michelle Pistner wrote:
| Hi all,
| 
| I'm running into build issues for my package (fido: 
https://github.com/jsilve24/fido) on the r-devel-linux-x86_64-debian-clang 
system on CRAN (full check log here: 
https://win-builder.r-project.org/incoming_pretest/fido_1.1.0_20240515_211644/Debian/00install.out).
 fido relies on several of the Rcpp packages, and I think the error is due to 
how OpenMP is set up in our package. The error in question states:
| 
| "Error: package or namespace load failed for �fido� in dyn.load(file, DLLpath 
= DLLpath, ...):
|  unable to load shared object 
'/home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/00new/fido/libs/fido.so':
|   
/home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/00new/fido/libs/fido.so:
 undefined symbol: omp_get_thread_num"
| 
| I've had a hard time recreating the error, as I can successfully get the 
package to build on other systems (GitHub action results here: 
https://github.com/jsilve24/fido/actions) including a system using the same 
version of R/clang as the failing CRAN check. Looking at the logs between the 
two, the major difference is the lack of -fopenmp in the compiling function on 
the CRAN version (which is there on the r-hub check version with the same 
specifications):
| 
| (From the CRAN version) clang++-18 -std=gnu++17 -shared 
-L/home/hornik/tmp/R-d-clang-18/lib -Wl,-O1 -o fido.so ConjugateLinearModel.o 
MaltipooCollapsed_LGH.o MaltipooCollapsed_Optim.o MatrixAlgebra.o 
PibbleCollapsed_LGH.o PibbleCollapsed_Optim.o PibbleCollapsed_Uncollapse.o 
PibbleCollapsed_Uncollapse_sigmaKnown.o RcppExports.o SpecialFunctions.o 
test_LaplaceApproximation.o test_MultDirichletBoot.o test_utils.o 
-L/home/hornik/tmp/R-d-clang-18/lib -lR
| 
| My initial thought was an issue in the configure scripts (which we borrowed 
heavily from RcppArmadillo but made slight changes to (which is the most likely 
cause if there is issue here)) or that there is some mismatch somewhere as to 
whether or not OpenMP is available, but there isn't an obvious bug to me.
| 
| Any guidance on how to debug would be greatly appreciated!

I seem to recall that that machine is 'known-bad' for OpenMP due to the
reliance on clang-18 which cannot (?) build with it.  Might be best to
contact Kurt Hornik (CC'ed) and/or CRAN.

Best, Dirk

 
| Thanks,
| Michelle
| 
| Michelle Nixon, PhD
| 
| Assistant Research Professor
| College of Information Sciences and Technology
| The Pennsylvania State University
| 
|   [[alternative HTML version deleted]]
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN packages dependency on bioconductor packages

2024-05-16 Thread Dirk Eddelbuettel


On 16 May 2024 at 05:34, Duncan Murdoch wrote:
| I forget now, but presumably the thinking at the time was that Suggested 
| packages would always be available for building and checking vignettes.

Yes. I argued for years (cf https://dirk.eddelbuettel.com/blog/2017/03/22/
from seven (!!) years ago) and CRAN is slowly moving away from that implicit
'always there' guarantee to preferring explicit enumerations -- and now even
tests via the NoSuggests flavour.

As Uwe stated in this thread, having the vignette dependencies both in
Suggests as well as in the VignetteHeader should do. And it is the Right
Thing (TM) to do.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Overcoming CRAN's 5mb vendoring requirement

2024-05-09 Thread Dirk Eddelbuettel


Software Heritage (see [1] for their website and [2] for a brief intro I gave
at useR! 2019 in Toulouse) covers GitHub and CRAN [3]. It is by now 'in
collaboration with UNESCO', supported by a long and posh list of sponsors [4]
and about as good as it gets to 'ensure longevity of artifacts'.

It is of course not meant for downloads during frequent builds.

But given the 'quasi-institutional nature' and sponsorship, we could think of
using GitHub as an 'active cache'. But CRAN is CRAN and as it now stands
GitHub is not trusted.  ¯\_(ツ)_/¯

Dirk


[1] https://www.softwareheritage.org/
[2] https://dirk.eddelbuettel.com/papers/useR2019_swh_cran_talk.pdf
[3] https://www.softwareheritage.org/faq/ question 2.1
[4] https://www.softwareheritage.org/support/sponsors/
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Fast Matrix Serialization in R?

2024-05-08 Thread Dirk Eddelbuettel


On 9 May 2024 at 03:20, Sameh Abdulah wrote:
| I need to serialize and save a 20K x 20K matrix as a binary file.

Hm, that is an incomplete specification: _what_ do you want to do with it?
Read it back in R?  Share it with other languages (like Python)?  I.e. what
really is your use case?  Also, you only seem to use readBin / writeBin. Why
not readRDS / saveRDS which at least give you compression?

If it is to read/write from / to R, look into the qs package. It is good. The
README.md at its repo has benchmarks: https://github.com/traversc/qs  If you
want to index into the stored data, look into fst. Else also look at databases.
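
For the R-to-R case the qs interface is just two calls, e.g. (a sketch with a
smaller stand-in for your 20k x 20k matrix):

   m <- matrix(rnorm(2000 * 2000), 2000, 2000)
   qs::qsave(m, "m.qs")
   m2 <- qs::qread("m.qs")
   identical(m, m2)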

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Overcoming CRAN's 5mb vendoring requirement

2024-05-08 Thread Dirk Eddelbuettel


On 8 May 2024 at 11:02, Josiah Parry wrote:
| CRAN has rejected this package with:
| 
| *   Size of tarball: 18099770 bytes*
| 
| *Please reudce to less than 5 MB for a CRAN package.*

Are you by chance confusing a NOTE (issued, but can be overruled) with a
WARNING (more severe, likely a must-be-addressed) or ERROR?

There are lots and lots of packages larger than 5mb -- see eg

   https://cran.r-project.org/src/contrib/?C=S;O=D

which has a top-5 of

   rcdklibs      19mb
   fastrmodels   15mb
   prqlr         15mb
   RFlocalfdr    14mb
   acss.data     14mb

and at least one of those is also Rust-using and hence a possible template.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Problem with loading package "devtools" from CRAN.

2024-04-29 Thread Dirk Eddelbuettel


On 30 April 2024 at 01:21, Rolf Turner wrote:
| On Mon, 29 Apr 2024 06:30:20 -0500
| Dirk Eddelbuettel  wrote:
| 
| 
| 
| > These days, I strongly recommend r2u [1].  As you already use R via
| > CRAN through apt, r2u adds one more repository after which _all_ R
| > packages are handled via the same apt operations that you already
| > trust to get you R from CRAN (as well as anything else on your
| > machine).  This covers all 20+ thousand CRAN packages along with 400
| > key BioC packages. Handling your packages with your system package
| > managed guarantees all dependencies are resolved reliably and
| > quickly. It makes installing, upgrading, managing CRAN package
| > easier, faster and more reliable.
| 
| 
| 
| > [1] https://eddelbuettel.github.io/r2u
| 
| 
| 
| Sounds promising, but I cannot follow what "r2u" is actually
| all about.  What *is* r2u?  And how do I go about using it?  Do I
| invoke it (or invoke something) from within R?  Or do I invoke
| something from the OS?  E.g. something like
| 
| sudo apt-get install 
| 
| ???

You could peruse the documentation at

  https://eddelbuettel.github.io/r2u

and / or the blogposts I have especially below

  https://dirk.eddelbuettel.com/blog/code/r4/

(and you may have to read 'in reverse order').

| I have downloaded the file add_cranapt_jammy.sh and executed
| 
|sudo sh add_cranapt_jammy.sh
| 
| which seemed to run OK.  What now?

Briefly, when you set up r2u you set up a new apt repo AND a new way to
access it from R (using the lovely `bspm` package).  So in R saying
`install.packages("devtools")` will seamlessly fetch r-cran-devtools and
about 100 other files it depends upon (if you start from an 'empty' system as
I did in a container last eve before replying to you). That works in mere
seconds. You can then say `library(devtools)` as if you had compiled locally.

Naturally, using binaries is both way faster and easier when it works (as this
generally does). See the blog posts, see the demos, see the r2u site, try it
(risklessly !!) in a container or at gitpod or in continuous integration or
in codespaces or ...
 
The docs try to get to that. Maybe start small and aim `install.packages()`
at a package you know you do not have and see what happens?

Follow-ups may be more appropriate for r-sig-debian, and/or an issue ticket
at the r2u github repo depending on nature of the follow-up.

Good luck,  Dirk


-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Problem with loading package "devtools" from CRAN.

2024-04-29 Thread Dirk Eddelbuettel


Rolf,

This question might have been more appropriate for r-sig-debian than here.
But as Simon noted, the lack of detail makes it difficult to say anything to
aid. It likely was an issue local to your setup and use.

These days, I strongly recommend r2u [1].  As you already use R via CRAN
through apt, r2u adds one more repository after which _all_ R packages are
handled via the same apt operations that you already trust to get you R from
CRAN (as well as anything else on your machine).  This covers all 20+
thousand CRAN packages along with 400 key BioC packages. Handling your
packages with your system package manager guarantees all dependencies are
resolved reliably and quickly. It makes installing, upgrading, managing CRAN
package easier, faster and more reliable.

To double-check, I just spot-checked 'devtools' on an r2u container (on top
of Ubuntu 22.04) and of course devtools installs and runs fine (as a binary).
So maybe give r2u a go. "Sixteen million packages served" in two years ...

Cheers, Dirk

[1] https://eddelbuettel.github.io/r2u

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-04-02 Thread Dirk Eddelbuettel


On 1 April 2024 at 17:44, Uwe Ligges wrote:
| Untested:
| 
| install.packages() calls available.packages() to find out which packages 
| are available - and passes a "filters" argument if supplied.
| That can be a user defined filter. It should be possible to write a user 
| defined filter which prefers the packages in your local repo.

Intriguing.  Presumably that would work for update.packages() too?

(We actually have a use case at work, and as one way out I created another
side-repo to place a package with an incremented version number so it would
'win' on highest version; this is due to some non-trivial issues with the
underlying dependencies.)
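
Untested, but a user-defined filter along the lines Uwe describes might look
roughly like this (the local repository URL is made up for illustration;
placing the filter before "duplicates" should let the local entry win even
when CRAN carries a higher version number):

   prefer_local <- function(av) {
       loc <- grepl("^https?://my.local.repo", av[, "Repository"])
       shadowed <- !loc & av[, "Package"] %in% av[loc, "Package"]
       av[!shadowed, , drop = FALSE]    # keep local rows plus unshadowed rows
   }
   ap <- available.packages(filters = list("R_version", "OS_type", "subarch",
                                           prefer_local, "duplicates"))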

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-03-31 Thread Dirk Eddelbuettel


On 31 March 2024 at 11:43, Martin Morgan wrote:
| So all repositories are consulted and then the result filtered to contain just
| the most recent version of each. Does it matter then what order the
| repositories are visited?

Right. I fall for that too often, as I did here.  The order matters for
.libPaths() where the first match is used; for package installs the highest
version number (from any entry in getOption("repos")) wins.

Thanks for catching my thinko.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-03-31 Thread Dirk Eddelbuettel


Greg,

There are AFAICT two issues here: how R unrolls the named vector that is the
'repos' element in the list 'options', and how your computer resolves DNS for
localhost vs 172.17.0.1.  I would try something like

   options(repos = c(CRAN = "http://localhost:3001/proxy",
                     C = "http://localhost:3002",
                     B = "http://localhost:3003/proxy",
                     A = "http://localhost:3004"))

or the equivalent with 172.17.0.1. When I do that here I get errors from
first to last as we expect:

   > options(repos = c(CRAN = "http://localhost:3001/proxy",
                       C = "http://localhost:3002",
                       B = "http://localhost:3003/proxy",
                       A = "http://localhost:3004"))
   > available.packages()
   Warning: unable to access index for repository 
http://localhost:3001/proxy/src/contrib:
 cannot open URL 'http://localhost:3001/proxy/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3002/src/contrib:
 cannot open URL 'http://localhost:3002/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3003/proxy/src/contrib:
 cannot open URL 'http://localhost:3003/proxy/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3004/src/contrib:
 cannot open URL 'http://localhost:3004/src/contrib/PACKAGES'
Package Version Priority Depends Imports LinkingTo Suggests Enhances 
License License_is_FOSS License_restricts_use OS_type Archs MD5sum 
NeedsCompilation File Repository
   > 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] using portable simd instructions

2024-03-27 Thread Dirk Eddelbuettel


On 27 March 2024 at 08:48, jesse koops wrote:
| Thank you, I was not aware of the easy way to search CRAN. I looked at
| rcppsimdjson of course, but couldn't figure it out since it is done in
| the simdjson library if I interpret it correctly, not within the R
| ecosystem and I didn't know how that would change things. Writing R
| extensions assumes a lot of prior knowledge so I will have to work my
| way up to there first.

I think I have (at least) one other package doing something like this _in the
library layer too_ as suggested by Tomas, namely crc32c as used by digest.
You could study how crc32c [0] does this for x86_64 and arm64 to get hardware
optimization. (This may be more specific cpu hardware optimization but at
least the library and cmake files are small.)

I decided as a teenager that assembler wasn't for me and haven't looked back,
but I happily take advantage of it when bundled well. So strong second for
the recommendation by Tomas to rely on this being done in an external and
tested library.

(Another interesting one there is highway [1]. Just packaging that would
likely be an excellent contribution.)

Dirk

[0] repo: https://github.com/google/crc32c
[1] repo: https://github.com/google/highway
docs: https://google.github.io/highway/en/master/


| 
| Op di 26 mrt 2024 om 15:41 schreef Dirk Eddelbuettel :
| >
| >
| > On 26 March 2024 at 10:53, jesse koops wrote:
| > | How can I make this portable and CRAN-acceptable?
| >
| > By writing (or borrowing?) some hardware detection via either configure /
| > autoconf or cmake. This is no different from other tasks decided at
| > install-time.
| >
| > Start with 'Writing R Extensions', as always, and work your way up from
| > there. And if memory serves there are already a few other packages with SIMD
| > at CRAN so you can also try to take advantage of the search for a 'token'
| > (here: 'SIMD') at the (unofficial) CRAN mirror at GitHub:
| >
| >https://github.com/search?q=org%3Acran%20SIMD&type=code
| >
| > Hth, Dirk
| >
| > --
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Check results on r-devel-windows claiming error but tests seem to pass?

2024-03-26 Thread Dirk Eddelbuettel


On 26 March 2024 at 09:37, Dirk Eddelbuettel wrote:
| 
| Avi,
| 
| That was a hiccup and is now taken care of. When discussing this (off-line)
| with Jeroen we (rightly) suggested that keeping an eye on

Typo, as usual, "he (rightly) suggested".  My bad.

D.

| 
|https://contributor.r-project.org/svn-dashboard/
| 
| is one possibility to keep track while we have no status alert system from
| CRAN.  I too was quite confused because a new upload showed errors, and
| win-builder for r-devel just swallowed any uploads.
| 
| Cheers, Dirk
| 
| -- 
| dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] using portable simd instructions

2024-03-26 Thread Dirk Eddelbuettel


On 26 March 2024 at 10:53, jesse koops wrote:
| How can I make this portable and CRAN-acceptable?

By writing (or borrowing?) some hardware detection via either configure /
autoconf or cmake. This is no different from other tasks decided at
install-time.

Start with 'Writing R Extensions', as always, and work your way up from
there. And if memory serves there are already a few other packages with SIMD
at CRAN so you can also try to take advantage of the search for a 'token'
(here: 'SIMD') at the (unofficial) CRAN mirror at GitHub:

   https://github.com/search?q=org%3Acran%20SIMD&type=code

Hth, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Check results on r-devel-windows claiming error but tests seem to pass?

2024-03-26 Thread Dirk Eddelbuettel


Avi,

That was a hiccup and is now taken care of. When discussing this (off-line)
with Jeroen we (rightly) suggested that keeping an eye on

   https://contributor.r-project.org/svn-dashboard/

is one possibility to keep track while we have no status alert system from
CRAN.  I too was quite confused because a new upload showed errors, and
win-builder for r-devel just swallowed any uploads.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to store large data to be used in an R package?

2024-03-26 Thread Dirk Eddelbuettel


On 25 March 2024 at 11:12, Jairo Hidalgo Migueles wrote:
| I'm reaching out to seek some guidance regarding the storage of relatively
| large data, ranging from 10-40 MB, intended for use within an R package.
| Specifically, this data consists of regression and random forest models
| crucial for making predictions within our R package.
| 
| Initially, I attempted to save these models as internal data within the
| package. While this approach maintains functionality, it has led to a
| package size exceeding 20 MB. I'm concerned that this would complicate
| submitting the package to CRAN in the future.
| 
| I would greatly appreciate any suggestions or insights you may have on
| alternative methods or best practices for efficiently storing and accessing
| this data within our R package.

Brooke and I wrote a paper on one way of addressing it via a 'data' package
accessible via an Additional_repositories: entry supported by a drat repo.

See https://journal.r-project.org/archive/2017/RJ-2017-026/index.html for the
paper which contains a nice slow walkthrough of all the details.
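
The gist, as a hedged sketch (package, file, and URL names below are made up
for illustration): publish the large data-only package into a drat repository,
then point the smaller CRAN package at it:

   drat::insertPackage("mymodels_1.0.0.tar.gz",   # tarball of the hypothetical data-only package
                       repodir = "~/git/drat")    # local checkout of a drat repo, served e.g. via GitHub Pages
   ## the (small) CRAN package then declares in its DESCRIPTION
   ##   Suggests: mymodels
   ##   Additional_repositories: https://youruser.github.io/drat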

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Request for assistance: error in installing on Debian (undefined symbol: omp_get_num_procs) and note in checking the HTML versions (no command 'tidy' found, package 'V8' unavailable

2024-03-21 Thread Dirk Eddelbuettel


Salut Annaig,

On 21 March 2024 at 09:26, Annaig De-Walsche wrote:
| Dear R-package-devel Community,
| 
| I hope this email finds you well. I am reaching out to seek assistance 
regarding package development in R.
| 
| Specifically, I am currently developing an R package for querying composite 
hypotheses using Rccp. 

My preferred typo. The package is actually called Rcpp (pp as in plus-plus).
 
| Skipping checking HTML validation: no command 'tidy' found
| Skipping checking math rendering: package 'V8' unavailable
| 
| I have searched through the available documentation and resources, but I 
still need help understanding the error and note messages. Hence, I am turning 
to this community, hoping that some of you have encountered similar issues.
| 
| Thank you very much for considering my request. I would be grateful if anyone 
could provide me with some help.
| 
| Best regards,
| Annaïg De Walsche
| Quantitative Genetics and Evolution unit of INRAE
| Gif-sur-Yvette, France
| 

Could you share with us which actual Docker container you started?

| Installing package into ‘/home/docker/R’
| (as ‘lib’ is unspecified)
| 'getOption("repos")' replaces Bioconductor standard repositories, see
| 'help("repositories", package = "BiocManager")' for details.
| Replacement repositories:
| CRAN: https://cloud.r-project.org
| * installing *source* package ‘qch’ ...
| ** using staged installation
| ** libs
| using C++ compiler: ‘g++ (Debian 13.2.0-7) 13.2.0’
| using C++11
| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-I"/usr/local/lib/R/include" -DNDEBUG  -I'/home/docker/R/Rcpp/include' 
-I'/home/docker/R/RcppArmadillo/include' -I/usr/local/include-fpic  -g -O2 
-Wall -pedantic -mtune=native  -c RcppExports.cpp -o RcppExports.o
| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-I"/usr/local/lib/R/include" -DNDEBUG  -I'/home/docker/R/Rcpp/include' 
-I'/home/docker/R/RcppArmadillo/include' -I/usr/local/include-fpic  -g -O2 
-Wall -pedantic -mtune=native  -c updatePrior_rcpp.cpp -o updatePrior_rcpp.o
| updatePrior_rcpp.cpp:55: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|55 |#pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:65: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|65 |  #pragma omp for
|   |
| updatePrior_rcpp.cpp:92: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|92 |  #pragma omp critical
|   |
| updatePrior_rcpp.cpp:178: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   178 |   #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:190: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   190 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:289: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   289 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:301: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   301 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:341: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|   341 | #pragma omp critical
|   |
| updatePrior_rcpp.cpp:409: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   409 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:423: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   423 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:527: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   527 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:539: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   539 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:580: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|   580 | #pragma omp critical
|   |

You seem to be using a number of OpenMP directives. That is good and
performant. But OpenMP cannot be assumed as given; some OSs more or less skip
it altogether, some platforms or compilers may not have it. I ran into the
same issue earlier trying to test something with clang on Linux; it would not
find the OpenMP library gcc happily finds. I moved on in that (local) use case.

In short, you probably want to condition your use of OpenMP.

| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-shared -L/usr/local/lib/R/lib -L/usr/local/lib -o qch.so RcppExports.o 
updatePrior_rcpp.o -L/usr/local/lib/R/lib -lRlapack -L/usr/local/lib/R/lib 
-lRblas -lgfortran -lm -lubsan -lquadmath -L/usr/local/lib/R/lib -lR
| installing to /home/docker/R/00LOCK-qch/00new/qch/libs
| ** R
| ** data
| *** moving datasets to lazyload DB
| ** byte-compile and prepare package for lazy loading
| 'getOption("repos")' replaces Bioconductor standard repositories, see
| 'help("repositories", package = "BiocManager")' for details.
| Replacement repositories:
| CRAN: https://cloud.r-project.org
| Note: wrong nu

Re: [R-pkg-devel] new maintainer for CRAN package XML

2024-03-19 Thread Dirk Eddelbuettel


Dear Uwe,

Did CRAN ever reach a decision here with a suitable volunteer (or group of
volunteers)?  The state of XML came up again recently on mastodon, and it
might be helpful to share an update if there is one.

Thanks, as always, for all you and the rest of the team do for CRAN.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 15:12, Duncan Murdoch wrote:
| On 05/03/2024 2:26 p.m., Dirk Eddelbuettel wrote:
| > The default behaviour is to build after every commit to the main branch.  
But
| > there are options. On the repo I mentioned we use
| > 
| >  "branch": "*release",
| 
| Where do you put that?  I don't see r2u on R-universe, so I guess you're 
| talking about a different repo; which one?

In the (optional) control repo that can drive your 'r-universe'; the file
has to be named 'packages.json'. For you the repo would be

https://github.com/dmurdoch/dmurdoch.r-universe.dev

(and the naming rule was tightened by Jeroen recently -- we used to call
these just 'universe', now it has to match your runiverse)

The file packages.json would then have a block

  {
"package": "rgl",
"maintainer": "Duncan Murdoch "
"url": "https://github.com/dmurdoch/rgl";,
"available": true,
"branch": "*release"
  }

The reference I mentioned is our package 'tiledbsoma' (joint work of TileDB
and CZI, in https://github.com/single-cell-data/TileDB-SOMA) and described here:

https://github.com/TileDB-Inc/tiledb-inc.r-universe.dev/blob/master/packages.json
 

(and you can ignore the '"subdir": "apis/r"' which is a facet local to that 
repo).

Note that 'my' packages.json in my eddelbuettel.r-universe.dev, i.e.

https://github.com/eddelbuettel/eddelbuettel.r-universe.dev/blob/master/packages.json

also describes the package, but without the '"branch": "*release"', and that
builds with every merge to the main branch by my choice; that build is mine
and 'unofficial', giving us two.

| > It is under your control. You could document how to install via `remotes`
| > from that branch.  As so often, it's about trading one thing off for 
another.
| 
| I do that, but my documentation falls off the bottom of the screen, and 
| the automatic docs generated by R-universe are at the top.

I always get lost in the r-universe docs too. Some, as Jeroen kindly reminded
me the other day, are here:  https://github.com/r-universe-org

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 13:28, Duncan Murdoch wrote:
| What I'm seeing is that the tags are ignored, and it is distributing the 
| HEAD of the main branch.  I don't think most users should be using that 
| version:  in my packages it won't have had full reverse dependency 
| checks, I only do that before CRAN releases.  And occasionally it hasn't 
| even passed R CMD check, though that's not my normal workflow.  On the 
| other hand, I like that it's available and easy to install, it just 
| shouldn't be the default install.

The default behaviour is to build after every commit to the main branch.  But
there are options. On the repo I mentioned we use

"branch": "*release",

and now builds occur on tagged releases only. The above is AFAIUI a meta
declaration understood by `remotes`; it was an option suggested by a
colleague.  Naming actual branches also works.
 
| I suppose I could do all development on a "devel" branch, and only merge 
| it into the main branch after I wanted to make a release, but then the 
| R-universe instructions would be no good for getting the devel code.

It is under your control. You could document how to install via `remotes`
from that branch.  As so often, it's about trading one thing off for another.
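
For instance, the README could point users at the release tag or the default
branch explicitly (a sketch only; `remotes` understands the special '*release'
ref):

   remotes::install_github("dmurdoch/rgl@*release")   # latest tagged release
   remotes::install_github("dmurdoch/rgl")            # HEAD of the default branch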

| I don't know anything about dpkg, but having some options available to 
| package authors would be a good thing.

Yes but you know {install,available}.packages and have some understanding of
how R identifies and installs packages. I merely illustrated a different use
pattern of giving "weights" to repos. If "we all" want different behaviour,
someone has to sit down and write it. Discussing some possible specs and
desired behaviour may help.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 11:56, Duncan Murdoch wrote:
| I have mixed feelings about r-universe.  On the one hand, it is really 
| nicely put together, and it offers the service described above.  On the 
| other, it's probably a bad idea to follow its advice and use 
| install.packages() with `repos` as shown:  that will install development 
| versions of packages, not releases.

Yup. It's a point I raised right at the start as I really do believe in
curated releases but clearly a lot of people prefer the simplicity of
'tagging a release' at GitHub and then getting a build.

r-universe is indeed good at what it does and reliable. There are limited
choices in 'driving' what you can do with it.  We rely quite heavily on it in
a large project for work.  As each 'repo' can appear only once in a universe,
we resorted to having the 'official' build follow GitHub 'releases', as well
as (optional, additional) builds against the main branch from another
universe.  This example is for a non-CRAN package.

With CRAN packages, r-universe can be useful too. For some of my packages, I
now show multiple 'badges' at the README: for the released CRAN version as
well as for the current 'rc' in the main branch sporting a differentiating
final digit.  RcppArmadillo had a pre-release available to test that way for
a few weeks until the new release this week.  So in effect, this gives you what
`drat` allows yet also automagically adds builds. It's quite useful when you
are careful about it.
 
| Do you know if it's possible for a package to suggest the CRAN version 
| first, with an option like the above only offered as a pre-release option?

In the language of Debian and its dpkg and tools, one solution to that would
be 'repository pinning' to declare a 'value' on a repository.  There, the
default is 500, and e.g. for r2u I set this to 700 as you usually want its
versions.

We do not have this for R, but it could be added (eventually) as a new value
in PACKAGES, or as a new supplementary attribute.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 06:25, Duncan Murdoch wrote:
| You could make a compatible version of `survivalmodels` available on a 
| non-CRAN website, and refer to that website in the 
| Additional_repositories field of DESCRIPTION.

Every r-universe sub-site fits that requirement. For this package Google's
first hit was https://raphaels1.r-universe.dev/survivalmodels and it carries
the same line on install.packages() that Jeroen adds to every page:

 install.packages('survivalmodels', repos = 
c('https://raphaels1.r-universe.dev',
  'https://cloud.r-project.org'))

So doing all three of
- adding a line 'Additional_repositories: https://raphaels1.r-universe.dev'
- adding a line 'Suggests: survivalmodels'
- ensuring conditional use only, as Suggests != Depends (see the sketch below)
should do.
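
The conditional-use part is the usual requireNamespace() guard in the calling
code (a sketch; the wrapped functionality is left as a comment):

   if (requireNamespace("survivalmodels", quietly = TRUE)) {
       # only here call into the suggested package, e.g. via survivalmodels::...
   } else {
       stop("Package 'survivalmodels' is needed here; see Additional_repositories.")
   }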

| It would be best if you fixed whatever issue caused survivalmodels to be 
| archived when you do this.
| 
| Looking here: 
| 
https://cran-archive.r-project.org/web/checks/2024/2024-03-02_check_results_survivalmodels.html
| that appears very easy to do.  The source is here: 
| https://github.com/RaphaelS1/survivalmodels/ .

The author may even take a PR fixing this going forward.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unable to access log operator in C

2024-02-28 Thread Dirk Eddelbuettel


On 28 February 2024 at 19:05, Avraham Adler wrote:
| I am hoping the solution to this question is simple, but I have not
| been able to find one. I am building a routine in C to be called from
| R. I am including Rmath.h. However, when I have a call to "log", I get
| the error "called object 'log' is not a function or a function
| pointer. When I "trick" it by calling log1p(x - 1), which I *know* is
| exported from Rmath.h, it works.
| 
| More completely, my includes are:
| #include 
| #include 
| #include 
| #include 
| #include  // for NULL
| #include 
| 
| The object being logged is a double, passed into C as an SEXP, call it
| "a", which for now will always be a singleton. I initialize a pointer
| double *pa = REAL(a). I eventually call log(pa[0]), which does not
| compile and throws the error listed above. Switching the call to
| log1p(pa[0] - 1.0) works and returns the proper answer.
| 
| Even including math.h explicitly does not help, which makes sense as
| it is included by Rmath.h.

Can you show the actual line?  Worst case rename your source file to end in
.cpp, include <cmath> and call std::log.

  > Rcpp::cppFunction("double mylog(double x) { return std::log(x); }")
  > mylog(exp(42))
  [1] 42
  > 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available: ‘arrow’

2024-02-25 Thread Dirk Eddelbuettel


On 26 February 2024 at 09:19, Simon Urbanek wrote:
| [requiring an increased version is the] best way [..] and certainly the only good practice.

No, not really. Another viewpoint, which is implemented in another project I
contribute to, is where a version + build_revision tuple exists if, and only
if, the underlying upload was accepted. Until then upload iterations are fine.

Hence s/only good practice/one possible way/.

Anyway: `arrow` is long back at CRAN (yay!) so this thread is done anyway.

Dirk
 
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available: ‘arrow’

2024-02-23 Thread Dirk Eddelbuettel


On 23 February 2024 at 15:53, Leo Mada wrote:
| Dear Dirk & R-Members,
| 
| It seems that the version number is not incremented:
| # Archived
| arrow_14.0.2.1.tar.gz   2024-02-08 11:57  3.9M
| # Pending
| arrow_14.0.2.1.tar.gz   2024-02-08 18:24  3.9M
| 
| Maybe this is the reason why it got stuck in "pending".

No it is not.

The hint to increase version numbers on re-submission is a weaker 'should' or
'might', not a strong 'must'.

I have uploaded a few packages to CRAN over the last two decades, and like
others have made mistakes requiring iterations. I have not once increased a
version number.  If/when CRAN sees an error in its (automated, largely)
processing, the package is moved and the space is cleared allowing a fresh
upload. (Of course you cannot upload under the same filename twice _before_
the initial processing. By default uploads do not overwrite.)  Archive/ is
distinct from pending.  POSIX semantics on times also help: your example
clearly shows that the one in archived is older by about 6 1/2 hours. 

That said, in case there are multiple rounds of email and discussion having
distinct numbers may ease identification of the particular package and
discussion thread. But it still makes sense to have this be a suggestion, not
a requirement.
 
Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available:‘arrow’

2024-02-22 Thread Dirk Eddelbuettel


On 22 February 2024 at 04:01, Duncan Murdoch wrote:
| For you to deal with this, you should make arrow into a suggested 
| package,

For what it is worth, that is exactly what package tiledb does.

Yet the Suggests: still led to a NOTE requiring a human to override, which
did not happen until I gently nudged after the 'five work days' had lapsed.

So full agreement that 'in theory' a Suggests: should help and is the weaker
and simpler dependency.  However 'in practice' it can still lead to being
held up when the weak-dependency package does not build.

[ As for Dénes's point, most if not all the internals in package tiledb are
  actually based on nanoarrow, but we offer one code path returning an Arrow Table
  object and that requires 'arrow' the package for the instantiation.

  So it really all boils down to 'Lightweight is the right weight' as we say
  over at www.tinyverse.org.  But given that the public API offers an Arrow
  accessor, it is a little late to pull back from it.  And Arrow is a powerful
  and useful tool. Building it, however, can have its issues... ]

Anyway, while poking around the issue when waiting, I was also told by Arrow
developers that the issue (AFAICT a missing header) is fixed, and looking at
CRAN's incoming reveals the package has been sitting there since Feb 8 (see
https://cran.r-project.org/incoming/pending/).  So it would be good to hear from
CRAN what if anything is happening here.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN uses an old version of clang

2024-02-09 Thread Dirk Eddelbuettel


On 9 February 2024 at 08:59, Marcin Jurek wrote:
| I recently submitted an update to my package. Its previous version relied on
| Boost for Bessel and gamma functions but a colleague pointed out to me that
| they are included in the standard library beginning with the C++17
| standard.

There is an often overlooked bit of 'fine print': _compiler support_ for a
C++ standard is not the same as the _compiler shipping a complete library_
for that same standard. This can be frustrating. See the release notes for
gcc/g++ and clang/clang++, IIRC they usually have a separate entry for C++
library support.

In this case, you can probably rely on LinkingTo: BH, which has been helping with
Boost headers for over a decade.

Writing R Extensions is also generally careful in reminding us that such
language standard support is always dependent on the compiler at hand. So
package authors ought to check, just like R does via its extensive configure
script when it builds.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] failing CRAN checks due to problems with dependencies

2024-02-08 Thread Dirk Eddelbuettel


On 8 February 2024 at 13:28, Marcin Jurek wrote:
| Ok, this makes sense! I saw that Rcpp was failing the checks too but I
| wasn't sure if I should resubmit or wait. Thanks!

For completeness, it was not caused by Rcpp but rather by a mix of new clang
and gcc versions which somehow got into each other's way on that machine; and
this has by now been fixed.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] r-oldrel-linux- not in CRAN checks?

2024-02-07 Thread Dirk Eddelbuettel


On 7 February 2024 at 09:15, Vincent van Hees wrote:
| Thanks Ivan, In that case I will conclude that it is time to upgrade my
| Ubuntu 18 machine. I just wasn't sure whether there is still a need for
| keeping my own package Ubuntu 18 compatible, but if dependencies like Rfast
| do not do it and if it is even not in the CRAN checks anymore then there is
| also limited value in me making the effort.

I think that is a good conclusion.  A few more observations:

- for #r2u I package / build all of CRAN for Ubuntu (both 20.04 and 22.04),
  there are a handful of CRAN packages I cannot build on Ubuntu 20.04 (!!!)
  because they use C++20 which the default compiler on 20.04 does not support

- in my dayjob (also behind one large CRAN package I maintain) we had to move
  all CI jobs from 20.04 to 22.04 for the same reason. I think this will be
  more, not less, common.

- your premise in your initial email was not quite supported by either
  "Writing R Extensions" nor the "CRAN Repository Policy": Neither stipulates
  a minimum 'old' environment.

FWIW I am a fairly happy camper with 22.04 for deployment, 23.10 for my use
and will likely move to 24.04 fairly soon this summer after it is released.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Bioconductor reverse dependency checks for a CRAN package

2024-01-30 Thread Dirk Eddelbuettel


Ivan,

On 30 January 2024 at 18:56, Ivan Krylov via R-package-devel wrote:
| Hello R-package-devel,
| 
| What would you recommend in order to run reverse dependency checks for
| a package with 182 direct strong dependencies from CRAN and 66 from
| Bioconductor (plus 3 more from annotations and experiments)?
| 
| Without extra environment variables, R CMD check requires the Suggested
| packages to be available, which means installing...
| 
| revdepdep <- package_dependencies(revdep, which = 'most')
| revdeprest <- package_dependencies(
|  unique(unlist(revdepdep)),
|  which = 'strong', recursive = TRUE
| )
| length(setdiff(
|  unlist(c(revdepdep, revdeprest)),
|  unlist(standard_package_names())
| ))
| 
| ...up to 1316 packages. 7 of these suggested packages aren't on CRAN or
| Bioconductor (because they've been archived or have always lived on
| GitHub), but even if I filter those out, it's not easy. Some of the
| Bioconductor dependencies are large; I now have multiple gigabytes of
| genome fragments and mass spectra, but also a 500-megabyte arrow.so in
| my library. As long as a data package declares a dependency on your
| package, it still has to be installed and checked, right?
| 
| Manually installing the SystemRequirements is no fun at all, so I've
| tried the rocker/r2u container. It got me most of the way there, but
| there were a few remaining packages with newer versions on CRAN. For

If that happens, please file an issue ticket at the r2u site.  CRAN should be
current as I update every business day whenever p3m does, and hence will be as
current as other approaches using it (and encodes the genuine system dependencies).

BioConductor in r2u is both more manual (and I try to update "every few
days") and course not complete so if you miss a package _from BioConductor_
again please just file an issue ticket.

| these, I had to install the system packages manually in order to build
| them from source.

For what it is worth, my own go-to for many years has been a VM in which I
install 'all packages needed' for the rev.dep to be checked. Doing it with
on-demand 'lambda functions' (one per package tested) based on r2u would be a
nice alternative but I don't have the AWS credits to try it...
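
One way to populate such a library up front, building on the tools calls quoted
above (a sketch only; 'mypkg' is a placeholder for the package being checked):

   db   <- available.packages()
   rev  <- tools::package_dependencies("mypkg", db = db, reverse = TRUE,
                                       which = c("Depends", "Imports", "LinkingTo", "Suggests"))[[1]]
   need <- unique(unlist(tools::package_dependencies(rev, db = db, which = "most", recursive = TRUE)))
   install.packages(setdiff(c(rev, need), rownames(installed.packages())))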

| Someone told me to try the rocker/r-base container together with pak.
| It was more proactive at telling me about dependency conflicts and
| would have got me most of the way there too, except it somehow got me a
| 'stringi' binary without the corresponding libicu*.so*, which stopped
| the installation process. Again, nothing that a bit of manual work
| wouldn't fix, but I don't feel comfortable setting this up on a CI
| system. (Not on every commit, of course - that would be extremely
| wasteful - but it would be nice if it was possible to run these checks
| before release on a different computer and spot more problems this way.)
| 
| I can't help but notice that neither install.packages() nor pak() is
| the recommended way to install Bioconductor packages. Could that
| introduce additional problems with checking the reverse dependencies?

As Martin already told you, BioConductor has always had their own
installation wrapper because they are a 'little different' with the bi-annual
release cycle.
 
| Then there's the check_packages_in_dir() function itself. Its behaviour
| about the reverse dependencies is not very helpful: they are removed
| altogether or at least moved away. Something may be wrong with my CRAN
| mirror, because some of the downloaded reverse dependencies come out
| with a size of zero and subsequently fail the check very quickly.
| 
| I am thinking of keeping a separate persistent library with all the
| 1316 dependencies required to check the reverse dependencies and a

As stated above, that is what works for me. I used to use a chroot directory
on a 'big server', now I use a small somewhat underpowered VM.

| persistent directory with the reverse dependencies themselves. Instead
| of using the reverse=... argument, I'm thinking of using the following
| scheme:
| 
| 1. Use package_dependencies() to determine the list of packages to test.
| 2. Use download.packages() to download the latest version of everything
| if it doesn't already exist. Retry if got zero-sized or otherwise
| damaged tarballs. Remove old versions of packages if a newer version
| exists.
| 3. Run check_packages_in_dir() on the whole directory with the
| downloaded reverse dependencies.
| 
| For this to work, I need a way to run step (3) twice, ensuring that one
| of the runs is performed with the CRAN version of the package in the
| library and the other one is performed with the to-be-released version
| of the package in the library. Has anyone already come up with an
| automated way to do that?
| 
| No wonder nobody wants to maintain the XML package.

Well, a few of us maintain packages with quite a tail and cope. Rcpp has 2700,
RcppArmadillo has over 100, BH a few hundred. These aren't 'light'. I wrote
myself the `prrd` package (on 

Re: [R-pkg-devel] lost braces note on CRAN pretest related to \itemize

2024-01-23 Thread Dirk Eddelbuettel


On 23 January 2024 at 19:39, Patrick Giraudoux wrote:
| Has anyone an idea about what is going wrong ?

\item has no braces following it.  From a package I submitted today and for
which I still have NEWS.Rd in the editor (indented here):

  \section{Changes in version 0.0.22 (2024-01-23)}{
\itemize{
  \item Replace empty examples macros to satisfy CRAN request.
}
  }

Hth,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] New Package Removal because Shared Library Too Large from Debugging Symbols

2024-01-20 Thread Dirk Eddelbuettel


Johann,

On 20 January 2024 at 14:38, Johann Gaebler wrote:
| Hi everyone,
| 
| I received the following message regarding  `rar` 
, a package that I put up on CRAN two 
days ago:
| 
| > Dear maintainer,
| > 
| > Please see the problems shown on
| > .
| > 
| > Please correct before 2024-02-02 to safely retain your package on CRAN.
| 
| The issue is that the compiled libraries are too large. The Mac CRAN checks 
turned up the following note:

No you read that wrong. That is NOT the issue. That is a mere 'Note'.

Your issue is the bright red link labeled 'LTO' on that page, going to

  https://www.stats.ox.ac.uk/pub/bdr/LTO/rar.out
  
where it details an error on that platform / compilation choice:

  g++-13 -std=gnu++17 -shared -L/usr/local/gcc13/lib64 -L/usr/local/lib64 -o 
rar.so cpp11.o distpt.o iter.o max.o min.o regdata.o sens.o test-distpt.o 
test-iter.o test-max.o test-min.o test-regdata.o test-runner.o test-sens.o
  cpp11.cpp:18:13: warning: 'run_testthat_tests' violates the C++ One 
Definition Rule [-Wodr]
 18 | extern SEXP run_testthat_tests(void *);
| ^
  /data/gannet/ripley/R/test-dev/testthat/include/testthat/testthat.h:172:17: 
note: 'run_testthat_tests' was previously declared here
172 | extern "C" SEXP run_testthat_tests(SEXP use_xml_sxp) {
| ^
  make[2]: Leaving directory '/data/gannet/ripley/R/packages/tests-LTO/rar/src'
  installing to 
/data/gannet/ripley/R/packages/tests-LTO/Libs/rar-lib/00LOCK-rar/00new/rar/libs

This 'violates the C++ One Definition Rule' is something that started with
g++-13, if memory serves. Without looking at the code, I think you did
something that led to a symbol being included multiple times, and it should
not be.

Cheers, Dirk

 
| > installed size is  8.9Mb
| > sub-directories of 1Mb or more:
| >  libs   8.7Mb
| 
| I have not been able to reproduce the issue either locally or on any machine 
I have ready access to. I have built it on some of the Rhub and R-Project build 
systems, and the same issue (with very different `libs` sizes) came up on some 
of them:
| 
| • (RHub) Ubuntu Linux 20.04.1 LTS, R-release, GCC: 18.2Mb,
| • (RHub) Fedora Linux, R-devel, clang, gfortran: 6.8Mb,
| • (R-Project) r-release-macosx-arm64: 8.5Mb.
| 
| Based on trying to read up about this, it seems that this is a pretty common 
problem 
 
for compiled packages because of debugging symbols getting inserted into the 
shared library file. Using the fix from that blog post where you modify the 
Makevars to strip debugging symbols from the shared library seems to solve the 
issue on those build systems, so I feel reasonably confident that this is 
what’s going on.
| 
| Apparently many, many existing packages on CRAN have the same issue. However, 
I’m very new to R package development, so I’m not exactly sure what to do. I 
have two questions:
| 
| 1. Is there anything I need to “fix” here, or should I just make contact with 
the CRAN folks and bring the fact that this is being caused by debugging 
symbols to their attention?
| 2. Regardless of whether or not I have to fix this issue for CRAN, is there a 
way to strip out the debugging symbols that comports with CRAN policies? The 
method suggested in the blog post above (adding a phony target in `Makevars` 
that strips the shared library) seems not to be CRAN-compliant, but I could be 
mistaken about that. (In particular, I had to modify it locally to get it to 
run, so I’m not sure what the platform-independent version of it looks like.)
| 
| Thanks in advance for the help!
| 
| Sincerely,
| Johann D. Gaebler
|   [[alternative HTML version deleted]]
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] current docker image for ASAN

2024-01-16 Thread Dirk Eddelbuettel


On 16 January 2024 at 15:54, Steven Scott wrote:
| Greetings everyone, though I expect this message is mainly for Dirk.
| 
| CRAN checks of my bsts/Boom package generate an ASAN error that the CRAN
| maintainers have asked me to look into.  I recall doing this before (this
| error has been there for several years now) via a docker image that Dirk
| had set up.  I have two questions.
| 
| 1) Which docker image should I use?  I imagine it has been updated since
| the last time I tried.
| 2) Is the image built with an asan-appropriate libc++?  I'm asking because
| the last time I tried tracking down this error, ASAN identified that there
| was an error, but pointed to an irrelevant section of code.  Brian thinks
| libc++ is the culprit.

Thanks -- maybe see prior messages. The container is on a scheduled weekly
rebuild, but for reasons that long escape me I also switched at some point to
relying on the 'sumo' container Winston builds by collating several such
containers and have used this myself for several debugging trips:

   https://github.com/wch/r-debug

You want RDsan and/or RDcsan therein.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] checking CRAN incoming feasibility

2024-01-16 Thread Dirk Eddelbuettel


On 17 January 2024 at 09:42, Simon Urbanek wrote:
| that check always hangs for me (I don't think it likes NZ ;)), so I just use
| 
| _R_CHECK_CRAN_INCOMING_REMOTE_=0 R CMD check --as-cran ...

You can also set it in Renviron files consulted just for checks:

  $ grep INCOMING_= ~/.R/check.Renviron*
  /home/edd/.R/check.Renviron:_R_CHECK_CRAN_INCOMING_=FALSE
  /home/edd/.R/check.Renviron-Rdevel:_R_CHECK_CRAN_INCOMING_=TRUE
  $ 

Best, Dirk

| 
| Cheers,
| Simon
| 
| 
| > On Jan 16, 2024, at 6:49 PM, Rolf Turner  wrote:
| > 
| > 
| > On Tue, 16 Jan 2024 16:24:59 +1100
| > Hugh Parsonage  wrote:
| > 
| >>> Surely the software just has to check
| >> that there is web connection to a CRAN mirror.
| >> 
| >> Nope! The full code is in tools:::.check_package_CRAN_incoming  (the
| >> body of which filled up my entire console), but to name a few checks
| >> it has to do: check that the name of the package is not the same as
| >> any other, including archived packages (which means that it has to
| >> download the package metadata), make sure the licence is ok, see if
| >> the version number is ok. 10 minutes is quite a lot though. I suspect
| >> the initial connection may have been faulty.
| > 
| > Well, it may not have been 10 minutes, but it was at least 5.  The
| > problem is persistent/repeatable.  I don't believe that there is any
| > faulty connection.
| > 
| > Thanks for the insight.
| > 
| > cheers,
| > 
| > Rolf Turner
| > 
| > -- 
| > Honorary Research Fellow
| > Department of Statistics
| > University of Auckland
| > Stats. Dep't. (secretaries) phone:
| > +64-9-373-7599 ext. 89622
| > Home phone: +64-9-480-4619
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| >
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] test failure: oldrel

2024-01-16 Thread Dirk Eddelbuettel


On 16 January 2024 at 10:28, Josiah Parry wrote:
| Oddly making the change has made CI happy. 
| 
https://github.com/R-ArcGIS/arcgisutils/actions/runs/7543315551/job/20534063601
| 
| It may be that the issue was OS related but I'm unsure since only oldrel for
| windows and macos check results are published https://cran.r-project.org/web/
| checks/check_results_arcgisutils.html

Seb solved the puzzle (in direct email to me). It has to do with the fact
that _the container_ defaults to UTC.  If I add '-e TZ=America/Chicago' to
the invocation we do indeed see a difference between r-release and r-oldrel
(and I also brought the version string display inside R):

edd@rob:~$ for v in 4.3.2 4.2.2; do docker run --rm -ti -e TZ=America/Chicago 
r-base:${v} Rscript -e 'cat(format(getRversion()), 
format(as.POSIXct(Sys.Date(), tz = "UTC")), Sys.getenv("TZ"), "\n")'; done
4.3.2 2024-01-16 America/Chicago 
4.2.2 2024-01-15 18:00:00 America/Chicago 
edd@rob:~$ 
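
If one wants identical behaviour across versions, converting via a character
string first sidesteps the Date method entirely -- a sketch, not exhaustively
tested against old releases:

   as.POSIXct(format(Sys.Date()), tz = "UTC")   # midnight UTC regardless of local TZ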

Thanks to Seb for the cluebat wave.

Dirk

| 
| 
| On Tue, Jan 16, 2024 at 9:59 AM Dirk Eddelbuettel  wrote:
| 
| 
| Doesn't seem to be the case as it moderately easy to check (especially 
when
| you happen to have local images of r-base around anyway):
| 
| edd@rob:~$ for v in 4.3.2 4.2.2 4.1.3 4.0.5 3.6.3 3.5.3 3.4.4 3.3.3; do
| echo -n "R ${v}: "; docker run --rm -ti r-base:${v} Rscript -e 'as.POSIXct
| (Sys.Date(), tz = "UTC")'; done
| R 4.3.2: [1] "2024-01-16 UTC"
| R 4.2.2: [1] "2024-01-16 UTC"
| R 4.1.3: [1] "2024-01-16 UTC"
| R 4.0.5: [1] "2024-01-16 UTC"
| R 3.6.3: [1] "2024-01-16 UTC"
| R 3.5.3: [1] "2024-01-16 UTC"
| R 3.4.4: [1] "2024-01-16 UTC"
| R 3.3.3: [1] "2024-01-16 UTC"
| edd@rob:~$
| 
| Dirk
| 
| --
| dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] test failure: oldrel

2024-01-16 Thread Dirk Eddelbuettel


Doesn't seem to be the case as it is moderately easy to check (especially when
you happen to have local images of r-base around anyway):

edd@rob:~$ for v in 4.3.2 4.2.2 4.1.3 4.0.5 3.6.3 3.5.3 3.4.4 3.3.3; do echo -n 
"R ${v}: "; docker run --rm -ti r-base:${v} Rscript -e 'as.POSIXct(Sys.Date(), 
tz = "UTC")'; done
R 4.3.2: [1] "2024-01-16 UTC"
R 4.2.2: [1] "2024-01-16 UTC"
R 4.1.3: [1] "2024-01-16 UTC"
R 4.0.5: [1] "2024-01-16 UTC"
R 3.6.3: [1] "2024-01-16 UTC"
R 3.5.3: [1] "2024-01-16 UTC"
R 3.4.4: [1] "2024-01-16 UTC"
R 3.3.3: [1] "2024-01-16 UTC"
edd@rob:~$ 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggests with non-CRAN packages

2024-01-10 Thread Dirk Eddelbuettel


On 10 January 2024 at 16:25, Uwe Ligges wrote:
| 
| 
| On 10.01.2024 15:35, Josiah Parry wrote:
| > Thanks, all. As it goes, the package submission failed. The package that 
| > is suggested is available at https://r.esri.com/bin/ and as such provided
| > `https://r.esri.com` as the url in `Additional_repositories`.
| 
| There is no
| 
| https://r.esri.com/src
| 
| hence it is obviously not a standard repository.

And how to set one up is described very patiently over ten pages in

   Hosting Data Packages via drat: A Case Study with Hurricane Exposure Data

at

   https://journal.r-project.org/archive/2017/RJ-2017-026/index.html

which does

   Abstract Data-only packages offer a way to provide extended functionality
   for other R users. However, such packages can be large enough to exceed
   the package size limit (5 megabytes) for the Comprehensive R Archive
   Network (CRAN). As an alternative, large data packages can be posted to
   additional repositories beyond CRAN itself in a way that allows smaller
   code packages on CRAN to access and use the data. The drat package
   facilitates creation and use of such alternative repositories and makes it
   particularly simple to host them via GitHub. CRAN packages can draw on
   packages posted to drat repositories through the use of the
   ‘Additional_repositories’ field in the DESCRIPTION file. This paper
   describes how R users can create a suite of coordinated packages, in which
   larger data packages are hosted in an alternative repository created with
   drat, while a smaller code package that interacts with this data is
   created that can be submitted to CRAN.

for the use case of a 'too large for CRAN' suggested data package

| > The request was to remove the additional repositories and provide 
| > instructions for package installation in the Description field. This 
| > package, arcgisbinding, is used in one line of the entire package 
| > 
https://github.com/R-ArcGIS/arcgisutils/blob/64093dc1a42fa28010cd45bb6ae8b8c57835cb40/R/arc-auth.R#L123
 

 to extract an authorization token. It is provided for compatibility with a
semi-closed-source R package. The installation instructions for it are lengthy
(https://r.esri.com/r-bridge-site/arcgisbinding/installing-arcgisbinding.html)
and *only* available as a Windows binary. Providing an explicit call-out for
installation in the "Description" field of the DESCRIPTION feels like it is
co-opting the Description to describe the installation process for a function
that I anticipate *very few* people to use.
| 
| So you can either remove the need for that package or say something like 
| " and if an authorization token is to be extracted on Windows, the 
| 'arcgisbinding' package is needed that can be installed as explained at 
| ."

Additional_repositories is great, and you have 134 examples at CRAN:

> D <- data.table(tools::CRAN_package_db())
> D[is.na(Additional_repositories)==FALSE, .(Package, Additional_repositories)]
                   Package                                          Additional_repositories
   1:            archiDART                                https://archidart.github.io/drat/
   2:           aroma.core https://henrikbengtsson.r-universe.dev,\nhttps://r-forge.r-project.org
   3:             asteRisk                             https://rafael-ayala.github.io/drat/
   4:            BayesfMRI                       https://inla.r-inla-download.org/R/testing
   5:                bigDM                        https://inla.r-inla-download.org/R/stable
  ---
 130:    TreatmentPatterns                                     https://ohdsi.github.io/drat
 131:             TreeDist                                https://ms609.github.io/packages/
 132:         triplesmatch                                  https://errickson.net/rrelaxiv/
 133: USA.state.boundaries                                   https://iembry.gitlab.io/drat/
 134:                  voi                       https://inla.r-inla-download.org/R/stable/
>

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] portability question

2023-12-20 Thread Dirk Eddelbuettel


The point of my email was that

   if [ `uname -s` = 'Darwin' ]; then ...

allows for a clean branch between the (new here) macOS behaviour and (old,
prior) behavior removing all concerns about 'portability' (per the Subject:).

You missed 100% of that.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] portability question

2023-12-20 Thread Dirk Eddelbuettel


On 20 December 2023 at 11:10, Steven Scott wrote:
| The Boom package builds a library against which other packages link.  The
| library is built using the Makevars mechanism using the line
| 
| ${AR} rc $@ $^
| 
| A user has asked me to change 'rc' to 'rcs' so that 'ranlib' will be run on
| the archive.  This is apparently needed for certain flavors of macs.  I'm
| hoping someone on this list can comment on the portability of that change
| and whether it would negatively affect other platforms.  Thank you.

Just branch for macOS.  Here is a line I 'borrowed' years ago from data.table
and still use for packages that need to call install_name_tool on macOS.  You
could have a simple 'true' branch of the test use 'rcs' and the 'false'
branch do what you have always done.  Without any portability concerns.

From https://github.com/Rdatatable/data.table/blob/master/src/Makevars.in#L14
and indented here for clarity

if [ "$(OS)" != "Windows_NT" ] && [ `uname -s` = 'Darwin' ]; then \
   install_name_tool -id data_table$(SHLIB_EXT) data_table$(SHLIB_EXT); 
\
fi

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN submission struggle

2023-12-16 Thread Dirk Eddelbuettel


Christiaan,

You say "errors" but you don't say which.

You say you have a package, but don't provide a source reference.

This makes it awfully hard to say or do anything. In case you are on github
or gitlab or ... it would simply be easiest to share a reference to the
repository.  Emailing 10mb blobs to every list subscriber is not ideal.

Dirk

PS Fortunes has that covered too

   > fortunes::fortune("mind read")

   There are actual error messages, and until you show them, we can not help as 
the mind
   reading machine is currently off for repairs.
  -- Dirk Eddelbuettel (after reports about errors with R CMD check)
 R-help (July 2010)

   > 
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Wrong mailing list: Could the 100 byte path length limit be lifted?

2023-12-13 Thread Dirk Eddelbuettel


On 13 December 2023 at 16:02, Tomas Kalibera wrote:
| 
| On 12/13/23 15:59, Dirk Eddelbuettel wrote:
| > On 13 December 2023 at 15:32, Tomas Kalibera wrote:
| > | Please don't forget about what has been correctly mentioned on this
| > | thread already: there is essentially a 260 character limit on Windows
| > | (see
| > | 
https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html
| > | for more). Even if the relative path length limit for a CRAN package was
| > | no longer regarded important for tar compatibility, it would still make
| > | sense for compatibility with Windows. It may still be a good service to
| > | your users if you keep renaming the files to fit into that limit.
| >
| > So can lift the limit from 100 char to 260 char ?
| 
| The 260 char limit is for the full path. A package would be extracted in 
| some directory, possibly also with a rather long name.

Call a cutoff number. 

Any move from '100' to '100 + N' for any nonzero N is a win. Pick one, and
then commit the change.  N = 50 would be a great start as arbitrary as it is.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Wrong mailing list: Could the 100 byte path length limit be lifted?

2023-12-13 Thread Dirk Eddelbuettel


On 13 December 2023 at 15:32, Tomas Kalibera wrote:
| Please don't forget about what has been correctly mentioned on this 
| thread already: there is essentially a 260 character limit on Windows 
| (see 
| https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html 
| for more). Even if the relative path length limit for a CRAN package was 
| no longer regarded important for tar compatibility, it would still make 
| sense for compatibility with Windows. It may still be a good service to 
| your users if you keep renaming the files to fit into that limit.

So can lift the limit from 100 char to 260 char ?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


PS One aspect I didn't mention clearly (my bad) is that this does not affect all
or even most packages: in most cases the src/Makevars should indeed be as
simple as possible. But in _some_ cases we need to cooperate with external
libraries and in some of these cases the switch has been seen to be necessary.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


On 10 December 2023 at 17:07, Simon Urbanek wrote:
| As discussed here before packages should *never* set -mmacosx-version-min
| or similar flags by hand.

a) That is in conflict with what was said in the past; we have used an
explicit min version of 10.14 for the C++17 we were using then (and we now
need a bit more so 11.0 is welcome).

b) That is in conflict with how I read the R manual I quoted: R Admin,
Section 'C.3.10 Building binary packages'. Recall that our package uses a
binary artifact (by permission) so we have to play along and this minimum
version used by both matters.

c) That also appears to be in conflict with empirics. A quick search [1] at
the searchable CRAN mirror finds around a dozen packages doing just that:
setting a minimum version.

Anyway -- I 'eventually' got the info I need. 

Best regards, Dirk


[1] https://github.com/search?q=org%3Acran%20mmacosx-version-min&type=code

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


Last month, I had asked about the setting '-mmacosx-version-min' here.  The
setting can be used to specify what macOS version one builds for. It is,
oddly enough, not mentioned in Writing R Extension but for both r-release and
r-devel the R Administration manual states

   • Current CRAN macOS distributions are targeted at Big Sur so it is
 wise to ensure that the compilers generate code that will run on
 Big Sur or later.  With the recommended compilers we can use
  CC="clang -mmacosx-version-min=11.0"
  CXX="clang++ -mmacosx-version-min=11.0"
  FC="/opt//gfortran/bin/gfortran -mmacosx-version-min=11.0"
 or set the environment variable
  export MACOSX_DEPLOYMENT_TARGET=11.0

which is clear enough. (There is also an example in the R Internals manual
still showing the old (and deprecated ?) value of 10.13.)  It is also stated
at the top of mac.r-project.org.  But it is still in a somewhat confusing
contradiction to the matrix of tests machines, described e.g. at

   https://cran.r-project.org/web/checks/check_flavors.html

which still has r-oldrel-macos-x86_64 with 10.13.

I found this confusing, and pressed the CRAN macOS maintainer to clarify but
apparently did so in an insufficiently convincing manner. (There was a word
about it being emailed to r-sig-mac which is a list I am not on as I don't
have a macOS machine.) So in case anybody else wonders, my hope is that the
above is of help. At my day job, we will now switch to 11.0 to take advantage
of some more recent C++ features.
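
For completeness, one can also check from R itself which flags the running
build was configured with (a quick sketch for Unix-alikes, simply reading the
Makeconf file mentioned elsewhere in this thread):

    ## show the compiler lines from the Makeconf of the running R build
    grep("^(CC|CXX|FC) ", readLines(file.path(R.home("etc"), "Makeconf")), value = TRUE)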

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] problems with Maintainers in DESCRIPTION file

2023-12-07 Thread Dirk Eddelbuettel


On 7 December 2023 at 20:58, María Olga Viedma Sillero wrote:
| I receive the same note after fixing it, removing it, and checking Authors@R.
| I think the rejection is a false positive.
| 
| Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-x86_64
| Check: CRAN incoming feasibility, Result: NOTE
|   Maintainer: 'Olga Viedma mailto:olga.vie...@uclm.es>>'
  ^^

Compare that with the other 20500 CRAN packages (you can look at all of them
conveniently via https://github.com/cran/) -- your format differs. Instead of

  Authors@R: c(person("Olga", "Viedma", email = 
"olga.vie...@uclm.es", role = c("aut", "cph", 
"cre")),
   person("Carlos Alberto", "Silva", email = 
"c.si...@ufl.edu", role = c("aut", "cph")),
   person("Jose Manuel", "Moreno", email = 
"josem.mor...@uclm.es", role = c("aut", "cph")))

write

  Authors@R: c(person("Olga", "Viedma", email = "olga.vie...@uclm.es", role = 
c("aut", "cph", "cre")),
  person("Carlos Alberto", "Silva", email = "c.si...@ufl.edu", role 
= c("aut", "cph")),
  person("Jose Manuel", "Moreno", email = "josem.mor...@uclm.es", 
role = c("aut", "cph")))

ie remove the 'mailto:...' part.

Hth, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] macos x86 oldrel backups?

2023-12-05 Thread Dirk Eddelbuettel


Hi Simon,

On 5 December 2023 at 23:17, Simon Urbanek wrote:
| The high-sierra build packages are currently not built due to hardware 
issues. The macOS version is so long out of support by Apple (over 6 years) 
that it is hard to maintain it. Only big-sur builds are supported at this 
point. Although it is possible that we may be able to restore the old builds, 
it is not guaranteed. (BTW the right mailing list for this is R-SIG-Mac).

Interesting.  And I see at mac.r-project.org the statement

   [...] As of R 4.3.0 we maintain two binary builds:

   - big-sur-x86_64 build supports legacy Intel Macs from macOS 11 (Big Sur) 
and higher
   - big-sur-arm64 build supports arm64 Macs (M1+) from macOS 11 (Big Sur) and 
higher

   We are no longer building binaries for macOS versions before 11 (as they
   are no longer supported by Apple).

so it is official that r-oldrel on macOS is no more for now? So we do not
have to worry about compilation standards lower than 11.0?  Do I have that
correct? 

Dirk
not on r-sig-mac, appreciative of any hints here as that is what we created the 
list for
 
| Cheers,
| Simon
| 
| 
| 
| > On 5/12/2023, at 09:52, Jonathan Keane  wrote:
| > 
| > Thank you to the CRAN maintainers for maintenance and keeping the all
| > of the CRAN infrastructure running.
| > 
| > I'm seeing a long delay in builds on CRAN for r-oldrel-macos-x86_64.
| > I'm currently interested in Arrow [1], but I'm seeing many other
| > packages with similar missing r-oldrel-macos-x86_64 builds (possibly
| > all, I sampled a few packages from [2], but didn't do an exhaustive
| > search) for an extended period.
| > 
| > It appears that this started between 2023-10-21 and 2023-10-22. It
| > looks like AMR [3] has a successful build but xlcutter does not [4]
| > and all the packages I've checked after 2023-10-22 don't have an
| > updated build for r-oldrel-macos-x86_64
| > 
| > Sorry if this is scheduled maintenance, I tried to find an
| > announcement here and on r-project.org but haven't yet found anything
| > indicating this.
| > 
| > [1] - https://cran.r-project.org/web/checks/check_results_arrow.html
| > [2] - 
https://cran.r-project.org/web/packages/available_packages_by_date.html
| > [3] - https://cran.r-project.org/web/packages/AMR/index.html
| > [4] - https://cran.r-project.org/web/packages/xlcutter/index.html
| > 
| > -Jon
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] URL woes at CRAN: Anaconda edition

2023-11-30 Thread Dirk Eddelbuettel


Thank you all -- especially Aron and Ivan for the deep dive on the underlying
aspect of the hosting of that web property.  And of course to Uwe for
approving the package manually.

For my taste, life is too short for all this. So users be damned, and I have
now removed the badge.  At the end of the day it is better for Uwe et al (and
of course our side) to have these things autoprocess. If this kind of stuff
is in the way, I will just remove it as maintainer.

Thanks again to all.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] URL woes at CRAN: Anaconda edition

2023-11-30 Thread Dirk Eddelbuettel


I added a badge to point to Conda builds for the work repo:

  
[![Anaconda](https://anaconda.org/conda-forge/r-tiledb/badges/version.svg)](https://anaconda.org/conda-forge/r-tiledb)


And as it goes with all good intentions I immediately got punished on the
next upload:

  Found the following (possibly) invalid URLs:
URL: https://anaconda.org/conda-forge/r-tiledb
  From: README.md
  Status: 400
  Message: Bad Request

And *of course* the same URL resolves fine for me in the browser without any
apparent redirect.  Bit of a web newb here but is there anything I can do
short of deleteing the badge?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-17 Thread Dirk Eddelbuettel


Simon,

One more thing: An alert reader pointed out to me that macOS-oldrel has

 r-oldrel-macos-x86_64  r-oldrelmacOS   x86_64  macOS 10.13.6 (17G11023)

in the table at https://cran.r-project.org/web/checks/check_flavors.html. So
this seems to mesh with what the R-on-macOS FAQ says, and switching to 11.0
would appear to at least lose r-oldrel-macos-x86_64 until April, no?

Best,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Simon,

On 17 November 2023 at 10:43, Simon Urbanek wrote:
| > On 17/11/2023, at 10:28 AM, Dirk Eddelbuettel  wrote:
| > On 17 November 2023 at 09:35, Simon Urbanek wrote:
| > | can you clarify where the flags come from? The current CRAN builds 
(big-sur-x86_64 and big-sur-arm64) use
| > | 
| > | export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
| > | export MACOSX_DEPLOYMENT_TARGET=11.0
| > | 
| > | so the lowest target is 11.0 and it is no longer forced it in the flags 
(so that users can more easily choose their desired targets).
| > 
| > Beautiful, solves our issue.  Was that announced at some point? If so, 
where?
| > 
| 
| I don't see what is there to announce as the packages should be simply using 
flags passed from R and that process did not change.
| 
| That said, the binary target for CRAN has been announced on this list as part 
of the big-sur build announcement:
| https://stat.ethz.ch/pipermail/r-sig-mac/2023-April/014731.html

I don't own or (directly) use macOS hardware so I am not on that list. 
 
| > For reference the R-on-macOS FAQ I consulted still talks about 10.13 at
| > 
https://cran.r-project.org/bin/macosx/RMacOSX-FAQ.html#Installation-of-source-packages
| > 
| >  CC = clang -mmacosx-version-min=10.13
| >  CXX = clang++ -mmacosx-version-min=10.13 -std=gnu++14
| >  FC = gfortran -mmacosx-version-min=10.13
| >  OBJC = clang -mmacosx-version-min=10.13
| >  OBJCXX = clang++ -mmacosx-version-min=10.13
| > 
| > so someone may want to refresh this. It is what I consulted as relevant 
info.
| > 
| 
| It says "Look at file /Library/Frameworks/R.framework/Resources/etc/Makeconf" 
so it is just an example that will vary by build. For example big-sur-arm64 
will give you
| 
| $ grep -E '^(CC|CXX|FC|OBJC|OBJCXX) ' 
/Library/Frameworks/R.framework/Resources/etc/Makeconf
| CC = clang -arch arm64
| CXX = clang++ -arch arm64 -std=gnu++14
| FC = /opt/R/arm64/bin/gfortran -mtune=native
| OBJC = clang -arch arm64
| OBJCXX = clang++ -arch arm64
| 
| Again, this is just an example, no one should be entering such flags by hand 
- that's why they are in Makeconf so packages can use them without worrying 
about the values (see R-exts 1.2: 
https://cran.r-project.org/doc/manuals/R-exts.html#Configure-and-cleanup for 
details).

I recommend you spend a moment with for example the (rather handy) search
capability of GitHub to search through the 'cran' organisation mirroring the
repo.  The package I maintain is far from being the only one setting the flag.

https://github.com/search?q=org%3Acran+mmacosx-version-min&type=code

Best, Dirk

| Cheers,
| Simon
| 
| 
| 
| > Thanks, Dirk
| > 
| > | 
| > | Cheers,
| > | Simon
| > | 
| > | 
| > | 
| > | > On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
| > | > 
| > | > 
| > | > Hi Simon,
| > | > 
| > | > We use C++20 'inside' our library and C++17 in the API. Part of our 
C++17 use
| > | > is now expanding to std::filesystem whose availability is dependent on 
the
| > | > implementation. 
| > | > 
| > | > The compiler tells us (in a compilation using 
-mmacosx-version-min=10.14)
| > | > that the features we want are only available with 10.15.
| > | > 
| > | > Would we be allowed to use this value of '10.15' on CRAN?
| > | > 
| > | > Thanks as always,  Dirk
| > | > 
| > | > 
| > | > [1] 
https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
| > | > 
| > | > -- 
| > | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > | > 
| > | > __
| > | > R-package-devel@r-project.org mailing list
| > | > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > | > 
| > | 
| > 
| > -- 
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Simon,

On 17 November 2023 at 09:35, Simon Urbanek wrote:
| can you clarify where the flags come from? The current CRAN builds 
(big-sur-x86_64 and big-sur-arm64) use
| 
| export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
| export MACOSX_DEPLOYMENT_TARGET=11.0
| 
| so the lowest target is 11.0 and it is no longer forced it in the flags (so 
that users can more easily choose their desired targets).

Beautiful, solves our issue.  Was that announced at some point? If so, where?

For reference the R-on-macOS FAQ I consulted still talks about 10.13 at
https://cran.r-project.org/bin/macosx/RMacOSX-FAQ.html#Installation-of-source-packages

  CC = clang -mmacosx-version-min=10.13
  CXX = clang++ -mmacosx-version-min=10.13 -std=gnu++14
  FC = gfortran -mmacosx-version-min=10.13
  OBJC = clang -mmacosx-version-min=10.13
  OBJCXX = clang++ -mmacosx-version-min=10.13

so someone may want to refresh this. It is what I consulted as relevant info.

Thanks, Dirk

| 
| Cheers,
| Simon
| 
| 
| 
| > On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
| > 
| > 
| > Hi Simon,
| > 
| > We use C++20 'inside' our library and C++17 in the API. Part of our C++17 
use
| > is now expanding to std::filesystem whose availability is dependent on the
| > implementation. 
| > 
| > The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
| > that the features we want are only available with 10.15.
| > 
| > Would we be allowed to use this value of '10.15' on CRAN?
| > 
| > Thanks as always,  Dirk
| > 
| > 
| > [1] 
https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
| > 
| > -- 
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Hi Simon,

We use C++20 'inside' our library and C++17 in the API. Part of our C++17 use
is now expanding to std::filesystem whose availability is dependent on the
implementation. 

The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
that the features we want are only available with 10.15.

Would we be allowed to use this value of '10.15' on CRAN?

Thanks as always,  Dirk


[1] 
https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package submission fail

2023-11-13 Thread Dirk Eddelbuettel


On 13 November 2023 at 16:46, Ivan Krylov wrote:
| Hello Christiaan and welcome to R-package-devel!

Seconded but PLEASE do not send large attachments to the list and all its
subscribers. Point us to your code repository, you will likely get a kind
response from a volunteer or two peeking at it.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Fortune candidate Re: Issue with R Package on CRAN - OpenMP and clang17

2023-10-31 Thread Dirk Eddelbuettel


On 31 October 2023 at 19:58, Ivan Krylov wrote:
| [...] The computers that helped launch the first
| people into space had 2 kWords of memory, but nowadays you need more
| than 256 MBytes of RAM to launch a bird into a pig and 10 GBytes of
| storage in order to compile a compiler. This is what progress looks
| like.

Fortune!!

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-30 Thread Dirk Eddelbuettel


I have some better news.  While we established that 'in theory' setting the
environment variable OMP_NUM_THREADS would help (and I maintain that it is a
great PITA that CRAN does not do so as a general fix for this issue) it does
*not help* once R is started.  OpenMP only considers the variable once at
startup and does not re-read it.  So we cannot set it from R once R has started.

But OpenMP offers a setter (and a getter) for the thread count value.

And using it addresses the issue.  I created a demo package [1] which, when
running on a system with both OpenMP and 'enough cores' (any modern machine
will do) exhibits the warning from R CMD check --as-cran with timing enabled
(i.e. env vars set).  When an additional environment variable 'SHOWME' is set
to 'yes', it successfully throttles via the exposed OpenMP setter.  In our
example, Armadillo uses it to calibrate its thread use, a lower setting is
followed, and the warning is gone.

I will add more convenient wrappers to RcppArmadillo itself. These are
currently in a branch [2] and their use is illustrated in the help page and
example of fastLm demo function [3].  I plan to make a new RcppArmadillo
release with this change in the coming days, the setter and re-setter will
work for any OpenMP threading changes. So if you use RcppArmadillo, this
should help. (And of course there always was RhpcBLASctl doing this too.)
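
For examples or tests that need to throttle before that release lands, the
same effect can be had at the R level, for instance via RhpcBLASctl (a sketch
only, with the function names as I recall them; remember and restore the
previous value):

    if (requireNamespace("RhpcBLASctl", quietly = TRUE)) {
        old <- RhpcBLASctl::omp_get_max_threads()   ## current OpenMP thread count
        RhpcBLASctl::omp_set_num_threads(2L)        ## throttle for the example
        ## ... run the OpenMP-using code here ...
        RhpcBLASctl::omp_set_num_threads(old)       ## restore afterwards
    }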

Dirk

[1] https://github.com/eddelbuettel/rcpparmadilloopenmpex
[2] https://github.com/RcppCore/RcppArmadillo/tree/feature/thread_throttle
[3] 
https://github.com/RcppCore/RcppArmadillo/blob/a8db424bd6aaeda2ceb897142d3c366d9c6591c7/man/fastLm.Rd#L72-L98
[4] https://cran.r-project.org/package=RhpcBLASctl

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-27 Thread Dirk Eddelbuettel


Hi Jouni,

On 27 October 2023 at 13:02, Helske, Jouni wrote:
| Actually, the OMP_NUM_THREADS worked for vignettes and testthat tests, but
| didn't help with the examples. However, I just wrapped the problematic example

Now I am confused.

What is your understanding of why it helps in one place and not the other?

| with \donttest as for some reason this issue only happened with a single
| seemingly simple example (hence the "weird" in the earlier NEWS due to
| frustration, I changed this to the CRAN version).
| 
| Thanks for reminding me about the resetting the number of cores, will fix that
| to the next version.

I have an idea for a RcppArmadillo-based helper function. We can save the
initial values of the environment variable in .onLoad and cache it. A simple
helper function pair can then dial the environment variable down and reset it
to the cached value.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-27 Thread Dirk Eddelbuettel


Jouni,

My CRANberriesFeed reports a new bssm package at CRAN, congratulations for
sorting this out. [1,2] The OMP_NUM_THREADS setting is indeed all it takes,
and it _does_ seem to be read even from a running session: i.e. you can set
this inside an R session and the OpenMP code considers it in the same
process. Good!

As some of us mentioned, your usage pattern of setting
'Sys.setenv("OMP_NUM_THREADS" = 2)' everywhere _leaves_ that value so you
permanently ham-string the behaviour of a session which runs an example or
test of your package: the same session will never get back to 'all cores' by
itself so adding a resetter to the initial value (maybe via on.exit()) may be
a good idea for the next package revision if you have any energy left for
this question :)
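
A minimal version of that resetter pattern, as an illustration only (it
belongs inside the function or test doing the throttling, since on.exit()
needs a function context):

    old <- Sys.getenv("OMP_NUM_THREADS", unset = NA)
    Sys.setenv("OMP_NUM_THREADS" = 2)
    on.exit(if (is.na(old)) Sys.unsetenv("OMP_NUM_THREADS")
            else Sys.setenv("OMP_NUM_THREADS" = old), add = TRUE)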

Again, congrats for sorting it out, and sorry for the trouble. I long argued
CRAN should set the behaviour-defining environment variable, that
OMP_NUM_THREADS, for the tests and examples it wants to run under reduced
load.  Alas, that's not where we ended up.

Cheers,  Dirk

[1] http://dirk.eddelbuettel.com/cranberries/2023/10/27#bssm_2.0.2

[2] Your NEWS file calls this 'fix weird CRAN issues with parallelisation on
Debian.'. There is nothing 'weird' here (it behaves as designed, computers do
that to us), and it is not just on Debian but on any system where the build
has a) access to OpenMP so uses it and b) compares CPU time to elapsed time
with a cap of 2 as CRAN does.

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] API client package failing due to API authentication

2023-10-26 Thread Dirk Eddelbuettel


On 26 October 2023 at 11:14, Cole Johanson wrote:
| My package https://github.com/cole-johanson/smartsheetr requires an
| environment variable, the API access token, to run most of the functions.
| The steps for setting this are documented in the README, but my package is
| being auto-rejected by CRAN for failing the examples.
| 
| I have wrapped the examples with the roxygen2 tag *\dontrun*, but it is
| still attempting to run the examples.
| 
| Should I report this as a false positive, or is there a step I am missing?

You should not attempt to run the examples when they could fail, e.g. when no
API key is present, as is the case for CRAN.
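
One common pattern is to guard the example so that it only runs when the
token is actually available, e.g. (a sketch; the environment variable name is
made up here and may differ in your package):

    if (nzchar(Sys.getenv("SMARTSHEET_API_TOKEN"))) {
        ## calls needing the API token go here; skipped on CRAN where it is unset
    }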

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-25 Thread Dirk Eddelbuettel


On 24 October 2023 at 08:15, Dirk Eddelbuettel wrote:
| 
| On 24 October 2023 at 15:55, Ivan Krylov wrote:
| | On Tue, 24 Oct 2023 10:37:48 +
| | "Helske, Jouni"  writes:
| | 
| | > Examples with CPU time > 2.5 times elapsed time
| | >   user system elapsed ratio
| | > exchange 1.196   0.04   0.159 7.774
| | 
| | I've downloaded the archived copy of the package from the CRAN FTP
| | server, installed it and tried:
| | 
| | library(bssm)
| | Sys.setenv("OMP_THREAD_LIMIT" = 2)
| | data("exchange")
| | model <- svm(
| |  exchange, rho = uniform(0.97,-0.999,0.999),
| |  sd_ar = halfnormal(0.175, 2), mu = normal(-0.87, 0, 2)
| | )
| | system.time(particle_smoother(model, particles = 500))
| | #user  system elapsed
| | #   0.515   0.000   0.073
| | 
| | I set a breakpoint on clone() [*] and got quite a few calls creating
| | OpenMP threads with the following call stack:
| | 
| | #0  clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:52
| | <...>
| | #4  0x77314e0a in GOMP_parallel () from
| | /usr/lib/x86_64-linux-gnu/libgomp.so.1
| |  <-- RcppArmadillo code below
| | #5 0x738f5f00 in
| | arma::eglue_core::apply,
| | arma::eOp, arma::eop_exp>,
| | arma::eop_scalar_times>, arma::eOp,
| | arma::eop_scalar_div_post>, arma::eop_square> > (outP=..., x=...) at
| | .../library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:69
| | #6 0x73a31246 in
| | arma::Mat::operator=,
| | arma::eop_exp>, arma::eop_scalar_times>,
| | arma::eOp, arma::eop_scalar_div_post>,
| | arma::eop_square>, arma::eglue_div> (X=..., this=0x7fff36f0) at
| | .../library/RcppArmadillo/include/armadillo_bits/Proxy.hpp:226
| | #7
| | 
arma::Col::operator=,
| | arma::eop_exp>, arma::eop_scalar_times>,
| | arma::eOp, arma::eop_scalar_div_post>,
| | arma::eop_square>, arma::eglue_div> > ( X=..., this=0x7fff36f0) at
| | .../library/RcppArmadillo/include/armadillo_bits/Col_meat.hpp:535
| |  <-- bssm code below
| | #8  ssm_ung::laplace_iter (this=0x7fff15e0, signal=...) at
| | model_ssm_ung.cpp:310
| | #9  0x73a36e9e in ssm_ung::approximate (this=0x7fff15e0) at
| | .../library/RcppArmadillo/include/armadillo_bits/arrayops_meat.hpp:27
| | #10 0x73a3b3d3 in ssm_ung::psi_filter
| | (this=this@entry=0x7fff15e0, nsim=nsim@entry=500, alpha=...,
| | weights=..., indices=...) at model_ssm_ung.cpp:517
| | #11 0x73948cd7 in psi_smoother (model_=..., nsim=nsim@entry=500,
| | seed=seed@entry=1092825895, model_type=model_type@entry=3) at
| | R_psi.cpp:131
| | 
| | What does arma::eglue_core do?
| | 
| | (gdb) list
| | /* reformatted a bit */
| | library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:64
| |  int n_threads = (std::min)(
| |   int(arma_config::mp_threads),
| |   int((std::max)(int(1), int(omp_get_max_threads(
| |  );
| | (gdb) p arma_config::mp_threads
| | $3 = 8
| | (gdb) p (int)omp_get_max_threads()
| | $4 = 16
| | (gdb) p (char*)getenv("OMP_THREAD_LIMIT")
| | $7 = 0x56576b91 "2"
| | (gdb) p /x (int)omp_get_thread_limit()
| | $9 = 0x7fff
| | 
| | Sorry for misinforming you about the OMP_THREAD_LIMIT environment
| | variable: the OpenMP specification requires the program to ignore
| | modifications to the environment variables after the program has
| | started [**], so it only works if R is started with OMP_THREAD_LIMIT
| | set. Additionally, the OpenMP thread limit is not supposed to be
| | adjusted at runtime at all [***].
| | 
| | Unfortunately for our situation, Armadillo is very insistent in setting
| | its own number of threads from arma_config::mp_threads (which is
| | constexpr 8 unless you set preprocessor directives while compiling it)
| | and omp_get_max_threads (which is the upper bound on the number of
| | threads that cannot be adjusted at runtime).
| | 
| | What I'm about to suggest is a terrible hack, but since Armadillo seems
| | to lack the option to set the number of threads at runtime, there might
| | be no other option.
| | 
| | Before you #include an Armadillo header, every time:
| | 
| | 1. #include  so that the OpenMP functions are declared and the
| | #include guard is set
| | 
| | 2. Define a static inline function get_number_of_threads returning the
| | desired number of threads as an int (e.g. referencing an extern int
| | number_of_threads stored elsewhere)
| | 
| | 3. #define omp_get_max_threads get_number_of_threads
| | 
| | Now if you provide an API for the R code to get and set this number, it
| | should be possible to control the number of threads used by OpenMP code
| | in Armadillo. Basically, a data.table::setDTthreads() for the copy of
| | Armadillo inlined inside your package.
| | 
| | If you then compile your package with a large #define
| | ARMA_OPENMP_THREADS, it will both be able to use more than 8 threads
| | *and* limit itself when needed.
| | 
| An alternative course of action is compiling your package with #define
| ARMA_OPENMP_THREADS 2 and giving up on more OpenMP threads inside calls
| to Armadillo.

Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-24 Thread Dirk Eddelbuettel


On 24 October 2023 at 15:55, Ivan Krylov wrote:
| On Tue, 24 Oct 2023 10:37:48 +
| "Helske, Jouni"  writes:
| 
| > Examples with CPU time > 2.5 times elapsed time
| >   user system elapsed ratio
| > exchange 1.196   0.04   0.159 7.774
| 
| I've downloaded the archived copy of the package from the CRAN FTP
| server, installed it and tried:
| 
| library(bssm)
| Sys.setenv("OMP_THREAD_LIMIT" = 2)
| data("exchange")
| model <- svm(
|  exchange, rho = uniform(0.97,-0.999,0.999),
|  sd_ar = halfnormal(0.175, 2), mu = normal(-0.87, 0, 2)
| )
| system.time(particle_smoother(model, particles = 500))
| #user  system elapsed
| #   0.515   0.000   0.073
| 
| I set a breakpoint on clone() [*] and got quite a few calls creating
| OpenMP threads with the following call stack:
| 
| #0  clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:52
| <...>
| #4  0x77314e0a in GOMP_parallel () from
| /usr/lib/x86_64-linux-gnu/libgomp.so.1
|  <-- RcppArmadillo code below
| #5 0x738f5f00 in
| arma::eglue_core::apply,
| arma::eOp, arma::eop_exp>,
| arma::eop_scalar_times>, arma::eOp,
| arma::eop_scalar_div_post>, arma::eop_square> > (outP=..., x=...) at
| .../library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:69
| #6 0x73a31246 in
| arma::Mat::operator=,
| arma::eop_exp>, arma::eop_scalar_times>,
| arma::eOp, arma::eop_scalar_div_post>,
| arma::eop_square>, arma::eglue_div> (X=..., this=0x7fff36f0) at
| .../library/RcppArmadillo/include/armadillo_bits/Proxy.hpp:226
| #7
| 
arma::Col::operator=,
| arma::eop_exp>, arma::eop_scalar_times>,
| arma::eOp, arma::eop_scalar_div_post>,
| arma::eop_square>, arma::eglue_div> > ( X=..., this=0x7fff36f0) at
| .../library/RcppArmadillo/include/armadillo_bits/Col_meat.hpp:535
|  <-- bssm code below
| #8  ssm_ung::laplace_iter (this=0x7fff15e0, signal=...) at
| model_ssm_ung.cpp:310
| #9  0x73a36e9e in ssm_ung::approximate (this=0x7fff15e0) at
| .../library/RcppArmadillo/include/armadillo_bits/arrayops_meat.hpp:27
| #10 0x73a3b3d3 in ssm_ung::psi_filter
| (this=this@entry=0x7fff15e0, nsim=nsim@entry=500, alpha=...,
| weights=..., indices=...) at model_ssm_ung.cpp:517
| #11 0x73948cd7 in psi_smoother (model_=..., nsim=nsim@entry=500,
| seed=seed@entry=1092825895, model_type=model_type@entry=3) at
| R_psi.cpp:131
| 
| What does arma::eglue_core do?
| 
| (gdb) list
| /* reformatted a bit */
| library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:64
|  int n_threads = (std::min)(
|   int(arma_config::mp_threads),
|   int((std::max)(int(1), int(omp_get_max_threads(
|  );
| (gdb) p arma_config::mp_threads
| $3 = 8
| (gdb) p (int)omp_get_max_threads()
| $4 = 16
| (gdb) p (char*)getenv("OMP_THREAD_LIMIT")
| $7 = 0x56576b91 "2"
| (gdb) p /x (int)omp_get_thread_limit()
| $9 = 0x7fff
| 
| Sorry for misinforming you about the OMP_THREAD_LIMIT environment
| variable: the OpenMP specification requires the program to ignore
| modifications to the environment variables after the program has
| started [**], so it only works if R is started with OMP_THREAD_LIMIT
| set. Additionally, the OpenMP thread limit is not supposed to be
| adjusted at runtime at all [***].
| 
| Unfortunately for our situation, Armadillo is very insistent in setting
| its own number of threads from arma_config::mp_threads (which is
| constexpr 8 unless you set preprocessor directives while compiling it)
| and omp_get_max_threads (which is the upper bound on the number of
| threads that cannot be adjusted at runtime).
| 
| What I'm about to suggest is a terrible hack, but since Armadillo seems
| to lack the option to set the number of threads at runtime, there might
| be no other option.
| 
| Before you #include an Armadillo header, every time:
| 
| 1. #include  so that the OpenMP functions are declared and the
| #include guard is set
| 
| 2. Define a static inline function get_number_of_threads returning the
| desired number of threads as an int (e.g. referencing an extern int
| number_of_threads stored elsewhere)
| 
| 3. #define omp_get_max_threads get_number_of_threads
| 
| Now if you provide an API for the R code to get and set this number, it
| should be possible to control the number of threads used by OpenMP code
| in Armadillo. Basically, a data.table::setDTthreads() for the copy of
| Armadillo inlined inside your package.
| 
| If you then compile your package with a large #define
| ARMA_OPENMP_THREADS, it will both be able to use more than 8 threads
| *and* limit itself when needed.
| 
| An alternative course of action is compiling your package with #define
| ARMA_OPENMP_THREADS 2 and giving up on more OpenMP threads inside calls
| to Armadillo.

We should work on adding such a run-time setter of the number of cores to
RcppArmadillo so that examples can dial down to 2 cores.  I have been doing
just that in package tiledb (via a setting internal to the TileDB Core
library) for 'ages' now and RcppArmadillo could and should offer the same.

Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-20 Thread Dirk Eddelbuettel


On 19 October 2023 at 05:57, Helske, Jouni wrote:
| I am having difficulties in getting the latest version of the bssm 
(https://github.com/helske/bssm) package to CRAN, as the pretest issues a NOTE 
that the package uses too many cores in some of the examples ("Examples with 
CPU time > 2.5 times elapsed time"). I've seen plenty of discussion about this 
issue in relation to the data.table package, but bssm does not use it. Also, 
while bssm uses OpenMP in some functions, these are not called in the example 
in question (?exchange), and by default the number of threads in the 
parallelisable functions is set to 1.
| 
| But I just realised that bssm uses Armadillo via RcppArmadillo, which uses 
OpenMP by default for some elementwise operations. So, I wonder if that could 
be the culprit? However, I would think that in such case there would be many 
other packages with RcppArmadillo encountering the same CRAN issues. Has anyone 
experienced this with their packages which use RcppArmadillo but not 
data.table, or can say whether my guess is correct? I haven't been able to 
reproduce the issue myself on r-hub or my own linux, so I can't really test 
whether setting #define ARMA_DONT_USE_OPENMP helps.

You have some options to control OpenMP.

There is an environment variable (OMP_THREAD_LIMIT), and there is a CRAN
add-on package (RhpcBLASctl) which, if memory serves, also sets this. Looking
at the Armadillo documentation we see another variable (ARMA_OPENMP_THREADS).

I really think CRAN made a mistake here pushing this down on all package
maintainers.  It is too much work, some will get frustrated, some will get it
wrong and I fear in aggregate we end up with less performant software (as
some will 'cave in' and hard-wire single threaded computes). 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


On 18 October 2023 at 08:51, Simon Urbanek wrote:
| John,
| 
| the short answer is it won't work (it defeats the purpose of vignettes).

Not exactly. Everything is under our (i.e. package author) control, and when
we want to replace 'computed' values with cached values we can.

All this is somewhat of a charade. "Of course" we want vignettes to run
tests. But then we don't want to fall over random missing .sty files or fonts
(macOS machines have been less forgiving than others), not to mention compile
time.

So for simplicity I often pre-make pdf vignettes that get included in other
latex code as source. Works great, never fails, CRAN never complained --
which is somewhat contrary to your statement.

It is effectively the same with tests. We all want maximum test surfaces. But
when tests fail, or when they run too long, or [insert many other reasons
here] so many packages run tests conditionally.  Such is life.
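
With testthat, for instance, the conditioning amounts to a line or two at the
top of a test under tests/testthat/ (illustrative only; the dependency name is
a placeholder):

    test_that("full pipeline runs end to end", {
        skip_on_cran()                          ## too long-running for the check farm
        skip_if_not_installed("someHeavyDep")   ## placeholder for a Suggests-only dependency
        ## ... expensive expectations would go here ...
        expect_true(TRUE)
    })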

Dirk

 
| However, this sounds like a purely hypothetical question - CRAN policies 
allow long-running vignettes if they declared.
| 
| Cheers,
| Simon
| 
| 
| > On 18/10/2023, at 3:02 AM, John Fox  wrote:
| > 
| > Hello Dirk,
| > 
| > Thank you (and Kevin and John) for addressing my questions.
| > 
| > No one directly answered my first question, however, which was whether the 
approach that I suggested would work. I guess that the implication is that it 
won't, but it would be nice to confirm that before I try something else, 
specifically using R.rsp.
| > 
| > Best,
| > John
| > 
| > On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
| >> Caution: External email.
| >> On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| >> | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
| >> | package.
| >> For completeness, that approach also works directly with Sweave. Described 
in
| >> a blog post by Mark van der Loo in 2019, and used in a number of packages
| >> including a few of mine.
| >> That said, I also used the approach described by John Harrold and cached
| >> results myself.
| >> Dirk
| >> --
| >> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| >> __
| >> R-package-devel@r-project.org mailing list
| >> https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


John,

On 17 October 2023 at 10:02, John Fox wrote:
| Hello Dirk,
| 
| Thank you (and Kevin and John) for addressing my questions.
| 
| No one directly answered my first question, however, which was whether 
| the approach that I suggested would work. I guess that the implication 
| is that it won't, but it would be nice to confirm that before I try 
| something else, specifically using R.rsp.

I am a little remote here, both mentally and physically. What I might do here
in the case of your long-running vignette, and have done in about half a
dozen packages where I wanted 'certainty' and no surprises, is to render the
pdf vignette I want as I want them locally, ship them in the package as an
included file (sometimes from a subdirectory) and have a five-or-so line
Sweave .Rnw file include it. That works without hassles. Here is the Rnw I
use for package anytime

-
\documentclass{article}
\usepackage{pdfpages}
%\VignetteIndexEntry{Introduction to anytime}
%\VignetteKeywords{anytime, date, datetime, conversion}
%\VignettePackage{anytime}
%\VignetteEncoding{UTF-8}

\begin{document}
\includepdf[pages=-, fitpaper=true]{anytime-intro.pdf}
\end{document}
-

That is five lines of LaTeX code slurping in the pdf (per the blog post by
Mark). As I understand it R.rsp does something similar at the marginal cost
of an added dependency.

Now, as mentioned, you can also 'conditionally' compute in a vignette and
choose if and when to use a data cache. I think that we show most of that in
the package described in the RJournal piece by Brooke and myself on drat for
data repositories. (We may be skipping the compute when the data is not
accessible. Loading a precomputed set is similar. I may be doing that in the
much older never quite finished gcbd package and its vignette.)
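
A minimal sketch of that conditional pattern inside a vignette chunk (all
names here are hypothetical):

    cache <- system.file("extdata", "results.rds", package = "mypkg")
    if (file.exists(cache) && !identical(Sys.getenv("RECOMPUTE_VIGNETTE"), "yes")) {
        res <- readRDS(cache)                ## fast, deterministic path used on CRAN
    } else {
        res <- long_running_computation()    ## only run when explicitly requested
    }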

Hope this helps, maybe more once I am back home.

Cheers, Dirk
 
| Best,
|   John
| 
| On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
| > Caution: External email.
| > 
| > 
| > On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| > | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
| > | package.
| > 
| > For completeness, that approach also works directly with Sweave. Described 
in
| > a blog post by Mark van der Loo in 2019, and used in a number of packages
| > including a few of mine.
| > 
| > That said, I also used the approach described by John Harrold and cached
| > results myself.
| > 
| > Dirk
| > 
| > --
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| Produce a PDF file yourself, then use the "as.is" feature of the R.rsp 
| package.

For completeness, that approach also works directly with Sweave. Described in
a blog post by Mark van der Loo in 2019, and used in a number of packages
including a few of mine.

That said, I also used the approach described by John Harrold and cached
results myself.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Cadence of macOS builds at CRAN

2023-09-14 Thread Dirk Eddelbuettel


Simon,

A new package of mine [1] appeared on CRAN on Sep 5. Respecting the one week 
gap,
I made a small update on Sep 12.

Today is Sep 14. There are still no builds for
  macOS r-release (arm64)
  macOS r-oldrel (arm64)
  macOS r-release (x86_64)
but we do have two oldrel releases. Weirder still we have results for
macOS r-release (x86_64) even when the binary is not listed.

There is nothing tricky in the package or it dependencies.  Could you provide
an update of what should and can be expected in the macOS provision? Is this
a matter of intra-CRAN syncing between your builder(s) and the Vienna site?

Thanks as always,  Dirk

[1] https://cran.r-project.org/package=RcppInt64

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-31 Thread Dirk Eddelbuettel


On 31 August 2023 at 07:32, SHIMA Tatsuya wrote:
| I submitted prqlr 0.5.1 yesterday, which is almost identical to prqlr 
| 0.5.0, and prqlr is now available again on CRAN.
| Thanks to the CRAN reviewers for their quick reaction.

And it is gone again (per CRANberries). Never a dull moment with CRAN. 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 18:48, Jeff Newmiller wrote:
| You have a really bizarre way of twisting what others are saying, Dirk. I 
have seen no-one here saying 'limit R to 2 threads' except for you, as a way to 
paint opposing views to be absurd.

That's too cute.

Nobody needs to repeat it, and some of us know that "it is law"
as the "CRAN Repository Policy" (which each package uploads
promises to adhere to) says
 
   If running a package uses multiple threads/cores it must never
   use more than two simultaneously: the check farm is a shared
   resource and will typically be running many checks
   simultaneously.

You may find reading the document informative. The source reference
(mirrored for convenience at GH) of that line is

https://github.com/r-devel/r-dev-web/blob/master/CRAN/Policy/CRAN_policies.texi#L244-L246

and the rendered page is at 
https://cran.r-project.org/web/packages/policies.html

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 26 August 2023 at 12:05, Simon Urbanek wrote:
| In reality it's more people running R on their laptops vs the rest of the 
world.

My point was that we also have 'single user on really Yuge workstation'. 

Plus we all know that those users are often not sysadmins, and do not have
our levels of accumulated systems knowledge.

So we should give _more_ power by default, not less.

| [...] they will always be saying blatantly false things like "R is not for 
large data"

By limiting R (and/or packages) to two threads we will only get more of
these.  Our collective call.

This whole thread is pretty sad, actually.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 18:45, Duncan Murdoch wrote:
| The real problem is that there are two stubborn groups opposing each 
| other:  the data.table developers and the CRAN maintainers.  The former 
| think users should by default dedicate their whole machine to 
| data.table.  The latter think users should opt in to do that.

No, it feels more like it is CRAN versus the rest of the world.

Take but one example, and as I may have mentioned elsewhere, my day job
consists in providing software so that (to take one recent example)
bioinformatics specialist can slice huge amounts of genomics data.  When that
happens on a dedicated (expensive) hardware with dozens of cores, it would be
wasteful to have an unconditional default of two threads. It would be the end
of R among serious people, no more, no less. Can you imagine how the internet
headlines would go: "R defaults to two threads". 

And it is not just data.table as even in the long thread over in its repo we
have people chiming in using OpenMP in their code (as data.table does but
which needs a different setter than the data.table thread count).

It is the CRAN servers which (rightly !!) want to impose constraints for when
packages are tested.  Nobody objects to that.

But some of us wonder if settings these defaults for all R user, all the
time, unconditional is really the right thing to do.  Anyway, Uwe told me he
will take it to an internal discussion, so let's hope sanity prevails.

Dirk
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trouble with long-running tests on CRAN debian server

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 15:37, Uwe Ligges wrote:
| 
| 
| On 23.08.2023 16:00, Scott Ritchie wrote:
| > Hi Uwe,
| > 
| > I agree and have also been burnt myself by programs occupying the 
| > maximum number of cores available.
| > 
| > My understanding is that in the absence of explicit parallelisation, use 
| > of data.table in a package should not lead to this type of behaviour?
| 
| Yes, that would be my hope, too.

No, everybody involved with data.table thinks using 50% is already a
compromise giving up performance, see eg Jan's comment from yesterday (and
everything leading up to it):

   https://github.com/Rdatatable/data.table/issues/5658#issuecomment-1691831704

*You* have a local constraint (that is perfectly reasonable) as *you* run
multiple package tests. So *you* should set a low value for OMP_THREAD_LIMIT.

Many users spend top dollars to have access to high-powered machines for
high-powered analyses. They do want all cores.

There simply cannot be one setting that addresses all situations. Please set
a low limit as your local deployment requires it.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 24 August 2023 at 07:42, Fred Viole wrote:
| Hi, I am receiving a NOTE upon submission regarding the re-building of
| vignettes for CPU time for the Debian check.
| 
| I am unable to find any documented instances or solutions to this issue.
| The vignettes currently build in 1m 54.3s locally and in 56s on the Win
| check.
| 
| 
https://win-builder.r-project.org/incoming_pretest/NNS_10.1_20230824_132459/Debian/00check.log

Please see, inter alia, the long running thread

   "Trouble with long-running tests on CRAN debian server"

started earlier this week (!!) on this list covering exactly this issue.

We can only hope CRAN comes to understand our point that _it_ should set a
clearly-identifiable variable (the OpenMP thread count would do) so that
package data.table can do this for its several hundred users.

As things currently stand, CRAN expects several hundred packages (such as
yours, guessing that this comes from data.table, which I do not know for sure
but you do import it) to make the change, which is pretty close to the
textbook definition of madness.

Also see https://github.com/Rdatatable/data.table/issues/5658 with by now 24
comments.  It is on the same issue.

Uwe, Kurt: Please please please set OMP_THREAD_LIMIT to 2 on the Windows and
Debian machines doing this test.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Setting valgrind options when running R CMD check --use-valgrind

2023-08-23 Thread Dirk Eddelbuettel


On 23 August 2023 at 16:49, Duncan Murdoch wrote:
| On 23/08/2023 2:54 p.m., Dirk Eddelbuettel wrote:
| > 
| > When I invoke valgrind via
| > R -d valgrind -e '...'
| > the options in the file ~/.valgrindrc are being picked up. Good.
| > 
| > When I invoke valgrind via
| > R CMD check --use-valgrind ...
| > the options in the file ~/.valgrindrc are NOT being picked up. Bad.
| > 
| > And valgrind complains.  How can I add the needed options?  Adding
| > --debugger-args=""
| > does not work.  Is there another trick?
| 
| I don't know the answer to your question, but here's something to try. 
| There's a way to run an "R CMD check" equivalent from a regular session, 
| so presumably it could be done from "R -d valgrind -e":
| 
|  tools:::.check_packages(c("pkg", "--option1", "--option2"))
| 
| A likely problem is that many of the check tests are run in separate 
| processes; I don't know if the valgrind setting would be inherited or not.

Thanks for the reminder, I also re-realized by re-reading WRE that setting
VALGRIND_OPTS="" works.  And with that I am no longer fully sure I can
claim that ~/.valgrindrc was ignored.  I may have misread an error.

Thanks for the prompt help, it is appreciated.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Setting valgrind options when running R CMD check --use-valgrind

2023-08-23 Thread Dirk Eddelbuettel


When I invoke valgrind via
   R -d valgrind -e '...'
the options in the file ~/.valgrindrc are being picked up. Good.

When I invoke valgrind via
   R CMD check --use-valgrind ...
the options in the file ~/.valgrindrc are NOT being picked up. Bad.

And valgrind complains.  How can I add the needed options?  Adding
   --debugger-args=""
does not work.  Is there another trick?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trouble with long-running tests on CRAN debian server

2023-08-21 Thread Dirk Eddelbuettel


On 21 August 2023 at 16:05, Ivan Krylov wrote:
| Dirk is probably right that it's a good idea to have OMP_THREAD_LIMIT=2
| set on the CRAN check machine. Either that, or place the responsibility
| on data.table for setting the right number of threads by default. But
| that's a policy question: should a CRAN package start no more than two
| threads/child processes even if it doesn't know it's running in an
| environment where the CPU time / elapsed time limit is two?

Methinks that given this language in the CRAN Repository Policy

  If running a package uses multiple threads/cores it must never use more
  than two simultaneously: the check farm is a shared resource and will
  typically be running many checks simultaneously.

it would indeed be nice if this variable, and/or equivalent ones, were set.

As I mentioned before, I had long added a similar throttle (not for
data.table) in a package I look after (for work, even). So a similar
throttler with optionality is below. I'll add this to my `dang` package
collecting various functions.

A usage example follows. It does nothing by default, ensuring 'full power'
but reflects the minimum of two possible options, or an explicit count:

> dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '12'.
> Sys.setenv("OMP_THREAD_LIMIT"=3); dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '3'.
> options(Ncpus=2); dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '2'.
> dang::limitDataTableCores(1, verbose=TRUE)
Limiting data.table to '1'.
>

That makes it, in my eyes, preferable to any unconditional 'always pick 1 
thread'.

Dirk


##' Set threads for data.table respecting possible local settings
##'
##' This function sets the number of threads \pkg{data.table} will use
##' while reflecting two possible machine-specific settings from the
##' environment variable \sQuote{OMP_THREAD_LIMIT} as well as the R
##' option \sQuote{Ncpus} (used e.g. for parallel builds).
##' @title Set data.table threads respecting default settings
##' @param ncores A numeric or character variable with the desired
##' count of threads to use
##' @param verbose A logical value with a default of \sQuote{FALSE} to
##' operate more verbosely
##' @return The return value of the \pkg{data.table} function
##' \code{setDTthreads} which is called as a side-effect.
##' @author Dirk Eddelbuettel
##' @export
limitDataTableCores <- function(ncores, verbose = FALSE) {
    if (missing(ncores)) {
        ## start with a simple fallback: 'Ncpus' (if set) or else 2
        ncores <- getOption("Ncpus", 2L)
        ## also consider OMP_THREAD_LIMIT (cf Writing R Extensions), gets NA if envvar unset
        ompcores <- as.integer(Sys.getenv("OMP_THREAD_LIMIT"))
        ## and then keep the smaller
        ncores <- min(na.omit(c(ncores, ompcores)))
    }
    stopifnot("Package 'data.table' must be installed." = requireNamespace("data.table", quietly=TRUE))
    stopifnot("Argument 'ncores' must be numeric or character" = is.numeric(ncores) || is.character(ncores))
    if (verbose) message("Limiting data.table to '", ncores, "'.")
    data.table::setDTthreads(ncores)
}

| 
| -- 
| Best regards,
| Ivan
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel

