On Wed, 8 Oct 2014, Rainer M Krug wrote:
"Charles C. Berry" <ccbe...@ucsd.edu> writes:
On Wed, 8 Oct 2014, Rainer M Krug wrote:
"Charles C. Berry" <ccbe...@ucsd.edu> writes:
On Mon, 6 Oct 2014, Rainer M Krug wrote:
Hi
The variable transfer of tables from Org to R sometimes caused 'could
not find function "read.table"' errors (e.g. when the file was tangled
into a ./data directory that was then loaded by
devtools::load_all("./")). This can easily be fixed by adding the package
name to the call in R, i.e. replacing =read.table()= with
=utils::read.table()=, which is what this patch does.
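For illustration, a minimal sketch of the kind of assignment the variable
transfer tangles into a data/*.R file (the file name, variable name, and
the =text == stand-in for the real temporary file are all made up here):

```r
## Hypothetical tangled file data/mytable.R.
##
## Before the patch the call was bare, and failed whenever `utils'
## was not reachable from the environment the file was sourced in:
##   mytable <- read.table(..., header = TRUE)
##
## After the patch the call is namespace-qualified, so it resolves
## regardless of the search path:
mytable <- utils::read.table(text = "x y\n1 2\n3 4", header = TRUE)
```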
It does fix that one case.
But I wonder if that is the best way.
The heart of the matter is that load_all eventually calls sys.source,
which can be persnickety about finding objects on the search path. See
?sys.source.
If the src block you tangle to ./data/ has any code that uses any
other objects from utils, stats, datasets or whatever, you will be in
the same pickle.
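A minimal sketch of the pickle, assuming only base R (temporary file
path and variable names are throwaway): sys.source evaluates the file in
the environment you hand it, and if that environment's parent chain stops
at the base package, nothing from utils, stats, or datasets is visible.

```r
f <- tempfile(fileext = ".R")
writeLines('tab <- read.table(text = "a b\\n1 2", header = TRUE)', f)

## Environment whose parent is baseenv(): `read.table' (from utils)
## is not on the lookup chain, so sourcing the file errors out.
e1 <- new.env(parent = baseenv())
res <- tryCatch(sys.source(f, envir = e1), error = identity)
inherits(res, "error")   ## TRUE: could not find function "read.table"

## The namespace-qualified call works even in that environment,
## because `::' lives in base and finds the loaded utils namespace.
writeLines('tab <- utils::read.table(text = "a b\\n1 2", header = TRUE)', f)
sys.source(f, envir = e1)
e1$tab                   ## a one-row data frame
```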
Exactly, that is true. But as far as I am aware it is the same when
putting this into a package.
Do you mean that putting `x <- rnorm(10)' into a data/*.R file will
fail when you try to build and check?
In fact, `R CMD build' will execute it and save the result as a
data/*.rda file. And check will go through.
devtools::load_all (which calls load_data) fails to do that, which is
why I think this is a devtools issue.
OK, point taken. But I still think that =utils::read.table()= would not
hurt; rather, it would make the variable transfer safer.
What you want to change is in a defconst, so a user can override it with
a file-local version. Making the change really is harmless, then.
Maybe add a note to the docstring saying that using `utils::read.table'
ensures that `read.table' can always be found, just in case anyone ever
asks.
Chuck