[Rd] ALARM!!!! Re: [R] regarding large csv file import
hi Jim,

if i partition the file, then further operations like merging the partitioned files and then doing some analysis on the whole data set would again require the same amount of memory. if i am not able to do that, or if i do not have the memory, then i feel there should be serious thinking about the issue of memory handling. hence i am also copying this to the r-devel list, and i would also like to contribute and write code for the memory handling issue. i would like to address this request to the great coders of R: the software should be able to run in any amount of memory (above some minimum threshold...bingo). thus i would invite all the great coders to please address this issue, and if in any way i can be helpful then i am right here.

thanks with regards
-gaurav

"jim holtman" <[EMAIL PROTECTED]> wrote on 27-10-06 09:09 PM, Subject: Re: [R] regarding large csv file import:

> Is the file only numeric, or does it also contain characters? You will
> get better performance by either using 'scan', or specifying the type of
> each column with 'colClasses' so that read.csv does not have to guess at
> the types.
>
> You will probably need more memory depending on the type of data. If I
> assume that it is numeric and that it takes about 6 characters to specify
> a number, then you have approximately 45M numbers in the file, and this
> will take up 362MB for a single object. You should have at least 3X the
> size of the largest object to do any processing, since copies will have
> to be made.
>
> I would suggest partitioning the file and processing it in parts. You can
> also put it in a database and 'sample' the rows that you want to process.
>
> On 10/27/06, [EMAIL PROTECTED] wrote:
> > hi All,
> > i have a .csv of size 272 MB and a RAM of 512MB and am working on
> > Windows XP. I am not able to import the csv file: R hangs (it stops
> > responding; even SciViews hangs). i am using
> > read.csv(FILENAME, sep=",", header=TRUE). Is there any way to import it?
> > i have tried the archives already but i was not able to make much sense
> > of them. thanks in advance.
> >
> > Sayonara With Smile & With Warm Regards :-)
> > G a u r a v  Y a d a v
> > Assistant Manager, Economic Research & Surveillance Department,
> > Clearing Corporation Of India Limited.
> > Address: 5th, 6th, 7th Floor, Trade Wing 'C', Kamala City, S.B. Marg,
> > Mumbai - 400 013
> > Telephone (Office): +91 022 6663 9398, Mobile (Personal): (0)9821286118

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390
What is the problem you are trying to solve?

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
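A minimal sketch of Jim's advice, using a tiny stand-in file (the real file's columns are unknown, so the names and types here are hypothetical): declaring column types with colClasses lets read.csv skip its type-guessing pass, and scan() is leaner still for purely numeric data.

```r
# Hypothetical two-column file standing in for the 272 MB csv.
tf <- tempfile(fileext = ".csv")
writeLines(c("a,b", "1,2.5", "3,4.5"), tf)

# colClasses avoids the type-guessing pass and the memory it costs.
d <- read.csv(tf, header = TRUE, colClasses = c("integer", "numeric"))

# scan() reads the numbers into a plain vector, cheaper than a data frame.
v <- scan(tf, what = 0, sep = ",", skip = 1, quiet = TRUE)
```

On a file that barely fits in RAM, adding `nrows` (an upper bound on the row count) also helps read.csv avoid over-allocating.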
[Rd] interactive mode without tty (PR#9320)
Full_Name: Manfred Georg
Version: 2.4.0
OS: linux
Submission from: (NULL) (128.252.166.190)

Hello,

I would like to see a command line flag --interactive (--no-interactive) or similar, to explicitly set whether we are in interactive mode on startup (the default, of course, remaining whether input is from a tty or not). I had to hack the source so that I could take input from a named pipe and get the behavior I wanted, and that seems silly to me. Octave provides exactly such a flag, which I used, and it worked wonderfully. This is a quick fix; I can provide a patch if you want.

Thank you,
Manfred
Re: [Rd] Error: invalid multibyte string
On 10/28/06, Thomas Lumley <[EMAIL PROTECTED]> wrote:
> On Fri, 27 Oct 2006, Henrik Bengtsson wrote:
>
> > In Section "Package subdirectories" in "Writing R Extensions" [2.4.0
> > (2006-10-10)] it says:
> >
> > "Only ASCII characters (and the control characters tab, formfeed, LF
> > and CR) should be used in code files. Other characters are accepted in
> > comments, but then the comments may not be readable in e.g. a UTF-8
> > locale. Non-ASCII characters in object names will normally [1] fail
> > when the package is installed. Any byte will be allowed [2] in a
> > quoted character string (but \u escapes should not be used), but
> > non-ASCII character strings may not be usable in some locales and may
> > display incorrectly in others.", where the footnote [2] reads "It is
> > good practice to encode them as octal or hex escape sequences".
> >
> > (Note: ASCII refers (correctly) to the 7-bit ASCII [0-127] and none of
> > the 8-bit ASCII extensions [128-255].)
> >
> > According to the sentence about quoted strings, the following R/*.R code
> > should still be valid:
> >
> >   pads <- sapply(0:64, FUN=function(x) paste(rep("\xFF", x), collapse=""));
>
> That looks like it should be valid (at least according to the
> documentation), even though it won't run usefully in UTF-8 locales. What
> you wrote before was:
>
> > > On Thu, 26 Oct 2006, Henrik Bengtsson wrote:
> > > > I'm observing the following on different platforms:
> > > >
> > > > > parse(text='"\\x7F"')
> > > > expression("\177")
> > > > > parse(text='"\\x80"')
> > > > Error: invalid multibyte string
>
> and that error *is* correct behaviour -- you can't parse() something that
> isn't a valid character string.

Hmm... are you really sure? That should be a (double) quoted \x80 (four characters plus the quotes), which has been put in a (single) quoted string where the backslash is escaped?
Maybe it is clearer to write:

> expr <- parse(text='x <- "\\x41"')
> eval(expr)
> print(x)
[1] "A"

and the same for:

> expr <- parse(text='x <- "\\x7F"')
> eval(expr)
> print(x)

> expr <- parse(text='x <- "\\x80"')
> eval(expr)
> print(x)

(Unfortunately I can't access the machines that give me the errors right now, but I assume the error occurs when eval() is called.)

/H

> -thomas
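For readers following along, the working case above can be run anywhere, since \x41 is plain ASCII; only the \x80 case is locale-dependent:

```r
# parse() receives the escape sequence \x41 still unexpanded (backslash,
# 'x', '4', '1'); the parser then turns it into the byte 0x41, i.e. "A".
expr <- parse(text = 'x <- "\\x41"')
eval(expr)
x  # "A"
# The \x80 variant fails only in locales (e.g. UTF-8) where the lone
# byte 0x80 is not a valid character.
```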
Re: [Rd] sources of 2.4.0 patched
[EMAIL PROTECTED] writes:

> Hi,
>
> I have a question about the availability of tarballs for 2.4.0-patched.
>
> I routinely compile and install fresh versions of R-patched as well as
> R-devel every few days. I do it mostly "for fun" but also to check for
> possible build problems or problems with my development environment
> (Win2000, cygwin plus all the necessary tools including MinGW compilers
> and MiKTeX).
>
> I get the tarballs from ftp://ftp.stat.math.ethz.ch/Software/R and I
> noticed that the only R-patched there is dated 2006-10-03, while R-devel
> is up to 2006-10-25. Did, by any chance, a repository for the R-patched
> tarballs move someplace else, or is the 2006-10-03 version the latest one?

To my knowledge, neither. Something has gone wrong. The build date on the file is Oct 27... I suspect that the svn update is somehow failing on the machine that does the builds (i.e. it just keeps on building from the 2006-10-03 checkout).

--
   O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - ([EMAIL PROTECTED])                     FAX: (+45) 35327907
Re: [Rd] as.missing
Luke Tierney <[EMAIL PROTECTED]> writes:

> There are a lot of subtle issues involved here. We should think through
> carefully exactly what semantics we want for missing value propagation
> before making any changes. Making usage easy at top level is generally a
> good thing, but for usage within functions, eliminating error messages by
> making more automated choices may not be a good thing--it may mask real
> mistakes.
>
> There are also issues with the internal implementation of missing
> arguments we need to think carefully about before exposing them at the R
> level. The fact that internally there is a missing argument token does
> not mean it is a good thing to expose that detail at the R level (and it
> already is in call objects and creates some issues with computing on the
> language).
>
> Like I said, it's complicated, so let's not leap before we look
> carefully.

I think I'm beginning to realize what the problem really is: It would (conceptually, at least) be fairly simple to change the current semantics of evaluating an argument to be like

    if (x is a promise)
        evaluate its expression part
    if (value is missing)
        if (there's a default)
            x <- evaluate default
        else
            x <- .Missing  # missing value marker

Then, if you have a chain where f1(x) calls f2(x) calls ... fn(x), and x is missing in the first call, fn() will see the first default for x among f1, f2, ..., f(n-1), if any exists.

One problem with this is when to declare an error. It may or may not be an error to have something evaluating to .Missing, depending on the context in which the evaluation occurs. I'm pretty sure that is a Bad Thing design-wise.

Another problem is that, currently, argument passing works by assigning a promise to the name, which is either a promise to evaluate the default expression or to evaluate the one in the actual argument.
With the change, you'd need to be able to evaluate defaults even when an expression has been passed as an argument, which probably means that promises need to be three-pronged structures (value, expression, default expression), or that you have to keep the formal argument list around in the evaluation frame.

> Best,
>
> luke
>
> On Fri, 27 Oct 2006, Paul Gilbert wrote:
>
> > Peter Dalgaard wrote:
> > > Paul Gilbert <[EMAIL PROTECTED]> writes:
> > > > > I.e., when x is missing in g, and g calls f(3,x), f will use its
> > > > > default value for x.
> > > >
> > > > Yes, that is the behaviour I am looking for. That is, f should do
> > > > what it normally would do if it were called with x missing.
> > >
> > > But if x has a default in g then that default should presumably be
> > > used?
> >
> > Yes. The value of x in g would get passed to f, default or otherwise.
> > If that value is something that indicates x is missing, then it should
> > be treated as if it is missing in f. This means f should use its
> > default value, rather than throw an error saying x is missing.
> >
> > > And what if x is given a value in the evaluation frame of g before it
> > > is used by f (which can happen, you know, even after the evaluation
> > > of f has begun)? Now imagine a longer chain of calls.
> > >
> > > I think what you're asking for is essentially dynamic scoping for
> > > missing arguments: you'd have to backtrack along the call chain to
> > > find the first instance where x is either given a value or has a
> > > default. This sounds messy.
> >
> > You understand this better than I do, but I don't think I am asking to
> > do this. Currently I think f looks back too far and finds x is missing
> > and g does not have a default value for x, so it throws an error. Why
> > can't f find its own default value for x?
>
> > > Because it's being told to use the value of the argument instead. I
> > > think. This stuff is treacherous. E.g. what would you expect from
> > > this?
> > >
> > >     g <- function(x) {f <- function(y) {x <<- 1; y} ; f(x)}
> > >     g()
>
> I'm confused. Neither f nor g have a default here, so I don't think this
> is related to what I'm talking about. Currently, in your example, f finds
> x with a value of 1, and I am not suggesting changing that. I'm only
> suggesting that if f finds x is missing, it should look at its own
> default argument.

You can add defaults if you want. The thing I was illustrating is that even though a missing x is passed for y, this is not checked until y is needed, by which time x might have become non-missing. The point is that even if you had

    g <- function(x) {f <- function(y=2) {x <<- 1; y} ; f(x)}

then once f is called, y is a promise to evaluate x in g's frame, and the default expression is lost.

--
   O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark      Ph:
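The chain behaviour under discussion can be reproduced with two toy functions (names f and g as in the thread):

```r
f <- function(y, x = 1) x        # f has a default for x
g <- function(x) f(3, x)         # g forwards its own x, default or not

g(5)   # 5: the supplied value wins
# g() stops with 'argument "x" is missing, with no default' --
# f is told to use g's x, so f's own default is never consulted:
err <- tryCatch(g(), error = function(e) conditionMessage(e))
```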
Re: [Rd] Idea: Testimonials
Gabor Grothendieck wrote:
> It occurred to me that we could have an optional file called TESTIMONIALS
> that comes with each package, which could be a list of short testimonials
> from users indicating success with that package, and possibly a few
> details of the successful application, e.g. it was used to analyse xyz
> data.
>
> Users would be encouraged to send in testimonials.
>
> This would give new users an immediate idea of what others have done with
> any given package.
>
> It would be up to the author and maintainer to solicit the testimonials.
> The author and maintainer could list their own applications in that file
> too if they wished.
>
> Another possibility is to place the testimonials in the vignette if there
> is one.

Hi,

Could that kind of thing go to the wiki?
http://wiki.r-project.org/rwiki/doku.php?id=packages:packages

Romain

--
*mangosolutions*
/data analysis that delivers/
Tel +44 1249 467 467
Fax +44 1249 467 468
[Rd] pcre library in R (PR#9319)
Full_Name: Earl Kinney
Version: 2.4.0
OS: Red Hat Enterprise Linux 4
Submission from: (NULL) (140.247.116.214)

At HMDC, we make use of the cross-building tools to build Windows R libraries on Linux. Recently, when attempting to update the cross-building environment to 2.4.0, we encountered a linker error. The error references an undefined symbol "__pcre_ucp_findprop". After walking through the code, there appears to be a combination of two problems:

1. Some of the PCRE code does not properly wrap ucp calls with appropriate "#ifdef SUPPORT_UCP" directives. I plan to report this to the PCRE developers.

2. In R, the implementations for this ucp function are removed, as R does not link against the entire pcre library.

I was able to get around this problem by including the following files directly from the PCRE distribution and adding them to the makefiles:

pcre_ucp_searchfuncs.c
ucptable.c

As this is such a trivial change, if you could please consider adding this to future releases of R with this release of PCRE, I would be most appreciative.

Regards,
Earl (Bob) Kinney
UNIX Systems Administrator
Harvard-MIT Data Center
[Rd] sources of 2.4.0 patched
Hi,

I have a question about the availability of tarballs for 2.4.0-patched.

I routinely compile and install fresh versions of R-patched as well as R-devel every few days. I do it mostly "for fun" but also to check for possible build problems or problems with my development environment (Win2000, cygwin plus all the necessary tools including MinGW compilers and MiKTeX).

I get the tarballs from ftp://ftp.stat.math.ethz.ch/Software/R and I noticed that the only R-patched there is dated 2006-10-03, while R-devel is up to 2006-10-25. Did, by any chance, a repository for the R-patched tarballs move someplace else, or is the 2006-10-03 version the latest one?

Thanks in advance,
Andy

__
Andy Jaworski
518-1-01 Process Laboratory
3M Corporate Research Laboratory
E-mail: [EMAIL PROTECTED]
Tel: (651) 733-6092
Fax: (651) 736-3122
Re: [Rd] Pb with .findInheritedMethods
Hi John,

John Chambers wrote:
> A problem with callNextMethod, which is caching an inherited method as if
> it was not inherited, causing confusion on the next search. Should be
> fairly easy to fix, but may be a while before I get time to do so.
>
> By the way, I hope your simplified example does not reflect what happens
> in the actual one.
>
>     callNextMethod(.Object)
>
> throws away all the ... arguments to new(), which rather defeats the
> purpose of having initialize() methods. Generally, callNextMethod()
> should get no arguments or all the arguments it needs, including ...
> See ?callNextMethod

Thanks for looking at this!

Yes, it is a simplified version of a real case, and here .Object is all that callNextMethod() needs, because the initialize method for an "A" object takes no argument other than .Object. More generally, I don't see what's wrong with not passing to callNextMethod all the arguments coming from the call to new:

    setClass("A", representation(toto="integer"))
    setMethod("initialize", "A",
              function(.Object, toto0) {.Object@toto <- as.integer(toto0); .Object})
    new("A", 45.1)

    setClass("Ab", contains="A")
    setMethod("initialize", "Ab",
              function(.Object, x, y) callNextMethod(.Object, x*y+1))
    new("Ab", 5, 2)

Regards,
H.
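The rule John states ("no arguments, or all the arguments it needs, including ...") can be sketched with a hypothetical class of my own (not from the thread); forwarding ... keeps arguments given to new() from being silently dropped on the way to the default initialize method:

```r
library(methods)

# Hypothetical class: initialize forwards everything via ... so that
# slot values passed to new() survive the callNextMethod() hop.
setClass("Base", representation(n = "integer"))
setMethod("initialize", "Base", function(.Object, ...) {
  callNextMethod(.Object, ...)   # all arguments, including ...
})
b <- new("Base", n = 3L)
b@n  # 3
```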
[Rd] Idea: Testimonials
It occurred to me that we could have an optional file called TESTIMONIALS that comes with each package, which could be a list of short testimonials from users indicating success with that package, and possibly a few details of the successful application, e.g. it was used to analyse xyz data.

Users would be encouraged to send in testimonials.

This would give new users an immediate idea of what others have done with any given package.

It would be up to the author and maintainer to solicit the testimonials. The author and maintainer could list their own applications in that file too if they wished.

Another possibility is to place the testimonials in the vignette if there is one.
Re: [Rd] as.missing
There are a lot of subtle issues involved here. We should think through carefully exactly what semantics we want for missing value propagation before making any changes. Making usage easy at top level is generally a good thing, but for usage within functions, eliminating error messages by making more automated choices may not be a good thing--it may mask real mistakes.

There are also issues with the internal implementation of missing arguments we need to think carefully about before exposing them at the R level. The fact that internally there is a missing argument token does not mean it is a good thing to expose that detail at the R level (and it already is in call objects and creates some issues with computing on the language).

Like I said, it's complicated, so let's not leap before we look carefully.

Best,

luke

On Fri, 27 Oct 2006, Paul Gilbert wrote:

> Peter Dalgaard wrote:
> > Paul Gilbert <[EMAIL PROTECTED]> writes:
> > > > I.e., when x is missing in g, and g calls f(3,x), f will use its
> > > > default value for x.
> > >
> > > Yes, that is the behaviour I am looking for. That is, f should do
> > > what it normally would do if it were called with x missing.
> >
> > But if x has a default in g then that default should presumably be
> > used?
>
> Yes. The value of x in g would get passed to f, default or otherwise. If
> that value is something that indicates x is missing, then it should be
> treated as if it is missing in f. This means f should use its default
> value, rather than throw an error saying x is missing.
>
> > And what if x is given a value in the evaluation frame of g before it
> > is used by f (which can happen, you know, even after the evaluation of
> > f has begun)? Now imagine a longer chain of calls.
> >
> > I think what you're asking for is essentially dynamic scoping for
> > missing arguments: you'd have to backtrack along the call chain to find
> > the first instance where x is either given a value or has a default.
> > This sounds messy.
>
> You understand this better than I do, but I don't think I am asking to do
> this. Currently I think f looks back too far and finds x is missing and g
> does not have a default value for x, so it throws an error. Why can't f
> find its own default value for x?
>
> > Because it's being told to use the value of the argument instead. I
> > think. This stuff is treacherous. E.g. what would you expect from this?
> >
> >     g <- function(x) {f <- function(y) {x <<- 1; y} ; f(x)}
> >     g()
>
> I'm confused. Neither f nor g have a default here, so I don't think this
> is related to what I'm talking about. Currently, in your example, f finds
> x with a value of 1, and I am not suggesting changing that. I'm only
> suggesting that if f finds x is missing, it should look at its own
> default argument.

--
Luke Tierney
Chair, Statistics and Actuarial Science
Ralph E. Wareham Professor of Mathematical Sciences
University of Iowa                    Phone: 319-335-3386
Department of Statistics and          Fax:   319-335-3017
   Actuarial Science
241 Schaeffer Hall                    email: [EMAIL PROTECTED]
Iowa City, IA 52242                   WWW: http://www.stat.uiowa.edu
Re: [Rd] as.missing
On 10/26/06, Gabor Grothendieck <[EMAIL PROTECTED]> wrote:
> This is what I get:
>
> > as.missing <- force
> > f <- function(y, x=1) {cat(missing(x)) ; x}
> > g <- function(x=as.missing()) f(3,x)
> > g()
> FALSEError in as.missing() : argument "x" is missing, with no default
> > traceback()
> 3: as.missing()
> 2: f(3, x)
> 1: g()
>
> so g did in fact pass the missing to f, and it was only f that blew up,
> not g. If that's not what you want please explain.
>
> On 10/26/06, Paul Gilbert <[EMAIL PROTECTED]> wrote:
> > I don't see how this solves the problem.
> >
> > > as.missing <- force
> > > f <- function(y, x=1) {cat(missing(x)) ; x}
> > > g <- function(x) f(3,x)
> > > g(1)
> > FALSE[1] 1
> > > g()
> > TRUEError in f(3, x) : argument "x" is missing, with no default
> >
> > I think I still have to put all the logic in g() to figure out if the
> > argument is missing, rather than the nice clean solution of just
> > passing the argument along to the function it calls. How does this
> > differ from the problem I already have when I specify the argument as
> > NULL and do all the checking in g?
> >
> > Paul
> >
> > Gabor Grothendieck wrote:
> > > You can do it like this:
> > >
> > > > as.missing <- force
> > > > g <- function(x = as.missing()) missing(x)
> > > > g(3)
> > > [1] FALSE
> > > > g()
> > > [1] TRUE
> > >
> > > On 10/24/06, Paul Gilbert <[EMAIL PROTECTED]> wrote:
> > > > (I'm not sure if this is a request for a feature, or another
> > > > instance where a feature has eluded me for many years.)
> > > >
> > > > Often I have a function which calls other functions, and may often
> > > > use the default arguments to those functions, but needs the
> > > > capability to pass along non-default choices. I usually do this
> > > > with some variation on
> > > >
> > > >     foo <- function(x, foo2Args=NULL or a list(foo2defaults),
> > > >                     foo3Args=NULL or a list(foo3defaults))
> > > >
> > > > and then have logic to check for NULL, or use the list in
> > > > combination with do.call. It is also possible to do this with ...,
> > > > but it always seems a bit dangerous passing all the unnamed
> > > > arguments along to all the functions being called, especially when
> > > > I always seem to be calling functions that have similar arguments
> > > > (maxit, eps, start, frequency, etc).
> > > >
> > > > It is a situation I have learned to live with, but one of my
> > > > co-maintainers just pointed out to me that there should be a good
> > > > way to do this in R. Perhaps there is something else I have missed
> > > > all these years? Is there a way to do this cleanly? It would be
> > > > nice to have something like
> > > >
> > > >     foo <- function(x, foo2Args=as.missing(), foo3Args=as.missing())
> > > >
> > > > then the calls to foo2 and foo3 could specify foo2Args and
> > > > foo3Args, but these would get treated as if they were missing,
> > > > unless they are given other values.
> > > >
> > > > Paul Gilbert

How about using this is.missing() instead of missing():

    is.missing <- function(x) {
        mc <- match.call()
        mc[[1]] <- as.name("missing")
        missing <- eval.parent(mc)

        xc <- deparse(substitute(x))
        fo <- formals(sys.function(-1))
        has.default <- !identical(fo[[xc]], alist(dummy=)[[1]])

        if (has.default) {
            mc <- match.call(sys.function(-1), call = sys.call(-1))
            miss <- !(xc %in% names(mc))
            miss <- miss || missing
            if (missing)
                eval.parent(substitute(x <- default, list(default = fo[[xc]])))
            p <- parent.frame()
            miss
        } else missing
    }

> g1
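The NULL-default idiom Paul describes works roughly like this (foo and foo2 are his hypothetical names; the maxit argument is a stand-in): the wrapper passes options along only when the caller actually supplied them.

```r
foo2 <- function(x, maxit = 10) maxit    # stand-in for the callee

foo <- function(x, foo2Args = NULL) {
  if (is.null(foo2Args)) foo2(x)                 # use foo2's own defaults
  else do.call(foo2, c(list(x), foo2Args))       # pass non-default choices
}

foo(1)                    # 10
foo(1, list(maxit = 25))  # 25
```

The downside, as Paul notes, is that every wrapper must carry this NULL-checking logic itself.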
Re: [Rd] as.missing
Peter Dalgaard wrote:
> Paul Gilbert <[EMAIL PROTECTED]> writes:
> > Peter Dalgaard wrote:
> > > Paul Gilbert <[EMAIL PROTECTED]> writes:
> > > > > I.e., when x is missing in g, and g calls f(3,x), f will use its
> > > > > default value for x.
> > > >
> > > > Yes, that is the behaviour I am looking for. That is, f should do
> > > > what it normally would do if it were called with x missing.
> > >
> > > But if x has a default in g then that default should presumably be
> > > used?
> >
> > Yes. The value of x in g would get passed to f, default or otherwise.
> > If that value is something that indicates x is missing, then it should
> > be treated as if it is missing in f. This means f should use its
> > default value, rather than throw an error saying x is missing.
> >
> > > And what if x is given a value in the evaluation frame of g before it
> > > is used by f (which can happen, you know, even after the evaluation
> > > of f has begun)? Now imagine a longer chain of calls.
> > >
> > > I think what you're asking for is essentially dynamic scoping for
> > > missing arguments: you'd have to backtrack along the call chain to
> > > find the first instance where x is either given a value or has a
> > > default. This sounds messy.
> >
> > You understand this better than I do, but I don't think I am asking to
> > do this. Currently I think f looks back too far and finds x is missing
> > and g does not have a default value for x, so it throws an error. Why
> > can't f find its own default value for x?
>
> Because it's being told to use the value of the argument instead. I
> think.
>
> This stuff is treacherous. E.g. what would you expect from this?
>
>     g <- function(x) {f <- function(y) {x <<- 1; y} ; f(x)}
>     g()

I'm confused. Neither f nor g have a default here, so I don't think this is related to what I'm talking about. Currently, in your example, f finds x with a value of 1, and I am not suggesting changing that. I'm only suggesting that if f finds x is missing, it should look at its own default argument.
Re: [Rd] as.missing
Paul Gilbert <[EMAIL PROTECTED]> writes:

> Peter Dalgaard wrote:
> > Paul Gilbert <[EMAIL PROTECTED]> writes:
> > > > I.e., when x is missing in g, and g calls f(3,x), f will use its
> > > > default value for x.
> > >
> > > Yes, that is the behaviour I am looking for. That is, f should do
> > > what it normally would do if it were called with x missing.
> >
> > But if x has a default in g then that default should presumably be
> > used?
>
> Yes. The value of x in g would get passed to f, default or otherwise. If
> that value is something that indicates x is missing, then it should be
> treated as if it is missing in f. This means f should use its default
> value, rather than throw an error saying x is missing.
>
> > And what if x is given a value in the evaluation frame of g before it
> > is used by f (which can happen, you know, even after the evaluation of
> > f has begun)? Now imagine a longer chain of calls.
> >
> > I think what you're asking for is essentially dynamic scoping for
> > missing arguments: you'd have to backtrack along the call chain to find
> > the first instance where x is either given a value or has a default.
> > This sounds messy.
>
> You understand this better than I do, but I don't think I am asking to
> do this. Currently I think f looks back too far and finds x is missing
> and g does not have a default value for x, so it throws an error. Why
> can't f find its own default value for x?

Because it's being told to use the value of the argument instead. I think.

This stuff is treacherous. E.g. what would you expect from this?

    g <- function(x) {f <- function(y) {x <<- 1; y} ; f(x)}
    g()

--
   O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - ([EMAIL PROTECTED])                     FAX: (+45) 35327907
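Running Peter's teaser shows what lazy evaluation does here: y is a promise for x in g's frame, and by the time y is forced, f's superassignment has already filled x in, so no missing-argument error occurs.

```r
g <- function(x) { f <- function(y) { x <<- 1; y }; f(x) }
g()  # 1 -- even though g was called without x
```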
Re: [Rd] as.missing
Duncan Murdoch wrote:
> On 10/27/2006 10:24 AM, Peter Dalgaard wrote:
> > Paul Gilbert <[EMAIL PROTECTED]> writes:
> > > > I.e., when x is missing in g, and g calls f(3,x), f will use its
> > > > default value for x.
> > >
> > > Yes, that is the behaviour I am looking for. That is, f should do
> > > what it normally would do if it were called with x missing.
> >
> > But if x has a default in g then that default should presumably be
> > used? And what if x is given a value in the evaluation frame of g
> > before it is used by f (which can happen, you know, even after the
> > evaluation of f has begun)? Now imagine a longer chain of calls.
> >
> > I think what you're asking for is essentially dynamic scoping for
> > missing arguments: you'd have to backtrack along the call chain to find
> > the first instance where x is either given a value or has a default.
> > This sounds messy.
>
> I've been meaning to look at the code to see how this is handled now,
> but haven't had a chance yet. I would guess at some level there's a test
> something like
>
>     if (name not in actualarglist) treat as missing
>     else treat as present
>
> I think Paul's suggestion could be implemented by making this just a bit
> more complicated:
>
>     if (name not in actualarglist || get(name) == SpecialMissingValue)
>         treat as missing
>     else treat as present
>
> where SpecialMissingValue is what as.missing() returns.
>
> So if my guess is right, this would be fairly easy to implement. But
> it's also possible that this test shows up implicitly in many places, in
> which case it would be a lot messier.

Great! Let's hope it is not messy.

BTW, I don't have any special reason to think as.missing() should be a function. It could be a special defined constant.

> Duncan Murdoch
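Short of a change in the evaluator, the sentinel idea can be mimicked at the R level; the MISSING marker and the inherits() check below are hypothetical, not an existing API:

```r
MISSING <- structure(list(), class = "as.missing")  # hypothetical marker

f <- function(y, x = 1) {
  if (!missing(x) && inherits(x, "as.missing")) x <- 1  # treat as missing
  x
}
g <- function(x = MISSING) f(3, x)

g()    # 1  -- f falls back to its default
g(42)  # 42
```

The obvious drawback, and the reason the thread wants evaluator support, is that every callee must know about the marker and repeat the fallback to its own default.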
Re: [Rd] What to do with a inconsistency in rank() that's in S+ and R ever since?
On Fri, Oct 27, 2006 at 11:14:25AM +0200, Jens Oehlschlägel wrote:
> rather, but in fact NAs seem to be always treated as ties.method =
> "first". I have no idea in which situation one could desire
> e.g. ties.method = "average" except for NAs!?

Interesting. I was aware of the S-Plus vs. R difference, but I didn't
realize that it appears to be because R rank() ignores
ties.method="average" for NA values.

> I am aware that the prototype behaves like this and R ever since
> behaves like this, however to me this appears very unfortunate. In
> order not to 'break' existing code, what about adding ties.methods

If you only care about ranking integers and floating point numbers,
it's pretty straightforward to take the S-Plus implementation of
rank(), call it my.rank(), and use it in both R and S-Plus. (Since the
R rank() makes calls to .Internal(), you can't re-use its
implementation in S-Plus.) Note though that the S-Plus-style my.rank()
will still sort strings differently in R than in S-Plus. I never
looked into why.

Some old notes I have on this issue: R and S-Plus rank() treat NAs
differently (which can magnify other floating point differences):

  # S-Plus 6.2.1:                 # R 2.1.0:
  > rank(1:5)                     > rank(1:5)
  [1] 1 2 3 4 5                   [1] 1 2 3 4 5
  > rank(c(1,2,NA,4,NA))          > rank(c(1,2,NA,4,NA))
  [1] 1.0 2.0 4.5 3.0 4.5         [1] 1 2 4 3 5
  > rank(c(1,NA,3,4,NA))          > rank(c(1,NA,3,4,NA))
  [1] 1.0 4.5 2.0 3.0 4.5         [1] 1 4 2 3 5
  > rank(c(1,NA,3))               > rank(c(1,NA,3))
  [1] 1 3 2                       [1] 1 3 2
  > rank(c(NA,NA,3))              > rank(c(NA,NA,3))
  [1] 2.5 2.5 1.0                 [1] 2 3 1

--
Andrew Piskorski <[EMAIL PROTECTED]>
http://www.piskorski.com/
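The S-Plus-style NA handling shown in the notes above can be approximated on top of R's own rank(); a minimal sketch (my.rank() is an illustrative name, not an existing API):

```r
# Sketch: place NAs last, but give the NA block a shared "average" rank,
# as S-Plus does, instead of ties.method = "first" ranks.
my.rank <- function(x, na.last = TRUE) {
  r <- rank(x, na.last = na.last, ties.method = "first")
  nas <- is.na(x)
  if (isTRUE(na.last) && any(nas)) {
    r[nas] <- mean(r[nas])  # treat the NA block as one tie group
  }
  r
}

my.rank(c(1, 2, NA, 4, NA))  # 1.0 2.0 4.5 3.0 4.5, matching the S-Plus output
```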
Re: [Rd] as.missing
Peter Dalgaard wrote:
> Paul Gilbert <[EMAIL PROTECTED]> writes:
>>> I.e., when x is missing in g, and g calls f(3,x), f will use its
>>> default value for x.
>>
>> Yes, that is the behaviour I am looking for. That is, f should do what
>> it normally would do if it were called with x missing.
>
> But if x has a default in g then that default should presumably be
> used?

Yes. The value of x in g would get passed to f, default or otherwise. If
that value is something that indicates x is missing, then it should be
treated as if it is missing in f. This means f should use its default
value, rather than throw an error saying x is missing.

> And what if x is given a value in the evaluation frame of g
> before it is used by f (which can happen, you know, even after the
> evaluation of f has begun)? Now imagine a longer chain of calls.
>
> I think what you're asking for is essentially dynamic scoping for
> missing arguments: you'd have to backtrack along the call chain to
> find the first instance where x is either given a value or has a
> default. This sounds messy.

You understand this better than I do, but I don't think I am asking to
do this. Currently I think f looks back too far and finds x is missing
and g does not have a default value for x, so it throws an error. Why
can't f find its own default value for x?
Re: [Rd] Error: invalid multibyte string
On Fri, 27 Oct 2006, Henrik Bengtsson wrote:
> In Section "Package subdirectories" in "Writing R Extensions" [2.4.0
> (2006-10-10)] it says:
>
> "Only ASCII characters (and the control characters tab, formfeed, LF
> and CR) should be used in code files. Other characters are accepted in
> comments, but then the comments may not be readable in e.g. a UTF-8
> locale. Non-ASCII characters in object names will normally [1] fail
> when the package is installed. Any byte will be allowed [2] in a
> quoted character string (but \u escapes should not be used), but
> non-ASCII character strings may not be usable in some locales and may
> display incorrectly in others.", where the footnote [2] reads "It is
> good practice to encode them as octal or hex escape sequences".
>
> (Note: ASCII refers (correctly) to the 7-bit ASCII [0-127] and none of
> the 8-bit ASCII extensions [128-255].)
>
> According to the sentence about quoted strings, the following R/*.R code
> should still be valid:
>
>   pads <- sapply(0:64, FUN=function(x) paste(rep("\xFF", x), collapse=""));

That looks like it should be valid (at least according to the
documentation), even though it won't run usefully in UTF-8 locales.

What you wrote before was:

>>> On Thu, 26 Oct 2006, Henrik Bengtsson wrote:
>>>> I'm observing the following on different platforms:
>>>>
>>>>> parse(text='"\\x7F"')
>>>> expression("\177")
>>>>> parse(text='"\\x80"')
>>>> Error: invalid multibyte string

and that error *is* correct behaviour -- you can't parse() something
that isn't a valid character string.

-thomas
Re: [Rd] as.missing
On 10/27/2006 10:24 AM, Peter Dalgaard wrote:
> Paul Gilbert <[EMAIL PROTECTED]> writes:
>
>>> I.e., when x is missing in g, and g calls f(3,x), f will use its
>>> default value for x.
>>
>> Yes, that is the behaviour I am looking for. That is, f should do what
>> it normally would do if it were called with x missing.
>
> But if x has a default in g then that default should presumably be
> used? And what if x is given a value in the evaluation frame of g
> before it is used by f (which can happen, you know, even after the
> evaluation of f has begun)? Now imagine a longer chain of calls.
>
> I think what you're asking for is essentially dynamic scoping for
> missing arguments: you'd have to backtrack along the call chain to
> find the first instance where x is either given a value or has a
> default. This sounds messy.

I've been meaning to look at the code to see how this is handled now,
but haven't had a chance yet. I would guess at some level there's a test
something like

  if (name not in actualarglist) treat as missing
  else treat as present

I think Paul's suggestion could be implemented by making this just a bit
more complicated:

  if (name not in actualarglist || get(name) == SpecialMissingValue)
      treat as missing
  else treat as present

where SpecialMissingValue is what as.missing() returns.

So if my guess is right, this would be fairly easy to implement. But
it's also possible that this test shows up implicitly in many places, in
which case it would be a lot messier.

Duncan Murdoch
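The fall-back behaviour Paul is asking for can already be spelled out by hand at the R level, which may clarify what the proposed SpecialMissingValue would automate (a sketch of the desired semantics, not the proposed implementation):

```r
# Sketch: g explicitly forwards "missingness" instead of forwarding the
# argument, so f falls back to its own default. as.missing() would let
# g write f(3, x) unconditionally and still get this behaviour.
f <- function(y, x = 1) { if (missing(x)) cat("f used its default\n"); x }
g <- function(x) if (missing(x)) f(3) else f(3, x)

g()    # prints "f used its default", returns 1
g(99)  # returns 99
```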
Re: [Rd] Overloading functions
On Fri, 27 Oct 2006 15:54:40 +0100, Tom McCallum <[EMAIL PROTECTED]> wrote:

> On Fri, 27 Oct 2006 14:49:15 +0100, Paul Roebuck <[EMAIL PROTECTED]>
> wrote:
>
>> On Fri, 27 Oct 2006, Tom McCallum wrote:
>>
>>> I have a function f which does something using a function g.
>>> Function f is in a library and g has a default stub in the library
>>> but will mainly be overloaded in a later R script. [...]
>>>
>>> If I call f() then I get "Original function g". But I want to
>>> overload g, so I do the following in the file newg.R: [...]
>>> and call f( newGsource="newg.R" ) but I still get "Original
>>> function g".
>>>
>>> Any suggestions?
>>
>> ?environment
>
> Thanks for that, have almost figured out how to do it, have got my
> namespace, but when I "assign" the new value I get "cannot change
> value of a locked binding". Is there any way to say that a particular
> item in a package can be overridden using assign?
>
> I assume when I export a function in the NAMESPACE file it locks the
> value to the name. So I assume it is here I need to change something -
> if this is even possible to do.
>
> Cheers
>
> Tom

Found the use of assignInNamespace - done! Thanks for your help.
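The workaround Tom arrived at can be sketched as follows; the package name "P" and the replacement body come from the earlier example, and assignInNamespace() is intended for development use, since it deliberately bypasses the namespace lock:

```r
# Replace the exported stub g inside package P's (locked) namespace.
g.new <- function() cat("New function g\n")
assignInNamespace("g", g.new, ns = "P")
P::g()  # would now print "New function g"
```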
[Rd] RMySQL and stored procedures
Hi,

I use RMySQL to connect to a MySQL server where I have a couple of
stored procedures. I know that support for stored procedures "is not
yet implemented", but I was trying to get around the problem by
connecting like this:

  drv <- dbDriver("MySQL")
  con <- dbConnect(drv, user=MyUser, password=MyPasswd, dbname=MyDBName,
                   host=MyHost, client.flag="196608")

where 196608 is the combination of CLIENT_MULTI_RESULTS and
CLIENT_MULTI_STATEMENTS mentioned in the mysql_real_connect()
documentation. Without these flags stored procedure calls do not run at
all. I also fetch the results in a loop, as suggested in the doc:

  doStoredProcedureFetch <- function( connection, sql ) {
    res <- dbSendQuery( connection, sql )
    data <- fetch( res, n=-1 )
    repeat {
      fetch( res, n=-1 )
      if( dbHasCompleted( res ) ) break;
    }
    data
  }

However, sometimes I get the error message:

  RS-DBI driver: (could not run statement: Lost connection to MySQL
  server during query)

and the connection has indeed been lost. Sometimes it does not happen
and I may run yet another query. It looks like it happens with specific
queries only: plain SELECTs run OK, as do some CALLs of procedures.
Some don't. I'm seriously confused :|

Could you give any hints why it happens, how to fix it, and what the
problem with stored procedures is?

Cheers,
Michal
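One plausible culprit is an unreleased result set: a CALL produces an extra status result, and leaving it pending when the next statement starts can kill the connection. A hedged variant of the fetch loop that drains and explicitly clears the result (a sketch only; exact multi-result behaviour depends on the RMySQL and MySQL versions in use):

```r
# Variant of doStoredProcedureFetch that drains remaining rows and then
# clears the result set before the connection is reused.
doStoredProcedureFetch <- function(connection, sql) {
  res <- dbSendQuery(connection, sql)
  data <- fetch(res, n = -1)
  while (!dbHasCompleted(res)) {
    fetch(res, n = -1)        # drain any remaining rows/results
  }
  dbClearResult(res)          # release the result set explicitly
  data
}
```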
Re: [Rd] Overloading functions
On Fri, 27 Oct 2006 14:49:15 +0100, Paul Roebuck <[EMAIL PROTECTED]> wrote:

> On Fri, 27 Oct 2006, Tom McCallum wrote:
>
>> I have a function f which does something using a function g.
>> Function f is in a library and g has a default stub in the library
>> but will mainly be overloaded in a later R script. For example:
>>
>> ## In a compiled package 'P' #
>> g <- function() {
>>   cat("Original function g");
>> }
>>
>> f <- function( newGsource=NULL ) {
>>   if( is.null(newGsource) == FALSE ) {
>>     source( newGsource );  # load new function g
>>   }
>>   g();
>>   return(1);
>> }
>> #
>>
>> If I call f() then I get "Original function g".
>>
>> But I want to overload g so I do the following in the file newg.R:
>>
>> ### CONTENTS of newg.R ##
>> g <- function() {
>>   cat("New function g in newg.R");
>> }
>> END CONTENTS ###
>>
>> and call f( newGsource="newg.R" ) but I still get "Original function g".
>>
>> Any suggestions?
>
> ?environment
>
> --
> SIGSIG -- signature too long (core dumped)

Thanks for that, have almost figured out how to do it, have got my
namespace, but when I "assign" the new value I get "cannot change value
of a locked binding". Is there any way to say that a particular item in
a package can be overridden using assign?

I assume when I export a function in the NAMESPACE file it locks the
value to the name. So I assume it is here I need to change something -
if this is even possible to do.

Cheers

Tom

--
---
Tom McCallum
Re: [Rd] as.missing
Paul Gilbert <[EMAIL PROTECTED]> writes:

>> I.e., when x is missing in g, and g calls f(3,x), f will use its
>> default value for x.
>
> Yes, that is the behaviour I am looking for. That is, f should do what
> it normally would do if it were called with x missing.

But if x has a default in g then that default should presumably be
used? And what if x is given a value in the evaluation frame of g
before it is used by f (which can happen, you know, even after the
evaluation of f has begun)? Now imagine a longer chain of calls.

I think what you're asking for is essentially dynamic scoping for
missing arguments: you'd have to backtrack along the call chain to
find the first instance where x is either given a value or has a
default. This sounds messy.

--
 O__   Peter Dalgaard               Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics   PO Box 2099, 1014 Cph. K
(*) \(*) -- University of Copenhagen, Denmark
~~ - ([EMAIL PROTECTED])            Ph: (+45) 35327918  FAX: (+45) 35327907
Re: [Rd] as.missing
Bjørn-Helge Mevik wrote:
> Gabor Grothendieck wrote:
>
>> This is what I get:
>>
>>> as.missing <- force
>>> f <- function(y, x=1) {cat(missing(x)) ; x}
>>> g <- function(x=as.missing()) f(3,x)
>>> g()
>> FALSEError in as.missing() : argument "x" is missing, with no default
>>
>>> traceback()
>> 3: as.missing()
>> 2: f(3, x)
>> 1: g()
>>
>> so g did in fact pass the missing to f and it was only f that blew up,
>> not g. If that's not what you want please explain.
>
> I _think_ what he wants is:
>
>> g()
> TRUE[1] 1
>
> I.e., when x is missing in g, and g calls f(3,x), f will use its
> default value for x.

Yes, that is the behaviour I am looking for. That is, f should do what
it normally would do if it were called with x missing.
Re: [Rd] S4 pb in R 2.5.0
Thanks for the typo, it might have taken a while to find this one
otherwise!

In 2.5.0 and 2.4-patched, the subclass information attempts to be
complete. That subclass information needs to be removed from all
superclasses when the class is redefined, and currently isn't. The
particular superclass link to "vector" is what makes the code think
that "A" is meant to be the data part of the new class. When you
redefined "A" to no longer contain "integer", that link should have
gone away.

Looks fairly straightforward to fix but, as with your other bug report,
it may not happen for a few days.

Herve Pages wrote:
> Hi,
>
> When playing interactively with the S4 system, I've tried to define
> the following class:
>
> > setClass("A", representation("integer"))
> [1] "A"
> > showClass("A")
>
> Slots:
>
> Name:  .Data
> Class: integer
>
> Extends:
> Class "integer", from data part
> Class "vector", by class "integer", distance 2
> Class "numeric", by class "integer", distance 2
>
> then I realized that I made a typo (I don't want to extend the
> "integer" type) so I redefined class A:
>
> > setClass("A", representation(toto="integer"))
> > showClass("A")
>
> Slots:
>
> Name:  toto
> Class: integer
>
> Now if I try to extend A:
>
> > setClass("Aa", representation("A"))
> Error in reconcilePropertiesAndPrototype(name, slots, prototype,
>   superClasses, :
>   "A" is not eligible to be the data part of another class
>   (must be a basic class or a virtual class with no slots)
>
> Surprising. And even more surprising: I don't get this if I don't
> try to define class A twice, or if I invert the order of the 2 calls
> to setClass("A", ...)!
>
> > sessionInfo()
> R version 2.4.0 (2006-10-03)
> x86_64-unknown-linux-gnu
>
> locale:
> LC_CTYPE=en_US;LC_NUMERIC=C;LC_TIME=en_US;LC_COLLATE=en_US;LC_MONETARY=en_US;LC_MESSAGES=en_US;LC_PAPER=en_US;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_US;LC_IDENTIFICATION=C
>
> attached base packages:
> [1] "methods"   "stats"     "graphics"  "grDevices" "utils"
>     "datasets"
> [7] "base"
>
> No problem with R-2.4.0.
>
> Thanks,
> H.
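Until a fix lands, a workaround consistent with the diagnosis above is to remove the stale class definition before redefining it, so the leftover superclass links are dropped as well (a sketch; not tested against the affected versions):

```r
# Drop the old definition of A (and its stale links to "vector")
# before redefining A without the data part.
removeClass("A")
setClass("A", representation(toto = "integer"))
setClass("Aa", representation("A"))  # extending A should now be accepted
```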
Re: [Rd] Pb with .findInheritedMethods
A problem with callNextMethod, which is caching an inherited method as
if it were not inherited, causing confusion on the next search. Should
be fairly easy to fix, but it may be a while before I get time to do so.

By the way, I hope your simplified example does not reflect what happens
in the actual one. callNextMethod(.Object) throws away all the ...
arguments to new(), which rather defeats the purpose of having
initialize() methods. Generally, callNextMethod() should get no
arguments or all the arguments it needs, including ... See
?callNextMethod

Herve Pages wrote:
> Hi again,
>
> This happens with R-2.4.0 and R-devel.
>
> Cheers,
> H.
>
> Herve Pages wrote:
>
>> Hi again,
>>
>> Here is a very simplified version of a class hierarchy defined in
>> the Biobase package (Bioconductor). I post here because this seems
>> to be an S4-related problem:
>>
>> setClass("A", representation(name="character"))
>> setMethod("initialize", "A",
>>           function(.Object) { .Object@name <- "I'm an A"; .Object })
>>
>> setClass("Ab", contains="A")
>> setMethod("initialize", "Ab", function(.Object) callNextMethod(.Object))
>>
>> setClass("Abc", contains="Ab")
>>
>> setClass("Abcd", contains=c("Abc"))
>>
>> Now if I do:
>>
>> tmp1 <- new("Abc")
>> tmp2 <- new("Abcd")
>>
>> I get the following warning:
>>
>> Warning message:
>> Ambiguous method selection for "initialize", target "Abcd" (the
>> first of the signatures shown will be used)
>>     Abc
>>     Ab
>> in: .findInheritedMethods(classes, fdef, mtable)
>>
>> I don't really understand why .findInheritedMethods is complaining
>> here... And if I don't do 'tmp1 <- new("Abc")' before I do
>> 'tmp2 <- new("Abcd")', then I don't get the warning anymore!
>>
>> Does anybody have an explanation for this?
>>
>> Thanks,
>> H.
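The point about forwarding ... can be made concrete; for the example classes in the report above, the advice amounts to writing the initialize method like this (a sketch of the recommended pattern):

```r
# Forward all arguments, including ..., so that e.g. new("Ab", name = "x")
# still reaches the parent class's initialize method.
setMethod("initialize", "Ab",
          function(.Object, ...) callNextMethod(.Object, ...))
```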
Re: [Rd] all.names() and all.vars(): sorting order of functions' return vector
Rewrite

  expr2 <- expression(x3 <- 0.5 * x1 - 0.7 * x2)

like this:

  expr2 <- expression(`<-`(x3, `-`(`*`(0.5, x1), `*`(0.7, x2))))

and it becomes clear.

On 10/27/06, Pfaff, Bernhard Dr. <[EMAIL PROTECTED]> wrote:
> Dear list-subscriber,
>
> in the process of writing a general code snippet to extract
> coefficients in an expression (in the example below: 0.5 and -0.7),
> I stumbled over the following peculiar (at least peculiar to me :-) )
> sorting behaviour of the function all.names():
>
> > expr1 <- expression(x3 = 0.5 * x1 - 0.7 * x2)
> > all.names(expr1)
> [1] "-"  "*"  "x1" "*"  "x2"
> > all.vars(expr1)
> [1] "x1" "x2"
>
> > expr2 <- expression(x3 <- 0.5 * x1 - 0.7 * x2)
> > all.names(expr2)
> [1] "<-" "x3" "-"  "*"  "x1" "*"  "x2"
> > all.vars(expr2)
> [1] "x3" "x1" "x2"
>
> > expr3 <- expression(x3 ~ 0.5 * x1 - 0.7 * x2)
> > all.names(expr3)
> [1] "~"  "x3" "-"  "*"  "x1" "*"  "x2"
> > all.vars(expr3)
> [1] "x3" "x1" "x2"
>
> For all.names(expr2) and all.names(expr3) I would expect something like:
>
> [1] "x3" "<-" "*"  "x1" "-"  "*"  "x2"
> [1] "x3" "~"  "*"  "x1" "-"  "*"  "x2"
>
> By which kind of rule is the sorting order of the returned character
> vector produced? Any help or pointers is much appreciated.
>
> Best,
> Bernhard
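In other words, all.names() simply reports a depth-first (prefix) walk of the parse tree; constants like 0.5 are skipped because they are not names:

```r
e <- quote(x3 <- 0.5 * x1 - 0.7 * x2)
all.names(e)  # "<-" "x3" "-" "*" "x1" "*" "x2"  (prefix order)
all.vars(e)   # "x3" "x1" "x2"                   (same walk, variables only)
```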
[Rd] Overloading functions
Hi Everyone,

I have a function f which does something using a function g. Function f
is in a library and g has a default stub in the library but will mainly
be overloaded in a later R script. For example:

## In a compiled package 'P' #
g <- function() {
  cat("Original function g");
}

f <- function( newGsource=NULL ) {
  if( is.null(newGsource) == FALSE ) {
    source( newGsource );  # load new function g
  }
  g();
  return(1);
}
#

If I call f() then I get "Original function g".

But I want to overload g, so I do the following in the file newg.R:

### CONTENTS of newg.R ##
g <- function() {
  cat("New function g in newg.R");
}
END CONTENTS ###

and call f( newGsource="newg.R" ) but I still get "Original function g".

Any suggestions?

Tom

--
---
Tom McCallum
WWW: http://www.tom-mccallum.com
Tel: 0131-4783393
Mobile: 07866-470257
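An alternative that sidesteps the source()/environment issue entirely is to pass the replacement function itself as an argument; a sketch reusing the names from the question (new.g is an illustrative parameter name):

```r
g <- function() cat("Original function g\n")  # the package's stub

f <- function(new.g = g) {  # default to the package's g
  new.g()
  return(1)
}

f()                                    # prints "Original function g"
f(function() cat("New function g\n"))  # prints "New function g"
```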
[Rd] all.names() and all.vars(): sorting order of functions' return vector
Dear list-subscriber,

in the process of writing a general code snippet to extract coefficients
in an expression (in the example below: 0.5 and -0.7), I stumbled over
the following peculiar (at least peculiar to me :-) ) sorting behaviour
of the function all.names():

> expr1 <- expression(x3 = 0.5 * x1 - 0.7 * x2)
> all.names(expr1)
[1] "-"  "*"  "x1" "*"  "x2"
> all.vars(expr1)
[1] "x1" "x2"

> expr2 <- expression(x3 <- 0.5 * x1 - 0.7 * x2)
> all.names(expr2)
[1] "<-" "x3" "-"  "*"  "x1" "*"  "x2"
> all.vars(expr2)
[1] "x3" "x1" "x2"

> expr3 <- expression(x3 ~ 0.5 * x1 - 0.7 * x2)
> all.names(expr3)
[1] "~"  "x3" "-"  "*"  "x1" "*"  "x2"
> all.vars(expr3)
[1] "x3" "x1" "x2"

For all.names(expr2) and all.names(expr3) I would expect something like:

[1] "x3" "<-" "*"  "x1" "-"  "*"  "x2"
[1] "x3" "~"  "*"  "x1" "-"  "*"  "x2"

By which kind of rule is the sorting order of the returned character
vector produced? Any help or pointers is much appreciated.

Best,
Bernhard

> sessionInfo()
R version 2.5.0 Under development (unstable) (2006-10-10 r39600)
i386-pc-mingw32

locale:
LC_COLLATE=German_Germany.1252;LC_CTYPE=German_Germany.1252;LC_MONETARY=German_Germany.1252;LC_NUMERIC=C;LC_TIME=German_Germany.1252

attached base packages:
[1] "methods"   "stats"     "graphics"  "grDevices" "datasets"  "utils"
[7] "base"

other attached packages:
    nlme fortunes
"3.1-77"  "1.3-2"
[Rd] What to do with an inconsistency in rank() that's in S+ and R ever since?
Dear R-developers,

I just realized that rank() behaves inconsistently when combining one of
na.last in {TRUE|FALSE} with a ties.method in
{"average"|"random"|"max"|"min"}. The documentation suggests that, e.g.
with na.last=TRUE, NAs are treated like the last (=highest) value, which
obviously is not the case:

> rank(c(1,2,2,NA,NA), na.last = TRUE,
+      ties.method = c("average", "first", "random", "max", "min")[1])
[1] 1.0 2.5 2.5 4.0 5.0

I'd expect

[1] 1.0 2.5 2.5 4.5 4.5

rather, but in fact NAs seem to be always treated as ties.method =
"first". I have no idea in which situation one could desire e.g.
ties.method = "average" except for NAs!?

I am aware that the prototype behaves like this and R ever since behaves
like this, however to me this appears very unfortunate. In order not to
'break' existing code, what about adding ties.methods
{"NAaverage"|"NArandom"|"NAmax"|"NAmin"} that behave consistently?

Best regards
Jens Oehlschlägel

P.S. Please cc. me, I am not on the list

> version
               _
platform       i386-pc-mingw32
arch           i386
os             mingw32
system         i386, mingw32
status
major          2
minor          4.0
year           2006
month          10
day            03
svn rev        39566
language       R
version.string R version 2.4.0 (2006-10-03)
Re: [Rd] as.missing
Gabor Grothendieck wrote:
> This is what I get:
>
>> as.missing <- force
>> f <- function(y, x=1) {cat(missing(x)) ; x}
>> g <- function(x=as.missing()) f(3,x)
>> g()
> FALSEError in as.missing() : argument "x" is missing, with no default
>
>> traceback()
> 3: as.missing()
> 2: f(3, x)
> 1: g()
>
> so g did in fact pass the missing to f and it was only f that blew up,
> not g. If that's not what you want please explain.

I _think_ what he wants is:

> g()
TRUE[1] 1

I.e., when x is missing in g, and g calls f(3,x), f will use its
default value for x.

--
Bjørn-Helge Mevik