Hi Achim,
I'd like to try running the strucchange package in R. However, it
seems that the package can only run on systems running Debian and not
Windows XP. Is this true? Or is there a Windows version? I downloaded
the strucchange files, but they seem to have the .deb extension and
seem to be
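A minimal sketch of the usual route on Windows (assuming a CRAN mirror is reachable; .deb files are Debian system packages, not something R installs from):
install.packages("strucchange")   # fetches the Windows binary from CRAN
library(strucchange)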
With an older version of R (I think 2.2.0) and an older version of Rpad I
used to use HTML(go, collapse=false), where go is a list of objects returned by
a function, and this worked great. Now that I have done some upgrading (to R
2.3.1 and Rpad 1.1.0) it's not working right. I also get a warning when
Dear Listers:
I have a problem requiring time-series clustering, since the clusters
change over time (data that are too old need to be removed while new
data come in). I am wondering whether there is a paper or reference
on this topic, and whether there is some kind of implementation in R?
Uwe and Ben,
Thank you both for your help.
To me, both sets of code seem to do the job and should produce the
same results. However, as a test I inserted set.seed() as follows.
Unless I put set.seed() on the wrong lines, the results produced by
the two sets of code turn out to be different. I am
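A minimal sketch of the issue (hypothetical code): the seed must be set immediately before each run, and both runs have to consume random numbers in the same order, otherwise the results will differ.
set.seed(123)
a <- rnorm(5)     # first version of the code
set.seed(123)
b <- rnorm(5)     # second version, reseeded identically
identical(a, b)   # TRUE only if both versions draw in the same order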
Greg,
It might be better to use ylim = c(min(x), 0)
in place of ylim = c(-4, 0). I don't think plt can take
negative values.
Peter Ehlers
Greg Snow wrote:
> Here is one approach. It uses the function clipplot which is shown
> below (someday I will add this to my TeachingDemos package, the
> Te
On Thu, 2006-06-01 at 05:59 -0700, Ahamarshan jn wrote:
> hi list,
>
> This question must be very basic, but I am only 3 days
> old to R. I am trying to find a function to merge two
> tables of data in two different files into one.
>
> Does the merge function only fill in columns between two
> table
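A minimal merge() sketch with hypothetical file names and a hypothetical common key column "id"; all = TRUE keeps rows that appear in only one of the tables:
t1 <- read.table("file1.txt", header = TRUE)
t2 <- read.table("file2.txt", header = TRUE)
both <- merge(t1, t2, by = "id", all = TRUE)   # full outer join on "id"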
Rolf Turner <[EMAIL PROTECTED]> writes:
> Peter Dalgaard wrote:
>
> > Rolf Turner <[EMAIL PROTECTED]> writes:
> >
> > > summary(object)$cov.unscaled
> >
> > You need to multiply that by sigma^2. However, vcov(object) is easier.
>
> Well, I thought unscaled meant unscaled --- the plain
>
Hi,
I am attempting to build R on Alpha servers running Tru64 UNIX.
I am having problems building R 2.3.0 so am attempting to build older
releases.
I have been able to build R 1.9.1 without problem. However "make
check" fails for the following tests:
1. arith.Rout
The difference
Mike
The source of the problem is not obvious to me
from what you've reported,
and I could not reproduce it in a few tests with other data.
Would it be possible to send me the data set?
Cheers
P.J.
On Fri, 2 Jun 2006, Michael Fuller wrote:
Peter Dalgaard wrote:
> Rolf Turner <[EMAIL PROTECTED]> writes:
>
> > summary(object)$cov.unscaled
>
> You need to multiply that by sigma^2. However, vcov(object) is easier.
Well, I thought unscaled meant unscaled --- the plain
unvarnished covariance matrix! I figure that mult
or more simply and better,
vcov(lm.object)
?vcov
Note R's philosophy: use the available extractors to get the key features of the
objects, rather than indexing. This is safer, as it does not depend on the
particular structure/implementation, which can change. This is the
difference between "private"
On Fri, 2 Jun 2006 18:24:57 -0300 (ADT) Rolf Turner wrote:
> summary(object)$cov.unscaled
As the name suggests: this is the unscaled covariance matrix!
Use the vcov() extractor method, i.e.,
vcov(object)
which has methods not only for "lm" but also many other fitted model
objects.
Z
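For concreteness, a small sketch using a built-in data set (the fit is just an illustration), showing that the extractor and the hand-scaled matrix agree:
fit <- lm(dist ~ speed, data = cars)
vcov(fit)                       # variance-covariance matrix of the coefficients
s <- summary(fit)
s$sigma^2 * s$cov.unscaled      # the same matrix, assembled by hand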
Rolf Turner <[EMAIL PROTECTED]> writes:
> summary(object)$cov.unscaled
You need to multiply that by sigma^2. However, vcov(object) is easier.
--
O__ Peter Dalgaard Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
(*) \(*) -- Un
summary(object)$cov.unscaled
Hi,
I am running a simple linear model with (say) 5 independent variables. Is
there a simple way of getting the variance-covariance matrix of the
coefficient estimates? None of the components of the lm() fit seems to provide this.
Thanks in advance,
Ritwik Sinha
[EMAIL PROTECTED]
Grad Student
Case Weste
Ah, so I switch y with x, give it x, and ask it for y (which is actually x).
Clever.
I continue to be impressed with R.
Thanks, Larry
On Friday June 2 2006 11:23, Gabor Grothendieck wrote:
> Try this:
>
> approx(p[,"lwr"], 1:5, 3)
>
> On 6/2/06, Larry Howe <[EMAIL PROTECTED]> wrote:
> > Hello,
To follow up on Bert Gunter's remark about a "quicker and dirtier" way,
here is the R script I use to source every *.r file in a directory and
save the results into a workspace. It excludes any files beginning with
"00" (for example, itself) from the operations. This simplifies
maintaining a
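The script itself is not reproduced in this excerpt; a minimal sketch of the idea, with a hypothetical directory name, might look like this:
files <- list.files("scripts", pattern = "\\.r$", full.names = TRUE)
files <- files[!grepl("^00", basename(files))]   # skip files named 00*, e.g. the driver itself
for (f in files) source(f)
save.image(file = "scripts/workspace.RData")     # keep the resulting objects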
Here is one approach. It uses the function clipplot, which is shown
below (someday I will add this to my TeachingDemos package; for now the
TeachingDemos package is still required).
An example of usage:
> x <- rnorm(1:100)
> plot(x, type='l',col='red')
> clipplot( lines(x, col='blue'), ylim=c(-4,0) )
Hope thi
To get the ANCOVA table you need to look at the anova() function.
The variables T and L must be factors to get the multi-degree-of-freedom
anova tables you are looking for.
Order matters. You get the same residual, but the sequential sums of
squares differ.
bt.aov <- aov(E ~ B + T)
anova(bt.aov)
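A sketch of the point about factors and ordering, assuming E, B and T are columns of a data frame dat:
dat$T <- factor(dat$T)              # T (and L) must be factors
anova(aov(E ~ B + T, data = dat))   # sequential (Type I) sums of squares, B first
anova(aov(E ~ T + B, data = dat))   # same residual SS, different sequential terms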
Dear R-friends,
Is there someone working with Landscape Ecology Metrics in R?
I'm writing a short routine to compute a "Percolation" index for a map, and I
need to identify "Clumpy" in the image. My input data look like
000
0011000
0011100
000
0111
Dear R user:
I have a question about doing ANCOVA in S-PLUS or R.
I know that many users use lm to do the regression and check the ANCOVA. But
is there a way to get the traditional table form of the ANCOVA test through
S-PLUS (like what we would get from SPSS or SAS)?
The probl
Hi,
Is there a way of presetting my R environment so that, for example,
everything between -1e-15 and 1e-15 is treated as equal to 0?
Thanks in advance
Guilaume Blanchet
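There is no global option for this as far as I know, but a small sketch of two common workarounds:
x <- c(1.5, -3e-16, 2e-16, -0.2)
x[abs(x) < 1e-15] <- 0   # explicit thresholding at a chosen tolerance
zapsmall(x)              # rounds values that are tiny relative to the largest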
Hi, how can I plot a series of numbers as a line, but
with the lines above a threshold in one color and
the lines below the threshold in another color? For
example, take a numeric vector, rnorm(1:100), and plot
these numbers in the original order, but lines above
the horizontal line at 0 are in red, and li
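Besides the clipplot() approach shown elsewhere in this digest, a rough base-graphics sketch is to draw the series twice, clamped at the threshold (segments crossing 0 are only approximated):
x <- rnorm(100)
plot(x, type = "n")
lines(pmax(x, 0), col = "red")    # the part above 0
lines(pmin(x, 0), col = "blue")   # the part below 0
abline(h = 0, lty = 2)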
I'm using R for Mac OSX version 1.14 (2129) and the geoR package
version 1.6-5 (the current version in the R repository). I'm running
R in OS 10.4.6 on a Mac G4 iBook (933MHz, 640 MB DDR SDRAM). I
searched the R archive and did not find a posting on this issue.
I want to use the variog and
Dear members,
I'm getting an error with the "integrate" function. Searching in the r-help
archives, I think this may have something to do with the function (it is not
returning a vector but a number), but I don't see exactly what.
The function to integrate was defined with a for loop first:
Rainer M KRug krugs.de> writes:
>
> R 2.3.0
> Linux, SuSE 10.0
>
> Hi
>
> I have two problems with mle - probably I am using it the wrong way so
> please let me know.
>
> I want to fit different distributions to an observed count of seeds and
> in the next step use AIC or BIC to identify the
On 6/1/06, Maria Montez <[EMAIL PROTECTED]> wrote:
> Hi.
>
> My data contains information for 10 hospitals for 12 different measures.
> Let's call them x1-x12. I need to create a boxplot for each one of these
> measures and put them onto one page. Each plot also needs to be independent,
> i.e. cannot
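A minimal sketch, assuming the measures are columns x1-x12 of a data frame called hosp:
par(mfrow = c(3, 4))               # 12 panels on one page
for (v in paste0("x", 1:12))
  boxplot(hosp[[v]], main = v)     # each panel gets its own scale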
On Fri, 2 Jun 2006, Roger D. Peng wrote:
> Running on 64-bit per se does not make things faster. In fact, from my
> experience it sometimes makes things slower. The advantage with 64-bit
> is the extra address space for storing things in memory.
See the R-admin manual for some additional comm
Running on 64-bit per se does not make things faster. In fact, from my
experience it sometimes makes things slower. The advantage with 64-bit is the
extra address space for storing things in memory.
Of course, today's 64-bit chips are all faster than recent 32-bit chips, so you
will get a spe
Try
save(A, B, file = "myfun.r")
attach("myfun.r")
Your functions will be on the search list.
-roger
Matthias Braeunig wrote:
>
> Hi,
>
> how can I automatically access the functions that I loaded into a
> separate environment?
>
>> save(A,B,
On Fri, 2 Jun 2006, Matthias Braeunig wrote:
>
> Hi,
>
> how can I automatically access the functions that I loaded into a
> separate environment?
This is a topic that is probably best suited to R-devel: the explanations
are going to get technica
The benchmark report is good stuff - I've been wondering about these
speed issues recently myself.
Has anyone tried something similar on 64-bit Linux or another OS? I'm
contemplating switching to 64-bit Linux if it will give me some dramatic cycle
time improvements. Anyone have any experience with this?
Romain Lorrilliere wrote:
> Hi,
>
> I made an R function, and I want to make an executable applet from it. Do
> you know how this is possible?
>
More details about what you want would be helpful. Here
is what I do and it may be useful to you.
Under *nix, OS X, etc., use bash's "here document" feature.
Try this:
attach(as.list(ENV))
search() # shows it on the path
A function's environment determines where free variables in the function
are looked up if they cannot be found in the function itself.
The environment in which the function object itself is stored is entirely
different.
e.g.
a <- 1
e <-
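The example above is cut off; a small sketch of the distinction, with made-up names:
a <- 1
f <- function() a                  # 'a' is a free variable in f
environment(f)                     # where f looks up its free variables
e <- new.env()
assign("f", f, envir = e)          # f is now *stored* in e ...
environment(get("f", envir = e))   # ... but its environment is unchanged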
Larry Howe wrote:
> Hello,
>
> Is there a way for R to determine the point where a confidence interval
> equals
> a specified value? For example:
>
> x = seq(1:5)
> y = c(5, 5, 4, 4, 3)
> lm = lm(y ~ x)
> p = predict.lm(lm, interval="confidence")
> matplot(p, type="b")
> abline(h = 3)
You ca
Hi,
how can I automatically access the functions that I loaded into a
separate environment?
> save(A,B,file="myfun.r")
> load("myfun.r",envir=(ENV<-new.env()))
> ls(ENV)
[1] "A" "B"
?"[" turned up that I can access the functions via
> ENV$A
functio
Note that creating separate variables out of your data frame is
not really a good idea. However, to answer your question:
# this puts it on the path
iris.lc <- iris
names(iris.lc) <- tolower(names(iris.lc))
attach(iris.lc)
search() # shows where it's located
print(head(sepal.length))
detach() # ge
You (and your colleague) might want to have a look at
http://www.sciviews.org/benchmark/. It's a bit dated,
but still may be a good starting point.
Some months ago someone asked about working on getting
R to use the GPU for computation on the R-devel list.
Don't know if anything came of it.
Che
names(DataTABLE) <- tolower(names(DataTABLE))
?casefold
?names
Milton Cezar wrote:
> Dear All,
>
> I have read a table using
> DataTABLE <- read.table("mytable.txt", header=T)
>
> And get the following data structure
>Var1 VAR2 VaR3 Var4 ...
>
> How can I list
Try this:
approx(p[,"lwr"], 1:5, 3)
On 6/2/06, Larry Howe <[EMAIL PROTECTED]> wrote:
> Hello,
>
> Is there a way for R to determine the point where a confidence interval equals
> a specified value? For example:
>
> x = seq(1:5)
> y = c(5, 5, 4, 4, 3)
> lm = lm(y ~ x)
> p = predict.lm(lm, interval
I don't understand the question. What's wrong with showing 26.94350 as
being less than all the other leiji values?
Peter Ehlers
zhang jian wrote:
> I think there is a problem in R. I do not know the reason.
> This is my data for a cumulative percentage figure. The result is not right.
>
> The
Hello,
Is there a way for R to determine the point where a confidence interval equals
a specified value? For example:
x = seq(1:5)
y = c(5, 5, 4, 4, 3)
lm = lm(y ~ x)
p = predict.lm(lm, interval="confidence")
matplot(p, type="b")
abline(h = 3)
I want to answer the question: "What is the value o
tolower(names(DataTABLE))
Peter Ehlers
Milton Cezar wrote:
> Dear All,
>
> I have read a table using
> DataTABLE <- read.table("mytable.txt", header=T)
>
> And get the following data structure
>Var1 VAR2 VaR3 Var4 ...
>
> How can I list all column names (in l
R 2.3.0
Linux, SuSE 10.0
Hi
I have two problems with mle - probably I am using it the wrong way so
please let me know.
I want to fit different distributions to an observed count of seeds and
in the next step use AIC or BIC to identify the best distribution.
But when I run the script below (whic
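The script itself is cut off in this excerpt; a minimal sketch of the general pattern with stats4::mle (the data below are simulated placeholders, and the parameters are fitted on the log scale so they stay positive):
library(stats4)
seeds <- rpois(50, 4)   # placeholder for the observed counts
nll.pois <- function(loglambda) -sum(dpois(seeds, exp(loglambda), log = TRUE))
fit.pois <- mle(nll.pois, start = list(loglambda = log(mean(seeds))))
nll.nb <- function(logsize, logmu)
  -sum(dnbinom(seeds, size = exp(logsize), mu = exp(logmu), log = TRUE))
fit.nb <- mle(nll.nb, start = list(logsize = 0, logmu = log(mean(seeds))))
AIC(fit.pois)
AIC(fit.nb)             # BIC() works the same way; smaller is better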
Dear All,
I have read a table using
DataTABLE <- read.table("mytable.txt", header=T)
And get the following data structure
Var1 VAR2 VaR3 Var4 ...
How can I list all column names (in lowercase) and create variables from the table
columns? By hand I do
var1 <- D
dear R wizards:
while extolling the virtues of R, one of my young econometrics
colleagues told me that he still wants to run ox because [a] his code
is written in it (good reason); [b] because ox seems to be faster than
R in most benchmarks (huh?).
this got me to wonder. language speed can't mat
The source R files come with the package.
For instance, in this case, on a Linux machine you can locate the file with:
[EMAIL PROTECTED]:~$ locate lm.glm
/usr/lib/R/library/stats/demo/lm.glm.R
On other OSes this will be similar; look in the "stats" package.
best
P.J.
On Fri, 2 Jun 2006, Edward Dow
Try this:
file.show(system.file("demo/lm.glm.R", package = "stats"))
On 6/2/06, Edward Downie <[EMAIL PROTECTED]> wrote:
> I see the output of running, for instance, demo(lm.glm)
> just fine, but how do I view the set of commands and the
> data that served as the input?
>
> Thanks for the hel
I don't think this is exactly what you want but try:
par(mfrow=c(1,2))
vioplot(normal, names="Normal", horizontal=TRUE,
ylim=c(-12,12))
vioplot(uniform, names ="Uniform", horizontal=TRUE,
ylim=c(-12,12))
--- Karin Lagesen <[EMAIL PROTECTED]>
wrote:
> Michael Dondrup
> <[EMAIL PROTECTED]> write
I see the output of running, for instance, demo(lm.glm)
just fine, but how do I view the set of commands and the
data that served as the input?
Thanks for the help.
Edward Downie
Michael Conklin markettools.com> writes:
>
> I have searched through the help files and I have been unsuccessful in
> solving this problem.
>
> I am trying to create a small wrapper function that will go around a
> call to a plot function and create a Windows metafile in the directory
> and als
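A rough sketch of one way to do it (Windows only; the function and file names here are made up): draw to the screen device, then copy the current plot into a metafile.
plot_and_save <- function(plotfun, file) {
  plotfun()                            # draw on the current (screen) device
  dev.copy(win.metafile, file = file)  # copy the plot into a .wmf file
  dev.off()                            # close the metafile device
}
plot_and_save(function() plot(rnorm(10)), "myplot.wmf")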
Uwe Ligges statistik.uni-dortmund.de> writes:
>
> xpRt.wannabe wrote:
>
> > y <- replicate(10, replicate(8, sum(rnorm(rpois(1,5)))))
> >
> > x - max(0,x-15) + max(0,x-90), where x represents the individual Normal
> > numbers.
>
> y <- replicate(10, {
> rp <- rpois(8, 5)
> mys
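The rest of the reply is cut off; one way to apply the per-draw transformation before summing (a sketch, using the 15 and 90 thresholds from the question) is:
y <- replicate(10, sapply(rpois(8, 5), function(n) {
  x <- rnorm(n)
  sum(x - pmax(0, x - 15) + pmax(0, x - 90))
}))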
Marc Bernard yahoo.fr> writes:
> I am looking for a function to extract, from an nlme object, the estimated
variance-covariance matrix of the random effects.
VarCorr
D.Menne
Dear all,
I am looking for a function to extract, from an nlme object, the estimated
variance-covariance matrix of the random effects.
many thanks,
Bernard,
Hi Zhang Jian,
If I say plot(t$no,t$leiji), then the lower bound is "neatly" set at
about 25... (I'm not sure how I can measure the bounds on the current
plot - but I'm sure it can be found!)
You can set the bounds on the y-axis to be between 0 and 100 by saying
plot(fre$no, fre$leiji, ylim=c(0,100))
I think there is a problem in R. I do not know the reason.
This is my data for a cumulative percentage figure. The result is not right.
The first point (no=1, leiji=26.94350) in the plot was shown in a
lower location. Why?
Thanks !
> fre
no leiji
1 1 26.94350
2 2 46.86401
3
Yes! Now I can see it, it's so evident... Thank you so much,
m
Mihai Nica, ABD
Jackson State University
ITT Tech Instructor
170 East Griffith Street G5
Jackson, MS 39201
601 914 0361
The least of learning is done in the classrooms.
- Thomas Merton
Dear All,
I am using the pmst function from the sn package (version 0.4-0). After
inserting the example from the help page, I get non-trivial answers, so
everything is fine. However, when I try to extend it to higher dimension:
xi <- alpha <- x <- rep(0,27)
Omega <- diag(0,27)
p1 <- pmst(x, xi, Ome
Hi
On 2 Jun 2006 at 10:26, Uwe Ligges wrote:
xpRt.wannabe wrote:
> Dear List:
>
> I have the follow code:
>
> y <- replicate(10, replicate(8, sum(rnorm(rpois(1,5)))))
>
> Now I need to apply the following condition to _every_ randomly generated
> Normal number in the code above:
>
> x - max(0,x-15) + max(0,x-90), where x represents the ind
Tim Alcon wrote:
> I have a large array and would like to extract from it the row and
> column indices just of values for which a particular boolean condition
Do you mean a matrix, i.e. exactly 2 dimensions?
> is true. I assume there's a simple way to do this, but I haven't
> figured it out y
I have a large array and would like to extract from it the row and
column indices just of values for which a particular boolean condition
is true. I assume there's a simple way to do this, but I haven't
figured it out yet. Any help would be appreciated.
Tim
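For a matrix, a minimal sketch of the standard idiom:
m <- matrix(rnorm(20), nrow = 4)
which(m > 1, arr.ind = TRUE)   # row and column indices of the TRUE entries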