RE: [R] Label using equivalent of \mathbb{R}

2004-08-26 Thread Trenkler, Dietrich


> -Original Message-
> From: Simon Cullen 
> Sent: Thursday, August 26, 2004 1:26 PM
> To:   [EMAIL PROTECTED]
> Subject:  [R] Label using equivalent of \mathbb{R}
> 
> Hi,
> 
> I'm trying to label the horizontal axis of a plot with a symbol that is  
> the equivalent of \mathbb{R} in LaTeX. I've had a look through the help  
> pages for plotmath and for Hershey and haven't found the symbol. Could  
> someone give me a pointer, please?
> 
[Dietrich Trenkler]  

> plot(rnorm(10),xlab=expression(bold(x)),ylab=expression(bold(y)))
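
A possible alternative, sketched here and not taken from the reply above: in
newer R versions (and with a font that has the glyph), the Unicode
double-struck R can be used directly, or plotmath can give a plain bold R.

plot(rnorm(10), xlab = "\u211D")              # U+211D, DOUBLE-STRUCK CAPITAL R
plot(rnorm(10), xlab = expression(bold(R)))   # plotmath fallback: a bold "R"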

__
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] Two factor ANOVA with lm()

2004-08-23 Thread Trenkler, Dietrich


> -Original Message-
> From: Prof Brian Ripley 
> Sent: Monday, August 23, 2004 1:15 PM
> To:   Trenkler, Dietrich
> Subject:  Re: [R] Two factor ANOVA with lm()
> 
> On Mon, 23 Aug 2004, Trenkler, Dietrich wrote:
> 
> [...]
> 
> > outset?  Or put another way:  Why is it that lm() uses the corner-point
> > constraints by default?  Where can I find documentation for this
> > behavior?
> 
> In almost any piece of documentation on linear models in R, including the
> FAQ and `An Introduction to R', which says
> 
>   The main reason for mentioning this is that R and S have different
>   defaults for unordered factors, S using Helmert contrasts.  So if you 
>   need to compare your results to those of a textbook or paper which used
>   S-PLUS, you will need to set
> 
> options(contrasts = c("contr.helmert", "contr.poly"))
> 
>   This is a deliberate difference, as treatment contrasts (R's default)
>   are thought easier for newcomers to interpret.
> 
> Now, what does the posting guide say about doing your homework?
 
[Dietrich Trenkler]  I swear I didn't need it for homework.
I just overlooked the self-evident... (blush)

Thank you.

D. Trenkler
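
For reference, a minimal sketch of the contrast switch quoted above, using
the jjd data frame from the original post (resetting the option afterwards
is just good practice):

old <- options(contrasts = c("contr.helmert", "contr.poly"))
coef(lm(Observations ~ LevelA * LevelB, data = jjd))   # Helmert coding, as in S-PLUS
options(old)                                           # back to R's default (treatment)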

__
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Two factor ANOVA with lm()

2004-08-23 Thread Trenkler, Dietrich
The following is a data frame


> "jjd" <- structure(list(Observations = c(6.8, 6.6, 5.3, 6.1,
  7.5, 7.4, 7.2, 6.5, 7.8, 9.1, 8.8, 9.1), LevelA = structure(c(1,
  1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3), .Label = c("A1", "A2",
  "A3"), class = "factor"), LevelB = structure(c(1, 1, 2, 2,
  1, 1, 2, 2, 1, 1, 2, 2), .Label = c("B1", "B2"), class = "factor")),
  .Names = c("Observations", "LevelA", "LevelB"), row.names = c("1",
  "2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12"),
  class = "data.frame")

representing data from


@BOOK{Dobson02,
  author = {Annette J. Dobson},
  year = 2002,
  title = {An Introduction to Generalized Linear Models},
  edition = {2.},
  publisher = {Chapman \& Hall/CRC},
  address = {Boca Raton, Florida, 33431}
}

page 101. To reproduce the estimates c(6.7, 0.75, 1.75, -1.0, 0.4, 1.5)
given on page 103 in a two-factor ANOVA, entering


> jja1 <- lm(Observations~LevelA*LevelB,data=jjd)
> summary(jja1)

I get


Call:
lm(formula = Observations ~ LevelA * LevelB, data = jjd)

Residuals:
       Min         1Q     Median         3Q        Max 
-6.500e-01 -2.000e-01 -3.469e-17  2.000e-01  6.500e-01 

Coefficients:
                  Estimate Std. Error t value Pr(>|t|)    
(Intercept)         6.7000     0.3512  19.078 1.34e-06 ***
LevelAA2            0.7500     0.4967   1.510   0.1818    
LevelAA3            1.7500     0.4967   3.524   0.0125 *  
LevelBB2           -1.0000     0.4967  -2.013   0.0907 .  
LevelAA2:LevelBB2   0.4000     0.7024   0.569   0.5897    
LevelAA3:LevelBB2   1.5000     0.7024   2.136   0.0766 .  
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Residual standard error: 0.4967 on 6 degrees of freedom
Multiple R-Squared: 0.9065, Adjusted R-squared: 0.8286
F-statistic: 11.64 on 5 and 6 DF,  p-value: 0.00481


This is fine. But why do I get these estimates?


Entering

> model.matrix(jja1)

delivers


   (Intercept) LevelAA2 LevelAA3 LevelBB2 LevelAA2:LevelBB2 LevelAA3:LevelBB2
1            1        0        0        0                 0                 0
2            1        0        0        0                 0                 0
3            1        0        0        1                 0                 0
4            1        0        0        1                 0                 0
5            1        1        0        0                 0                 0
6            1        1        0        0                 0                 0
7            1        1        0        1                 1                 0
8            1        1        0        1                 1                 0
9            1        0        1        0                 0                 0
10           1        0        1        0                 0                 0
11           1        0        1        1                 0                 1
12           1        0        1        1                 0                 1
attr(,"assign")
[1] 0 1 1 2 3 3
attr(,"contrasts")
attr(,"contrasts")$LevelA
[1] "contr.treatment"

attr(,"contrasts")$LevelB
[1] "contr.treatment"


which shows that, internally, lm() seems to use corner-point constraints of
the form

\[\alpha_1=\beta_1=(\alpha\beta)_{11}=(\alpha\beta)_{12}=
(\alpha\beta)_{21}=(\alpha\beta)_{31}=0\]


in the model $E[Y_{jkl}]=\mu+\alpha_j+\beta_k+(\alpha\beta)_{jk}$,
$j=1,2,3$, $k=1,2$, $l=1,2$ (Dobson, page 102).


My question is:  how can I incorporate restrictions like
$\alpha_1+\alpha_2+\alpha_3=0$, $\beta_1+\beta_2=0$,
$(\alpha\beta)_{21}+(\alpha\beta)_{22}=0$,
$(\alpha\beta)_{31}+(\alpha\beta)_{32}=0$ and
$(\alpha\beta)_{11}+(\alpha\beta)_{21}+(\alpha\beta)_{31}=0$ from the
outset?  Or put another way:  Why is it that lm() uses the corner-point
constraints by default?  Where can I find documentation for this
behavior?

I know that I can use something like lm(y~X) where y <- c(6.8, 6.6,
5.3, 6.1, 7.5, 7.4, 7.2, 6.5, 7.8, 9.1, 8.8, 9.1) and X is an
appropriate design matrix.  But I wonder if there is a more direct way.
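
For reference, a sketch of one more direct route: sum-to-zero contrasts can
be requested for a single fit via lm()'s contrasts argument (whether
contr.sum reproduces Dobson's parametrization exactly is worth checking).

jja2 <- lm(Observations ~ LevelA * LevelB, data = jjd,
           contrasts = list(LevelA = "contr.sum", LevelB = "contr.sum"))
coef(jja2)   # coefficients under sum-to-zero constraints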


Many thanks in advance.

D. Trenkler 


--
Dietrich Trenkler   Universität Osnabrück  
FB Wirtschaftswissenschaften   
Rolandstr.8  D-49069 Osnabrück

[EMAIL PROTECTED]

__
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] Erlang distribution

2004-08-19 Thread Trenkler, Dietrich


> -Original Message-
> From: Nils Aschenbruck 
> Sent: Thursday, August 19, 2004 11:51 AM
> To:   [EMAIL PROTECTED]
> Subject:  [R] Erlang distribution
> 
> 
> Hello,
> 
> is there a package that supports the Erlang distribution?
> 
> I want to use this distribution for tests against empirical data. Thus, is
> there a package that also supports "fitdistr" (maximum-likelihood fitting)
> for this distribution?
> 
[Dietrich Trenkler]  Hi Nils,

you do not need a special package, because the Erlang distribution is a
gamma distribution with integer shape parameter. For instance

derlang <- function(x, k, l = 1) {
    ## Erlang(k, l) density: gamma with integer shape k and rate l
    dgamma(x, shape = k, rate = l)
}

gives the density of the Erlang(k, l) distribution (rate l, defaulting to 1).
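
For the fitdistr part of the question, a rough sketch (not a package
function): since the Erlang is a gamma with integer shape, one can fit a
gamma by maximum likelihood with MASS::fitdistr and round the shape to the
nearest integer.

library(MASS)
x   <- rgamma(200, shape = 3, rate = 2)   # made-up example data
fit <- fitdistr(x, "gamma")               # ML estimates of shape and rate
k   <- round(fit$estimate["shape"])       # integer (Erlang) shape
lam <- k / mean(x)                        # ML rate given the fixed integer shape
c(k = unname(k), lambda = unname(lam))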

HTH

D. Trenkler

__
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] Bug in qnorm or pnorm?

2004-08-06 Thread Trenkler, Dietrich


> -Original Message-
> From: Deepayan Sarkar 
> Sent: Friday, August 06, 2004 3:31 PM
> To:   [EMAIL PROTECTED]
> Subject:  Re: [R] Bug in qnorm or pnorm?
> 
> On Friday 06 August 2004 08:13, Trenkler, Dietrich wrote:
> 
> > Given that pnorm(8.30) delivers 1 shouldn't we get Inf
> > for  x<-8.30;x-qnorm(pnorm(x)) ?
> 
> Why? 
> 
> > pnorm(8.30)
> [1] 1
> > qnorm(pnorm(8.30)) ## same as qnorm(1)
> [1] Inf
> > 8.30 - qnorm(pnorm(8.30)) ## same as 8.30 - Inf
> [1] -Inf
> 
> This seems perfectly acceptable to me for all reasonable definitions of 
> Inf.
> 
> Deepayan
> 
> 
[Dietrich Trenkler]  Yes of course, you're right. I meant 
qnorm(pnorm(8.3))  should deliver Inf -- as it does.  
Thank you.

Dietrich.
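
A side note, sketched for completeness: the round trip breaks because
pnorm(8.3) is exactly 1 in double precision. Working on the upper tail keeps
the tiny probability representable.

pnorm(8.3)                                                  # 1 (rounded)
pnorm(8.3, lower.tail = FALSE)                              # roughly 5e-17
qnorm(pnorm(8.3, lower.tail = FALSE), lower.tail = FALSE)   # about 8.3 again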

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] Bug in qnorm or pnorm?

2004-08-06 Thread Trenkler, Dietrich


> -Original Message-
> From: Trenkler, Dietrich 
> Sent: Friday, August 06, 2004 3:13 PM
> To:   'r-help'
> Subject:  [R] Bug in qnorm or pnorm?
> 
> I found the following strange behavior  using qnorm() and pnorm():
> 
> > x<-8.21;x-qnorm(pnorm(x))
> [1] 0.0004638484
> > x<-8.22;x-qnorm(pnorm(x))
> [1] 0.01046385
> > x<-8.23;x-qnorm(pnorm(x))
> [1] 0.02046385
> > x<-8.24;x-qnorm(pnorm(x))
> [1] 0.03046385
> > x<-8.25;x-qnorm(pnorm(x))
> [1] 0.04046385
> > x<-8.26;x-qnorm(pnorm(x))
> [1] 0.05046385
> > x<-8.27;x-qnorm(pnorm(x))
> [1] 0.06046385
> > x<-8.28;x-qnorm(pnorm(x))
> [1] 0.07046385
> > x<-8.29;x-qnorm(pnorm(x))
> [1] 0.08046385
> > x<-8.30;x-qnorm(pnorm(x))
> [1] -Inf
> 
> 
> Given that pnorm(8.30) delivers 1 shouldn't we get Inf 
> for  x<-8.30;x-qnorm(pnorm(x)) ?
> 
> Thanks in advance.
> 
[Dietrich Trenkler]  Oops, forgot to mention:

> unlist(R.Version())
         platform              arch                os            system 
"i386-pc-mingw32"            "i386"         "mingw32"   "i386, mingw32" 
           status             major             minor              year 
               ""               "1"             "9.1"            "2004" 
            month               day          language 
             "06"              "21"               "R" 

D. Trenkler

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Bug in qnorm or pnorm?

2004-08-06 Thread Trenkler, Dietrich
I found the following strange behavior  using qnorm() and pnorm():

> x<-8.21;x-qnorm(pnorm(x))
[1] 0.0004638484
> x<-8.22;x-qnorm(pnorm(x))
[1] 0.01046385
> x<-8.23;x-qnorm(pnorm(x))
[1] 0.02046385
> x<-8.24;x-qnorm(pnorm(x))
[1] 0.03046385
> x<-8.25;x-qnorm(pnorm(x))
[1] 0.04046385
> x<-8.26;x-qnorm(pnorm(x))
[1] 0.05046385
> x<-8.27;x-qnorm(pnorm(x))
[1] 0.06046385
> x<-8.28;x-qnorm(pnorm(x))
[1] 0.07046385
> x<-8.29;x-qnorm(pnorm(x))
[1] 0.08046385
> x<-8.30;x-qnorm(pnorm(x))
[1] -Inf


Given that pnorm(8.30) delivers 1 shouldn't we get Inf 
for  x<-8.30;x-qnorm(pnorm(x)) ?

Thanks in advance.

Dietrich Trenkler

--
Dietrich Trenkler   Universität Osnabrück  
FB Wirtschaftswissenschaften   
Rolandstr.8  D-49069 Osnabrück

[EMAIL PROTECTED]

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] GHK simulator

2004-07-15 Thread Trenkler, Dietrich
Dear R-community,

not to reinvent the wheel, I wonder whether any of you has ever written a
function to compute the GHK smooth recursive simulator for estimating
multivariate normal probabilities. See, for instance, page 194 of

@BOOK{Greene97,
  author = {William H. Greene},
  year = 1997,
  title = {Econometric Analysis},
  edition = {3rd},
  publisher = {Prentice-Hall},
  address = {New Jersey 07458}
}
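
In case no ready-made function turns up, here is a rough sketch of the
textbook GHK recursion for P(a <= X <= b) with X ~ N(mu, Sigma); the
function name and interface are made up, and no variance-reduction tricks
are included.

ghk <- function(a, b, mu, Sigma, R = 1000) {
    m <- length(mu)
    L <- t(chol(Sigma))                 # lower-triangular Cholesky factor
    a <- a - mu
    b <- b - mu
    w <- numeric(R)
    for (r in 1:R) {
        eta <- numeric(m)
        p   <- numeric(m)
        for (j in 1:m) {
            s      <- if (j > 1) sum(L[j, 1:(j - 1)] * eta[1:(j - 1)]) else 0
            lo     <- pnorm((a[j] - s) / L[j, j])
            hi     <- pnorm((b[j] - s) / L[j, j])
            p[j]   <- hi - lo
            eta[j] <- qnorm(lo + runif(1) * p[j])   # truncated-normal draw
        }
        w[r] <- prod(p)                 # simulated probability for this draw
    }
    mean(w)                             # GHK estimate of P(a <= X <= b)
}

## quick sanity check against a known bivariate value (should be near 1/3):
set.seed(1)
S <- matrix(c(1, 0.5, 0.5, 1), 2, 2)
ghk(c(-Inf, -Inf), c(0, 0), c(0, 0), S, R = 5000)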


Thank you.   

Dietrich Trenkler

--
Dietrich Trenkler   Universität Osnabrück  
FB Wirtschaftswissenschaften   
Rolandstr.8  D-49069 Osnabrück

[EMAIL PROTECTED]

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] plotting a table together with graphs

2004-07-13 Thread Trenkler, Dietrich


> -Original Message-
> From: Federico Calboli 
> Sent: Tuesday, July 13, 2004 6:06 PM
> To:   r-help
> Subject:  [R] plotting a table together with graphs
> 
> Dear All,
> 
> I would like to ask how to add a table to a "matrix" of graphs.
> 
> I have three non linear regression graphs plotted together after:
> 
> par(mfrow=c(2,2))
> 
> which leaves an empty bottom right corner. I would like to use the space
> to add a table (at the moment that's problem number one, adding a "nice"
> table will come later). I know it is possible to print tables through
> LaTeX and the Design/Hmisc libraries, although I would not have a clue
> about printing it together with graphs, but I'd like something "quicker"
> if at all possible.
> 
[Dietrich Trenkler]  I understand you are using LaTeX. 
So have a look at the psfrag package.

HTH

D. Trenkler
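
If something quicker than the LaTeX route is wanted, here is a rough
base-graphics sketch (all names and numbers below are made up): take the
empty fourth panel with plot.new() and write a small monospaced text table
into it.

par(mfrow = c(2, 2))
for (i in 1:3) plot(rnorm(20), main = paste("fit", i))   # placeholder plots
models <- c("fit 1", "fit 2", "fit 3")
rss    <- c(1.2, 0.9, 1.5)                               # made-up values
plot.new()                                               # empty 4th panel
rows <- c(sprintf("%-6s %6s", "model", "RSS"),
          sprintf("%-6s %6.2f", models, rss))
text(0, seq(0.9, by = -0.1, length.out = length(rows)),
     rows, adj = 0, family = "mono")                     # family needs a newer R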

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] help on ks.test

2004-05-06 Thread Trenkler, Dietrich


> -Original Message-
> From: Peter Dalgaard 
> Sent: Thursday, May 06, 2004 4:32 PM
> To:   Janete Borges
> Cc:   [EMAIL PROTECTED]
> Subject:  Re: [R] help on ks.test
> 
> "Janete Borges" <[EMAIL PROTECTED]> writes:
> 
> > Dear All
> > 
> > I need to test the goodness-of-fit of a (Negative) Exponential
> > Distribution
> > to a dataset. The parameter of the distribution is unknown. What is the
> > appropriate test to do? I've tried the ks.test, although I think this
> > isn't the appropriate one, as I don't know the population parameter. 
> > Can anybody help me?
> >  
> > Thanks in advance,
> > Janete
> 
> The bias of the K-S test with estimated parameters is well known to be
> substantial, but I haven't heard about correction terms except (I
> think) for the normal distribution.
 
[Dietrich Trenkler]  There is a Lilliefors version of the KS test
for the exponential distribution. See, e.g.,

@ARTICLE{Lilliefors69a,
  author = {H. W. Lilliefors},
  year = 1969,
  title = {On the {K}olmogorov-{S}mirnov Test for the Exponential
   Distribution with Mean Unknown},
  journal = {Journal of the American Statistical Association},
  volume = 64,
  pages = {387--389},
  keywords = {Lilliefors Test for Exponentiality; Goodness-of-Fit;
 Kolmogorov's Test}
}

 or

@ARTICLE{Mason86,
  author = {Andrew L. Mason and C.B. Bell},
  year = 1986,
  title = {New {L}illiefors and {S}rinivasan Tables with
Applications},
  journal = {Communications in Statistics, Part B--Simulation and
Computation},
  volume = 15,
  pages = {451--477},
  comment = {BIB 2},
  keywords = {Lilliefors Test; Goodness-of-Fit; Simulation}
}  
 
HTH

Let me stress that the KS-test may not be very powerful.
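
A quick simulation version, sketched here rather than taken from a package
(under the null the statistic's distribution does not depend on the true
rate, so it can be calibrated from Exp(1) samples):

lillie.exp <- function(x, B = 999) {       # function name is made up
    stat <- function(z) ks.test(z, "pexp", rate = 1 / mean(z))$statistic
    d0   <- stat(x)
    dsim <- replicate(B, stat(rexp(length(x))))
    c(D = unname(d0), p.value = (1 + sum(dsim >= d0)) / (B + 1))
}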

 Dietrich

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] matrix exponential: M^0

2004-01-20 Thread Trenkler, Dietrich


> -Original Message-
> From: Federico Calboli 
> Sent: Tuesday, January 20, 2004 5:40 PM
> To:   r-help
> Subject:  [R] matrix exponential: M^0
> 
> I would like to ask why the zeroeth power of a matrix gives me a matrix
> of ones rather than the identity matrix:
> 
> > D<-rbind(c(0,0,0),c(0,0,0),c(0,0,0))
> > D<-as.matrix(D)
> > D
>      [,1] [,2] [,3]
> [1,]    0    0    0
> [2,]    0    0    0
> [3,]    0    0    0
> 
> > D^0
>      [,1] [,2] [,3]
> [1,]    1    1    1
> [2,]    1    1    1
> [3,]    1    1    1
> 
> I would have expected the identity matrix here.
> 
> I find the same result with every other square matrix I used.
[Dietrich Trenkler]  M^0 means applying ^0 to each element of M:
arithmetic operators act elementwise on matrices in R. Matrix multiplication
is done with A %*% B, so A %*% A is not the same as A^2.

Dietrich
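
A small two-by-two illustration of the elementwise-vs-matrix distinction
(values chosen arbitrarily):

A <- matrix(c(2, 1, 0, 2), 2, 2)
A^2          # each entry squared (elementwise)
A %*% A      # the matrix product of A with itself
diag(2)      # the 2 x 2 identity, i.e. the matrix analogue of A^0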

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


RE: [R] weibull test

2002-12-19 Thread Trenkler, Dietrich


> -Original Message-
> From: Meriema BELAIDOUNI 
> Sent: Wednesday, December 18, 2002 11:19 AM
> To:   [EMAIL PROTECTED]
> Subject:  [R] weibull test
> 
> Hello
> What is the appropriate method to test if a given distribution is a
> weibull
> thank you
> meriema
 

[Dietrich Trenkler]  The following articles may be of interest to you:

@ARTICLE{Chandra81,
  author = {M. Chandra and N.D. Singpurwalla and M.A. Stephens},
  year = 1981,
  title = {Kolmogorov Statistics for Tests of Fit for the
Extreme-Value and
  Weibull Distributions},
  journal = {Journal of the American Statistical Association},
  volume = 76,
  pages = {729--731},
  keywords = {Extreme-Value Distribution; Goodness of Fit;
Kolmogorov-Smirnov
 Tests; Kuiper Statistic; Weibull Distribution}
}


@ARTICLE{Lockhart94,
  author = {Richard A. Lockhart and Michael A. Stephens},
  year = 1994,
  title = {Estimation and Tests of Fit for the Three-Parameter
   Weibull Distribution},
  journal = {Journal of the Royal Statistical Society B},
  volume = 56,
  pages = {491--500},
  keywords = {Empirical Distribution Function; Empirical
Distribution Function
 Tests; Goodness of Fit; Reliability; Survival Analysis}
}

Hope this helps.
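
For a quick practical check (a sketch only, not a substitute for the tables
in the papers above), MASS::fitdistr can be combined with a parametric
bootstrap of the KS statistic, since plain ks.test p-values are not valid
when the parameters are estimated from the same data:

library(MASS)

ks.weibull <- function(x, B = 499) {       # function name is made up
    kstat <- function(z) {
        f <- fitdistr(z, "weibull")$estimate
        ks.test(z, "pweibull", shape = f["shape"], scale = f["scale"])$statistic
    }
    fhat <- fitdistr(x, "weibull")$estimate
    d0   <- kstat(x)
    dsim <- replicate(B, kstat(rweibull(length(x), fhat["shape"], fhat["scale"])))
    c(D = unname(d0), p.value = (1 + sum(dsim >= d0)) / (B + 1))
}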

D. Trenkler

--
Dietrich Trenkler   Universität Osnabrück  
FB Wirtschaftswissenschaften   
Rolandstr.8  D-49069 Osnabrück

[EMAIL PROTECTED]

__
[EMAIL PROTECTED] mailing list
http://www.stat.math.ethz.ch/mailman/listinfo/r-help