Please post on the r-sig-mixed-models list, where you are more likely to
find the requisite expertise.
However, FWIW, I think the reviewer's request is complete nonsense (naïve
cross-validation requires iid sampling). But the mixed-models experts are
the authorities on such judgments (and may tell
Hello,
So, I have this (simplified for better understanding) binomial mixed-effects
model [library(lme4)]:

Mymodel <- glmer(cross.01 ~ stream.01 + width.m + grass.per + (1 | structure.id),
                 data = Mydata, family = binomial)
stream is a factor with 2 levels; width.m is continuous; grass.per is a
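(For what it's worth, if the reviewer does insist on some form of cross-validation,
one option that respects the grouping structure is to leave out whole levels of
structure.id rather than individual rows. Below is a minimal sketch of
leave-one-group-out cross-validation, assuming the data frame Mydata and the
column names in the model above, and that cross.01 is coded 0/1; it is only an
illustration, not a recommendation:)

library(lme4)

## leave-one-group-out CV: drop one structure.id at a time, refit,
## and predict the held-out group from the fixed effects only
loco.cv <- function(data) {
  ids  <- unique(data$structure.id)
  pred <- numeric(nrow(data))
  for (id in ids) {
    fit <- glmer(cross.01 ~ stream.01 + width.m + grass.per + (1 | structure.id),
                 data = data[data$structure.id != id, ], family = binomial)
    ## re.form = NA gives population-level predictions, so the held-out
    ## group's (unknown) random effect is not needed
    pred[data$structure.id == id] <- predict(fit,
                                             newdata = data[data$structure.id == id, ],
                                             type = "response", re.form = NA)
  }
  mean(abs(data$cross.01 - pred))   # mean absolute prediction error
}

## err <- loco.cv(Mydata)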
See ?cv.glm under the heading "Value". The help page tells you what comes
out.
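For what it's worth, here is a minimal sketch of what cv.glm() reports for a
logistic regression; the data and variable names (d, x, y) are made up, not the
poster's. delta[1] is the raw cross-validation estimate of prediction error
under the chosen cost function, and delta[2] is the bias-adjusted estimate:

library(boot)

## hypothetical binary-response data
set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- rbinom(200, 1, plogis(0.5 + d$x))

fit <- glm(y ~ x, data = d, family = binomial)

## cost: mean absolute difference between observed 0/1 and fitted probability
cost <- function(y, phat) mean(abs(y - phat))

cv <- cv.glm(d, fit, cost)   # K defaults to n, i.e. leave-one-out
cv$delta                     # [1] raw CV estimate of prediction error,
                             # [2] bias-adjusted estimate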
On Fri, May 28, 2010 at 10:19 PM, azam jaafari wrote:
> Hi
>
>
> Finally, I did leave-one-out cross-validation in R for the prediction error of
> a logistic regression with cv.glm. But I don't know what the produced values
> (almost 700 of them) are. Does delta give me the error estimate?
Hi
Finally, I did leave-one-out cross-validation in R for the prediction error of
a logistic regression with cv.glm. But I don't know what the produced values
(almost 700 of them) are. Does delta give me the error estimate?
cost <- function(a, b) mean(abs(a - b))
# SALIC = binary response
salic.lr <- glm(profilesample$SALI
Hi
I did leave-one-out cross-validation in R for the prediction error of a
logistic regression with cv.glm. But I don't know what the produced values are.
Does delta give me the error estimate?
Please help me.
Thanks a lot
--- On Wed, 5/26/10, Joris Meys wrote:
From: Joris Meys
Subject: Re: [R
You don't seem to be making any corrections or updating your code.
There remains a syntax error in the last line of cvhfunc because of
mismatched parens.
On Jun 20, 2009, at 1:04 PM, muddz wrote:
Hi David,
Thanks and I apologize for the lack of clarity.
#n is defined as the length of xdat
Hi David,
Thanks and I apologize for the lack of clarity.
##n is defined as the length of xdat
n <- length(xdat)
#I defined 'k' as the Gaussian kernel function
k <- function(v) {1/sqrt(2*pi) * exp(-v^2/2)}  # Gaussian kernel
#I believe ypred, in my case, was the leave-one-out estimator (I think it's
the
On Jun 19, 2009, at 7:45 PM, muddz wrote:
Hi Uwe,
My apologies.
Please can I be guided on what I am doing wrong in the code? I started my
code as follows:

# ypred is my leave-one-out estimator of x

Estimator of x? Really?

cvhfunc <- function(y, x, h) {
  ypred <- 0
  for (i in 1:n) {
Hi Uwe,
My apologies.
Please can I be guided on what I am doing wrong in the code? I started my
code as follows:

#ypred is my leave-one-out estimator of x
cvhfunc <- function(y, x, h) {
  ypred <- 0
  for (i in 1:n) {
    for (j in 1:n) {
      if (j != i) {
        ypred <- ypred + (y[i] * k((x[j] - x[i])/h)) / k((x[j] - x[i])/h)
      }
    }
  }
  ypred
}
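As written, each term (y[i]*k(...))/k(...) reduces to y[i], so the bandwidth h
drops out entirely. My reading is that what is intended is the usual
leave-one-out criterion for the Nadaraya-Watson (local constant) estimator,
where the kernel-weighted numerator and denominator are summed separately over
j != i. A sketch of that, reusing the Gaussian kernel k() from the earlier
snippet (this is only my interpretation of the intent):

## LOO cross-validation criterion for the Nadaraya-Watson estimator:
## CV(h) = (1/n) * sum_i ( y[i] - yhat_{-i}(x[i]) )^2
cvhfunc <- function(y, x, h) {
  n  <- length(x)
  cv <- 0
  for (i in 1:n) {
    w       <- k((x[i] - x[-i]) / h)      # kernel weights from all other points
    ypred.i <- sum(w * y[-i]) / sum(w)    # leave-one-out fit at x[i]
    cv      <- cv + (y[i] - ypred.i)^2
  }
  cv / n
}

## choose h by minimising the criterion over a grid (hypothetical data)
## hgrid <- seq(0.05, 1, by = 0.05)
## hgrid[which.min(sapply(hgrid, function(h) cvhfunc(ydat, xdat, h)))]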
See the posting guide:
If you provide commented, minimal, self-contained, reproducible code,
some people may be willing to help on the list.
Best,
Uwe Ligges
muddz wrote:
Hi All,
I have been trying to get this LOO cross-validation method to work in R for
the past 3 weeks but have had no luck
Hi All,
I have been trying to get this LOO cross-validation method to work in R for
the past 3 weeks but have had no luck. I am hoping someone will kindly help
me.
Essentially I want to code the LOO cross-validation for the 'Local Constant'
and 'Local Linear' estimators. I want to fin
Alex Roy wrote:
Dear Frank,
Thanks for your comments. But in my situation I do
not have any future data, and I want to calculate the mean squared error for
prediction on future data. So, is it not a good idea to go for LOO?
thanks
Alex
With resampling you should be able t
Dear Frank,
Thanks for your comments. But in my situation I do not
have any future data, and I want to calculate the mean squared error for
prediction on future data. So, is it not a good idea to go for LOO?
thanks
Alex
On Tue, Feb 24, 2009 at 7:15 PM, Frank E Harrell Jr <
f.har
Hi Alex,
Have a look at:
http://search.r-project.org/cgi-bin/namazu.cgi?query=leave+one+out&max=20&result=normal&sort=score&idxname=Rhelp02a&idxname=functions&idxname=docs
Cheers
miltinho astronauta
brazil
On Tue, Feb 24, 2009 at 3:07 PM, Alex Roy wrote:
> Dear R user,
>
Alex Roy wrote:
Dear R user,
I am working with LOO. Could anyone who is working
with leave-one-out cross-validation (LOO) send me the code?
Thanks in advance
Alex
I don't think that LOO adequately penalizes for model uncertainty. I
recommend the bootstrap or 50
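(For what it's worth, a minimal base-R sketch of repeated 10-fold
cross-validation on made-up data with a plain linear model, just to illustrate
the resampling pattern; it is not meant as a substitute for bootstrap model
validation:)

## repeated 10-fold cross-validation for a linear model (hypothetical data)
set.seed(1)
d <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
d$y <- 1 + 2 * d$x1 - d$x2 + rnorm(100)

repeated.cv <- function(data, reps = 50, K = 10) {
  mse <- numeric(reps)
  for (r in 1:reps) {
    fold <- sample(rep(1:K, length.out = nrow(data)))  # random fold labels
    pred <- numeric(nrow(data))
    for (k in 1:K) {
      fit <- lm(y ~ x1 + x2, data = data[fold != k, ])
      pred[fold == k] <- predict(fit, newdata = data[fold == k, ])
    }
    mse[r] <- mean((data$y - pred)^2)
  }
  mean(mse)   # average out-of-sample mean squared error over the repeats
}

repeated.cv(d)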
Dear R user,
I am working with LOO. Could anyone who is working
with leave-one-out cross-validation (LOO) send me the code?
Thanks in advance
Alex
Please look at ?lm.influence -- this does all the work for you.
On Thu, 10 Jan 2008, Anu Swatantran wrote:
> Hi
>
> I am trying to validate my regression results using the leave-one-out
> cross-validation method. Is any script available in R to use this method for a
> linear regression equation?
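For a linear model, the leave-one-out residuals come directly from the hat
values that ?lm.influence (or hatvalues()) returns, so no refitting loop is
needed. A minimal sketch on made-up data:

## LOO for lm via the hat matrix: the i-th leave-one-out residual
## equals residuals(fit)[i] / (1 - h[i]), where h[i] is the leverage.
set.seed(1)
d <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
d$y <- 1 + 2 * d$x1 - d$x2 + rnorm(50)

fit <- lm(y ~ x1 + x2, data = d)
h   <- lm.influence(fit)$hat        # leverages (same as hatvalues(fit))
loo <- residuals(fit) / (1 - h)     # leave-one-out (PRESS) residuals
mean(loo^2)                         # LOO estimate of the prediction MSE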
Hi
I am trying to validate my regression results using the leave-one-out
cross-validation method. Is any script available in R to use this method for a
linear regression equation?
Either R or S-PLUS would do. Any clues on how to write the script would also
help.
Thanks a lot,
A