Re: [R] comparing classification methods: 10-fold CV or leave-one-out?

2004-01-06 Thread Tony Plate
I would recommend reading the following: Dietterich, T. G. (1998). Approximate Statistical Tests for Comparing Supervised Classification Learning Algorithms. Neural Computation, 10(7), 1895-1924. http://web.engr.oregonstate.edu/~tgd/publications/index.html The issues in comparing methods are ...
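For concreteness, Dietterich's paper discusses, among other procedures, McNemar's test for comparing two classifiers on a single held-out test set. A minimal sketch along those lines, with iris and an arbitrary split standing in for the real data, and MASS::lda vs. rpart chosen purely as placeholders:

library(MASS)
library(rpart)

## illustrative split only -- the actual data set is not shown in the thread
set.seed(1)
idx   <- sample(nrow(iris), 100)
train <- iris[idx, ]
test  <- iris[-idx, ]

pred.lda   <- predict(lda(Species ~ ., data = train), test)$class
pred.rpart <- predict(rpart(Species ~ ., data = train), test, type = "class")

## McNemar's test on the pattern of correct/incorrect predictions;
## the off-diagonal cells count cases where exactly one method errs
correct.lda   <- factor(pred.lda   == test$Species, levels = c(FALSE, TRUE))
correct.rpart <- factor(pred.rpart == test$Species, levels = c(FALSE, TRUE))
mcnemar.test(table(correct.lda, correct.rpart))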

Re: [R] comparing classification methods: 10-fold CV or leave-one-out?

2004-01-06 Thread Prof Brian Ripley
Leave-one-out is very inaccurate for some methods, notably trees, but fine for some others (e.g. LDA) if used with a good measure of accuracy. Hint: there is a very large literature on this, so read any good book on classification to find out what is known.
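For LDA in particular, the leave-one-out error is cheap to obtain: MASS::lda computes it directly when called with CV = TRUE. A minimal sketch on iris (standing in for your own data):

library(MASS)
## with CV = TRUE, lda() returns the leave-one-out predicted classes
## ($class) and posterior probabilities instead of a fitted model object
fit.loo <- lda(Species ~ ., data = iris, CV = TRUE)
mean(fit.loo$class == iris$Species)                    # leave-one-out accuracy
table(predicted = fit.loo$class, true = iris$Species)  # leave-one-out confusion matrix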

[R] comparing classification methods: 10-fold CV or leave-one-out?

2004-01-06 Thread Christoph Lehmann
Hi, what would you recommend for comparing classification methods such as LDA, classification trees (rpart), bagging, SVM, etc.: 10-fold CV (as in Ripley, p. 346f) or leave-one-out (as e.g. implemented in lda)? My data set is not that huge (roughly 200 entries). Many thanks for a hint. Christoph
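For concreteness, a bare-bones 10-fold CV loop comparing lda, rpart and an SVM might look like the sketch below; iris stands in for the ~200-row data set, and e1071 is assumed for svm():

library(MASS); library(rpart); library(e1071)

set.seed(42)
folds <- sample(rep(1:10, length.out = nrow(iris)))  # random fold assignment

acc <- sapply(1:10, function(k) {
  train <- iris[folds != k, ]
  test  <- iris[folds == k, ]
  c(lda   = mean(predict(lda(Species ~ ., data = train), test)$class == test$Species),
    rpart = mean(predict(rpart(Species ~ ., data = train), test, type = "class") == test$Species),
    svm   = mean(predict(svm(Species ~ ., data = train), test) == test$Species))
})
rowMeans(acc)  # average held-out accuracy per method across the 10 folds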