Hi
I have a matrix with 30 observations and roughly 30,000 variables, where each observation belongs to one of two groups. With svm and slda I run into memory trouble ('cannot allocate vector of size', roughly 2 GB). PCA followed by LDA runs fine. Is there any way around the memory issue with SVMs? Or can you recommend another classification method for such a wide dataset?
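
In case it helps, roughly what I'm running (a minimal sketch; I'm assuming e1071's svm and MASS's lda here, and the random matrix just stands in for my real data):

library(e1071)
library(MASS)

x <- matrix(rnorm(30 * 30000), nrow = 30)  # 30 observations, ~30000 variables
y <- factor(rep(c("A", "B"), each = 15))   # two groups

## this is where I get 'cannot allocate vector of size ...':
fit.svm <- svm(x, y, kernel = "linear")

## whereas LDA on a few principal components runs fine:
pc <- prcomp(x)
fit.lda <- lda(pc$x[, 1:5], y)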

P.S. I run SuSE 9.1 on a Pentium IV machine with 2 GB of RAM. Thanks for any hints.

Christoph
