Hello,
I am running a preliminary data-processing job on Solaris. Basically, the program reads a large number of ASCII lines (about 1M) in blocks of 25,000 lines, selects the suitable observations (around 5,000), and writes the results into a data frame for later processing.
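For concreteness, the reading loop is essentially the following (a minimal sketch, not my actual code; the file name "data.txt" and the filter function is.suitable() are placeholders):

con <- file("data.txt", open = "r")               # placeholder file name
result <- NULL
repeat {
    block <- readLines(con, n = 25000)            # read one block of lines
    if (length(block) == 0) break                 # end of file reached
    keep <- block[is.suitable(block)]             # hypothetical filter selecting the suitable lines
    if (length(keep) > 0) {
        tc <- textConnection(keep)
        result <- rbind(result, read.table(tc))   # append parsed block to the data frame
        close(tc)
    }
}
close(con)

(Growing the data frame with rbind() in a loop is slow, but slowness alone should not get the process killed.)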
The problem is that when I run the job interactively (in Emacs/ESS), everything works fine. If I try to run it from the shell using R CMD BATCH, R is killed after around 100,000 lines: Solaris simply says "Killed" and that's it. Total memory consumption does not seem to be the problem, around 200M on a 4G machine.
200M of data processed, or 200M of RAM consumed by R? Does the .Rout file tell you anything interesting?
200M of RAM is nothing, so it is only an issue if there are quotas defined on your system.
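To rule that out, you can check the limits your R process actually inherits, and R's own memory use, from within the script. A sketch (ulimit is a shell builtin, so it is run through a child shell, which inherits the limits of the R process; this assumes ksh is on the PATH, as it normally is on Solaris):

system("ksh -c 'ulimit -a'")   # list all resource limits inherited by the child shell
gc()                           # report R's current memory use (Ncells/Vcells, Mb)

If the interactive shell and the shell from which you start R CMD BATCH show different limits, that would explain why only the batch run is killed.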
Uwe Ligges
BTW: the recent release is R-1.8.1.
Any suggestions? Has anyone else seen something similar?
Best wishes
Ott

This is R 1.7.1, compiled as a 32-bit application, on 64-bit SPARC Solaris 9 (SunOS 5.9).
______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html