Hi Ott,

could it be a per-process "limit" or "ulimit" setting?

I don't have a Solaris box here, so I can't remember what the appropriate commands are...
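For reference, a rough sketch of what to check (ulimit is a standard sh/ksh builtin, so this should work on Solaris 9 too, though the exact flags supported may vary; run it in the same shell that launches R CMD BATCH, since emacs/ESS may inherit different limits):

```shell
#!/bin/sh
# Show the current soft resource limits for this shell and its children.
# Look especially at data segment size, virtual memory, and CPU time --
# a "Killed" process often means one of these was exceeded.
ulimit -a

# To raise a limit for the session before starting the batch job, e.g.:
#   ulimit -d unlimited    # data segment size (sh/ksh/bash)
# In csh/tcsh the equivalent commands are "limit" and "unlimit".
```

If the batch job dies only under one shell, comparing `ulimit -a` output between the interactive and batch environments should narrow it down quickly.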

cheers,
Sean


Ott Toomet wrote:
Hello,

I am running a preliminary data-processing job on Solaris. Basically,
the program reads a large number of ASCII lines (about 1M) in blocks
of 25,000 lines, selects the suitable observations (around 5,000), and
writes the results into a data frame for later processing.

The problem is that when I run the job interactively (in emacs/ESS),
everything works fine. If I try to run it from the shell using R CMD
BATCH, R is killed after around 100,000 lines. Solaris says
simply "Killed" and that's it. Total memory consumption does not seem
to be the problem: around 200M on a 4G machine.


Any suggestions? Has anyone else seen something similar?

Best wishes

Ott
---
This is R 1.7.1, compiled as a 32-bit application, on 64-bit SPARC
Solaris 9 (SunOS 5.9).

______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


