In article <[EMAIL PROTECTED]>, Wesley Henwood <[EMAIL PROTECTED]> wrote:
>To capture output from python scripts run from a C++ app I've added the
>following code at the beginning of the C++ app:
>
>PyRun_SimpleString("import grabber");
>PyRun_SimpleString("import sys");
>PyRun_SimpleString("class a:\n\tdef write(self,s):\n\t\tgrabber.grab(s)\n");
>PyRun_SimpleString("import sys\nsys.stderr=a()\nsys.stdout=a()");
>
>It's hard to read that way; here's what it expands to:
>import grabber
>import sys
>class a:
>    def write(self, s):
>        grabber.grab(s)
>
>sys.stderr = a()
>sys.stdout = a()
>
>grabber is a C++ extension; the grab function displays the
>captured text in a Windows app. After running about 450+ scripts in a
>row, I get "IOError: Errno 24 Too many open files."
>
>I've searched this group and the net and determined that stderr and
>stdout may open files, is that correct? If so, would each running of a
>script be opening new files related to stderr and stdout and not
>closing them? I'm just guessing.
I'm guessing, but it sounds like perhaps you're creating an object which
holds an open file handle for output for each script that's run. When a
script finishes, if that object still exists, it keeps its file handle
open, and eventually you'll hit the per-process limit on open file
handles. It's also possible that your C++ app is the one failing to
close file handles created for running the scripts - there's no easy way
to tell from the information posted.

You need to examine carefully what happens to any stdin/stdout/stderr
files which are created to execute scripts and ensure that they are all
properly closed (or, in the case of Python, if you don't explicitly
close them, that all references to the files cease to exist after the
script runs, so they can be garbage-collected). I'd personally recommend
explicit closing here.

-- 
Jim Segrave ([EMAIL PROTECTED])
-- 
http://mail.python.org/mailman/listinfo/python-list
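P.S. A minimal sketch of the save/restore pattern I mean, in pure
Python (OutputGrabber and run_script are hypothetical names standing in
for the C++ grabber extension and the PyRun_SimpleString calls; the real
setup lives in the embedding app):

```python
import sys

class OutputGrabber:
    """Stand-in for the C++ grabber module: collects written text."""
    def __init__(self):
        self.captured = []

    def write(self, s):
        # The real code calls grabber.grab(s) here.
        self.captured.append(s)

def run_script(source, grabber):
    """Redirect stdout/stderr to the grabber for one script, then restore.

    Restoring the saved handles afterwards drops the redirection, so
    nothing created for this run lingers into the next one.
    """
    saved_out, saved_err = sys.stdout, sys.stderr
    sys.stdout = sys.stderr = grabber
    try:
        exec(source)
    finally:
        sys.stdout, sys.stderr = saved_out, saved_err

g = OutputGrabber()
run_script("print('hello')", g)
# g.captured now holds the script's output
```

The key point is the try/finally: the original stdout/stderr are put
back even if the script raises, so each run starts from a clean state
instead of stacking redirection objects.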