#34794 [Com]: proc_close() hangs when used with two processes
ID:               34794
Comment by:       jdolecek at NetBSD dot org
Reported By:      e-t172 at e-t172 dot net
Status:           Assigned
Bug Type:         Program Execution
Operating System: Linux
PHP Version:      5CVS-2005-10-09 (snap)
Assigned To:      wez

New Comment:

This is actually not a bug, or at most a documentation bug. proc_close()
blocks if the child has not terminated yet. Use proc_terminate() instead
of proc_close() if you cannot guarantee that the child has already
exited, or use proc_get_status() to check whether the child has exited
if you want to avoid blocking.

Previous Comments:
------------------------------------------------------------------------

[2006-04-29 02:42:16] Dallas at ekkySoftware dot com

I have a similar problem: it seems I can't concurrently call the same
page with proc_open() until the first proc_open() returns. It looks like
proc_open() runs through a critical section even though it is opening
separate processes. From experience, it is as if PHP uses the same pipes
for each proc_open() and cannot continue to the next proc_open() until
the original has ended. I would normally use temporary files instead of
pipes, but this makes life difficult.

Dallas
http://www.ekkySoftware.com/

------------------------------------------------------------------------

[2006-04-26 14:50:23] radu dot rendec at ines dot ro

Same problem with 5.1.3RC4-dev (latest CVS snapshot at the moment) on
Linux/i386. I independently reproduced the bug with the following piece
of code:

error_reporting(E_ALL);
$spec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w")
);
$p1 = proc_open("/bin/cat", $spec, $fd1);
$p2 = proc_open("/bin/cat", $spec, $fd2);
fclose($fd1[0]); fclose($fd1[1]); fclose($fd1[2]);
echo "closing p1... "; flush();
proc_close($p1);
echo "success\n"; flush();

This code hangs in proc_close(). This doesn't happen if the second
proc_open() is commented out. Although the parent process seems to
correctly close all pipes, the child process still remains blocked in
read(0,...).

------------------------------------------------------------------------

[2005-11-01 11:54:27] [EMAIL PROTECTED]

Assigned to the author of this stuff.

------------------------------------------------------------------------

[2005-10-09 20:12:09] e-t172 at e-t172 dot net

Same problem with the latest CVS snapshot.

------------------------------------------------------------------------

[2005-10-09 18:23:40] e-t172 at e-t172 dot net

Description:
(I am French, sorry for my bad English.)
1. Open two processes with proc_open().
2. Try to close them: it works only if you close the second one first;
otherwise it hangs.

Reproduce code:
---------------
<?php
echo('Opening process 1'."\n");
$process1 = proc_open('cat', array(0 => array('pipe', 'r'), 1 => array('pipe', 'r')), $pipes1);
echo('Opening process 2'."\n");
$process2 = proc_open('cat', array(0 => array('pipe', 'r'), 1 => array('pipe', 'r')), $pipes2);

// WORKS:
//echo('Closing process 2'."\n");
//fclose($pipes2[0]); fclose($pipes2[1]); proc_close($process2);
//echo('Closing process 1'."\n");
//fclose($pipes1[0]); fclose($pipes1[1]); proc_close($process1);

// DOESN'T WORK:
echo('Closing process 1'."\n");
fclose($pipes1[0]); fclose($pipes1[1]); proc_close($process1);
echo('Closing process 2'."\n");
fclose($pipes2[0]); fclose($pipes2[1]); proc_close($process2);
?>

Expected result:
----------------
$ php -f test.php
Opening process 1
Opening process 2
Closing process 1
Closing process 2
$

Actual result:
--------------
$ php -f test.php
Opening process 1
Opening process 2
Closing process 1
(HANGS)

-- 
Edit this bug report at http://bugs.php.net/?id=34794&edit=1
#34794 [Com]: proc_close() hangs when used with two processes
ID:               34794
Comment by:       jdolecek at NetBSD dot org
Reported By:      e-t172 at e-t172 dot net
Status:           Assigned
Bug Type:         Program Execution
Operating System: Linux
PHP Version:      5CVS-2005-10-09 (snap)
Assigned To:      wez

New Comment:

Actually yes, there is a severe pipe setup problem. The second spawned
process inherits the descriptor of the parent end of the pipe to the
first spawned process, created when setting up the process1 pipes. Since
PHP doesn't set the close-on-exec flag, the descriptor stays open in
process2. So, when the parent end of $pipes1[0] is closed in the master
script, the descriptor is still open in process2; thus the pipe's write
end is not yet closed, and the cat in process1 doesn't exit.

Note this is also a potential security problem, since the second process
is able to send data to the first.

Fix:

--- ext/standard/proc_open.c.orig	2006-05-28 19:10:35 +0200
+++ ext/standard/proc_open.c
@@ -929,6 +929,16 @@ PHP_FUNCTION(proc_open)
 					descriptors[i].mode_flags), mode_string, NULL);
 #else
 			stream = php_stream_fopen_from_fd(descriptors[i].parentend, mode_string, NULL);
+
+#if defined(F_SETFD) && defined(FD_CLOEXEC)
+			/*
+			 * Mark the descriptor close-on-exec, so that it
+			 * won't be inherited by potential other children.
+			 * This avoids first-child deadlock on proc_close().
+			 */
+			fcntl(descriptors[i].parentend, F_SETFD, FD_CLOEXEC);
+#endif
+
 #endif
 			if (stream) {
 				zval *retfp;

Previous Comments:
------------------------------------------------------------------------

The remainder of the comments for this report are too long. To view the
rest of the comments, please view the bug report online at
http://bugs.php.net/34794

-- 
Edit this bug report at http://bugs.php.net/?id=34794&edit=1
#34794 [Com]: proc_close() hangs when used with two processes
ID:               34794
Comment by:       Dallas at ekkySoftware dot com
Reported By:      e-t172 at e-t172 dot net
Status:           Assigned
Bug Type:         Program Execution
Operating System: Linux
PHP Version:      5CVS-2005-10-09 (snap)
Assigned To:      wez

New Comment:

I have a similar problem: it seems I can't concurrently call the same
page with proc_open() until the first proc_open() returns. It looks like
proc_open() runs through a critical section even though it is opening
separate processes. From experience, it is as if PHP uses the same pipes
for each proc_open() and cannot continue to the next proc_open() until
the original has ended. I would normally use temporary files instead of
pipes, but this makes life difficult.

Dallas
http://www.ekkySoftware.com/

-- 
Edit this bug report at http://bugs.php.net/?id=34794&edit=1
#34794 [Com]: proc_close() hangs when used with two processes
ID:               34794
Comment by:       radu dot rendec at ines dot ro
Reported By:      e-t172 at e-t172 dot net
Status:           Assigned
Bug Type:         Program Execution
Operating System: Linux
PHP Version:      5CVS-2005-10-09 (snap)
Assigned To:      wez

New Comment:

Same problem with 5.1.3RC4-dev (latest CVS snapshot at the moment) on
Linux/i386. I independently reproduced the bug with the following piece
of code:

error_reporting(E_ALL);
$spec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w")
);
$p1 = proc_open("/bin/cat", $spec, $fd1);
$p2 = proc_open("/bin/cat", $spec, $fd2);
fclose($fd1[0]); fclose($fd1[1]); fclose($fd1[2]);
echo "closing p1... "; flush();
proc_close($p1);
echo "success\n"; flush();

This code hangs in proc_close(). This doesn't happen if the second
proc_open() is commented out. Although the parent process seems to
correctly close all pipes, the child process still remains blocked in
read(0,...).

-- 
Edit this bug report at http://bugs.php.net/?id=34794&edit=1