Re: [PHP] POST headers empty when using SSLProxyEngine
Hi Mirco,

> I had a similar problem myself and didn't find a solution. As far as
> I found out, the ssl_proxy module simply does not route POST. It just
> routes the link, which means GET should work.

Thanks for your reply! That sounds logical, indeed. I have posted my help
request to the Apache users mailing list as well; maybe someone there
comes up with a solution or even a patch.

> My solution would be to bypass the proxy for file transfers.

How could I achieve that without losing my session variables?

> If you find another solution, I would be glad to know how.

Of course I will! :)

Thanks
Florian
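[A minimal sketch of what such a bypass could look like. The path /upload/
and the backend hostname are placeholders, and this assumes mod_proxy on the
front-end; note that the "!" exclusion must come before the general ProxyPass
line, because the first match wins:]

  # httpd.conf on the front-end proxy -- sketch only
  SSLProxyEngine on
  ProxyPass        /upload/ !                       # exclude this path from proxying
  ProxyPass        /        https://backend.example/
  ProxyPassReverse /        https://backend.example/
  # handle uploads locally instead (hypothetical directory)
  Alias /upload/ /var/www/upload/

[Because the hostname the browser talks to stays the same, the PHP session
cookie is still sent with requests to /upload/, so the session variables
should survive the bypass.]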
[PHP] PHP as CGI: Denial of Service?
Hello there,

PHP set up as CGI (either with binfmt and suEXEC or via suPHP) can expose
your system to a denial-of-service attack. Even a very simple page like

  <?php phpinfo(); ?>

can bog down a server completely if the reload button in the browser is
pressed continuously for a few seconds.

I already tried the RLimit directives in httpd.conf and the memory limit in
php.ini, but they do not seem to work; they are simply ignored. I think so
many processes are spawned that the system gets out of control. I can get my
load as high as 91, and my disk swaps for nearly 30 minutes until the machine
responds again. Sometimes the kernel even crashed with out-of-memory errors.

Apart from trying out cgiwrap, I am completely helpless right now. Does
anyone have an idea what to do? It can't be possible that every PHP suEXEC
install is a big security risk. Any tips are welcome! I experienced this
problem with both Apache 1.3 and 2.0.

Thanks in advance,
Florian
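[For reference, the resource-limit directives mentioned above look roughly
like this; the values are illustrative placeholders, not recommendations.
RLimit* only constrains processes forked by Apache itself, which may be why
it appears to be ignored in some suEXEC setups:]

  # httpd.conf -- sketch of Apache's per-process resource limits
  RLimitCPU   30 60                  # soft/hard CPU seconds per forked process
  RLimitMEM   33554432 50331648      # soft/hard memory in bytes (32 MB / 48 MB)
  RLimitNPROC 10 20                  # soft/hard max processes per user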
[PHP] PHP as CGI becomes a zombie when loaded too often
Sorry for posting this again and again, but I still experience this problem
and there seems to be no way for me to solve it. I've had confirmation from
others that this problem is not only mine, so please take the time to read
this.

I've tried Apache 1.3 and 2.0, both on Linux 2.4. I've tried with and
without suEXEC, and I even tried modules that stop script execution at a
specific load average (tested with 1.00!) or number of processes (tested
with 10!). But nothing helps in the following case:

I run PHP as CGI because I don't want world-readable scripts and
mod_perchild is not ready yet. When I do a hard reload - i.e. reloading the
same script continuously for about 10 seconds, which spawns quite a lot of
CGI processes - I can crash the server. The php-cgi processes become
zombies, I get a load average of about 90 (!), and it can take up to 30
minutes until the system responds again. This happens even with the simplest
PHP scripts, like a phpinfo() call, while Perl scripts cause no problem at
all.

The PHP developers say it's an Apache problem, the Apache developers say
it's a PHP problem. So *PLEASE* take the time to review this one again - I'm
helpless right now! :-(

I know there must be a solution, because some providers run PHP as CGI
without problems, but I don't know what it could be. :-(

Also see http://bugs.php.net/bug.php?id=28556&edit=1
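[One guess at what those providers do: run PHP via FastCGI instead of plain
CGI, so a fixed-size pool of PHP interpreters is reused rather than one
process being forked per request. A minimal sketch, assuming mod_fastcgi is
installed and with placeholder paths:]

  # httpd.conf -- sketch, assuming mod_fastcgi (paths are hypothetical)
  LoadModule fastcgi_module modules/mod_fastcgi.so

  # Keep at most 4 persistent PHP processes instead of forking per hit
  FastCgiServer /usr/local/bin/php -processes 4

  AddHandler  php-fastcgi .php
  Action      php-fastcgi /fcgi-bin/php
  ScriptAlias /fcgi-bin/ /usr/local/bin/

[With the pool capped, a burst of reloads queues requests instead of
spawning dozens of fresh php-cgi processes that drive the load to 90.]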
[PHP] limit number of CGI processes
Hi there,

I run PHP as CGI (because of suEXEC), but something in my configuration must
be wrong. I just tried reloading a PHP-generated website about 20 or 30
times in my browser, and this really bogged down the server: I got a load
average of about 20 or 30. Is there anything I can do to limit this risk? I
already fiddled around with some configuration variables, but it didn't
help; it always created a whole lot of CGI children that used up all the
memory...

I run Apache 2.0 on Linux 2.4.

Thanks!
Florian
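[Since suEXEC runs each CGI as the site's own user, one stopgap that has
been suggested is a small wrapper that caps the user's resources with
ulimit before exec'ing PHP. A sketch, assuming bash, a hypothetical wrapper
location in the user's cgi-bin, and placeholder limits; the wrapper must
still satisfy suEXEC's ownership and permission checks:]

  #!/bin/bash
  # php-wrapper -- hypothetical CGI wrapper, installed in the user's cgi-bin
  ulimit -u 20       # max simultaneous processes for this user
  ulimit -v 65536    # max virtual memory per process, in KB
  ulimit -t 30       # max CPU seconds per process
  exec /usr/local/bin/php "$@"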