Couldn't a specific stream registry, and possibly some slight
refactoring, make it easier to implement a way to make multiple
requests in parallel [1] [2]? I have worked on changing Zend_Http_Client
to support something like this on several occasions, but each attempt
had its own set of issues and was ugly.
[1] http://us2.php.net/manual/en/function.curl-multi-exec.php
[2] http://netevil.org/blog/2005/may/guru-multiplexing
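For reference [1], a rough sketch of what parallel requests look like
with the curl_multi API is below. This is only an illustration of the
general pattern, outside Zend_Http_Client; the URLs are placeholders
and error handling is omitted.

// Placeholder URLs; real code would take these from the caller.
$urls = array('http://example.com/a', 'http://example.com/b');

$mh      = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every handle has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of spinning
} while ($running > 0);

// Collect the responses and clean up.
$responses = array();
foreach ($handles as $ch) {
    $responses[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);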
On Apr 17, 2008, at 11:52 PM, Quintin Russ wrote:
Hi Guys,
This is just a quick email for those using the Rest Client or their own
HTTP client who are seeing memory leaks in long-running processes.
On line 115 of Zend/Http/Client/Adapter/Socket.php:

// Now, if we are not connected, connect
if (! is_resource($this->socket) || ! $this->config['keepalive']) {
    $context = stream_context_create();
    if ($secure) {
A stream_context_create() call is made for each request. It returns a
resource which is not released until the process finishes, as detailed
in this bug report: http://bugs.php.net/bug.php?id=40257

This means that in a long-running process, every new request returns a
resource that is never released, even when the socket is closed. Over
time your memory leaks and "Bad Things" start happening.
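As a rough illustration of the pattern (assuming the behaviour
described in the PHP bug above), a loop like this keeps allocating
context resources that are never freed:

// Each iteration allocates a new stream context; PHP does not free them,
// so the resource count and memory usage grow for the life of the process.
for ($i = 0; $i < 100000; $i++) {
    $context = stream_context_create();
    // ... the adapter would pass $context to stream_socket_client() here,
    //     then close the socket, but the context itself stays around ...
}
echo memory_get_usage(true), "\n";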
We came up with a PoC fix by storing this context resource in the
Zend_Registry as follows:
if (Zend_Registry::isRegistered('zf_rest_context')) {
    $context = Zend_Registry::get('zf_rest_context');
} else {
    $context = stream_context_create();
    Zend_Registry::set('zf_rest_context', $context);
}
Not the cleanest or best solution, but we were simply trying to plug
the leak. :-)

Use of a common object (a singleton, perhaps?) could fix this bug.
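A rough sketch of that singleton idea, keeping one shared context in a
static property of the adapter rather than in the global registry,
might look like the following. The class and method names here are
hypothetical, not existing Zend Framework code; the adapter's connect
logic would then call _getSharedContext() instead of calling
stream_context_create() on every request.

// Hypothetical subclass; not part of Zend Framework.
class My_Http_Client_Adapter_Socket extends Zend_Http_Client_Adapter_Socket
{
    protected static $_sharedContext = null;

    // Lazily create a single context and reuse it for every request.
    protected static function _getSharedContext()
    {
        if (self::$_sharedContext === null) {
            self::$_sharedContext = stream_context_create();
        }
        return self::$_sharedContext;
    }
}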
Thanks for your time, any comments appreciated.
Best Regards,
Quintin