I've come to realize something, but I'm not sure whether I'm right:

Maybe the instructions are interrupted because the script runs out of
virtual memory. I mean, isn't there a limit to the memory the script can
use? That would explain why the script keeps going: when the instruction
is interrupted, all the memory it had taken is released.
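
If it helps to test that hypothesis, here is a minimal sketch (not from
the earlier mails) that logs the configured limit and the peak usage
around the suspect call, using the same $previous_patch and $zipname as
in the scripts quoted below:

<?php
//Log the memory limit and the peak usage around the copy().
//Note: copy() streams the file, so it shouldn't need much memory, but
//this would at least confirm or refute the idea.
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
copy($previous_patch, $zipname);
echo 'peak usage: ' . memory_get_peak_usage(true) . " bytes\n";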

I don't know if I was clear about what I wanted to say...

2010/3/29 Bastien Helders <eldroskan...@gmail.com>

> >I'm not sure. What is the exact command you are using?
>
> I'll show the code for the two scenarios; maybe it'll help. I've edited
> out the sensitive information, but I kept the essence of how it works.
>
> 1) Copy the previous patch and make modifications to it
>
> <?php
> //This is the command that gets interrupted and thus creates the
> //unexpected end-of-archive. Note that $previous_patch is retrieved from
> //another file server.
> copy($previous_patch, $zipname);
>
> //Change into build/patchname, so that it doesn't appear in the paths
> //inside the zip archive.
> chdir('build/' . $patchname);
>
> //For each new folder, add it to the copied patch
> foreach ($folders_added as $folder) {
>         $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/' . $folder . '/* 2>&1';
>         exec($command, $output, $status);
>         //show output and status
> }
>
> //Change back, as the directory change is no longer needed when deleting
> //entries from a zip file
> chdir('../..');
>
> //For each folder to be removed, remove it
> foreach ($folders_removed as $folder) {
>         $command = 'zip -d ' . $zipname . ' software/hfFolders/' . $folder . '\* 2>&1';
>         exec($command, $output, $status);
>         //show output and status
> }
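>
> For completeness, a minimal sketch of what the "show output and status"
> comments stand for (illustrative only; run_zip() is a hypothetical
> helper, not my exact code):
>
> <?php
> //Hypothetical helper: surfaces zip's output and exit status, so a
> //non-zero status is not silently ignored.
> function run_zip($command) {
>         exec($command, $output, $status);
>         echo 'status: ' . $status . "\n" . implode("\n", $output) . "\n";
>         return $status === 0;
> }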
>
>
>
> 2) After all the needed files are gathered in a temporary folder,
> compress them all
>
> <?php
> //Change into build/patchname, so that it doesn't appear in the paths
> //inside the zip archive.
> chdir('build/' . $patchname);
> $command = 'zip -r ../../' . $zipname . ' * 2>&1';
> //This is the command that times out in this case
> exec($command, $output, $status);
> //show output and status
>
> //Do the rest of the operations
>
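> For what it's worth, a sketch of how one could rule out the PHP-side
> limits before that exec (assuming safe mode is off, so set_time_limit()
> is allowed; this does not change any web server timeout, of course):
>
> <?php
> set_time_limit(0);        //disable max_execution_time for this script
> ignore_user_abort(true);  //keep going even if the client disconnects
> chdir('build/' . $patchname);
> exec('zip -r ../../' . $zipname . ' * 2>&1', $output, $status);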
>
> >I wonder if the zipArchive route would be easier.
>
> That's what I was using before, but it modifies the timestamps of the
> files that are already in the zip archive, and I can't have that.
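>
> Roughly what I had before (a sketch; $file is a hypothetical placeholder
> for one of the gathered files):
>
> <?php
> $zip = new ZipArchive();
> if ($zip->open($zipname) === true) {
>         //Append one file; when the archive is written back out, the
>         //timestamps of the stored entries no longer match the original
>         //files, which is the problem I mentioned.
>         $zip->addFile('software/hfFolders/' . $file,
>                       'software/hfFolders/' . $file);
>         $zip->close();
> }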
>
>
> >According to the documentation, both Apache and IIS have similar
> >timeout values ...
> >
> >"Your web server can have other timeout configurations that may also
> >interrupt PHP execution. Apache has a Timeout directive and IIS has a
> >CGI timeout function. Both default to 300 seconds. See your web server
> >documentation for specific details."
> >(http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time)
>
> Yeah, I found this setting in the httpd-default.conf file of my Apache
> installation, but since I determined, with two consecutive calls of
> microtime(), that the interrupted instruction never runs longer than 200
> seconds, I don't see how it's relevant... (and again, after the
> instruction is interrupted, the script continues to run.)
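>
> The measurement I mean is simply this (a sketch):
>
> <?php
> $start = microtime(true);
> copy($previous_patch, $zipname);  //the instruction that gets interrupted
> echo 'elapsed: ' . (microtime(true) - $start) . " seconds\n";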
>
>
> >Can you run the command from the shell directly without any problems?
> >And run it repeatedly?
>
> I take it that the equivalent of the PHP copy() function is the Windows
> copy command. In this case, both copying the big archive and running
> zip -r on a big collection of folders work in the shell without any
> problem, and repeatedly.
>
>
> 2010/3/26 Richard Quadling <rquadl...@googlemail.com>
>
>> On 26 March 2010 15:20, Bastien Helders <eldroskan...@gmail.com> wrote:
>> > I have checked the rights on the file for the first scenario, and no
>> > user has locked it; I can see it, read it, and write to it. I could
>> > even delete it if I wanted.
>> >
>> > For the second scenario, it doesn't even apply, as the exec('zip') that
>> > times out tries to create a new file (naturally, in a folder where the
>> > web app user has all the necessary rights).
>> >
>> > In both cases it is no PHP timeout: after the copy() in the first
>> > scenario and the exec('zip') in the second scenario, the script
>> > continues to execute the other instructions, although the manipulation
>> > of the big files fails.
>> >
>> > But if it is not a PHP timeout, what is it?
>> >
>> > 2010/3/26 Richard Quadling <rquadl...@googlemail.com>
>> >>
>> >> On 26 March 2010 12:21, Bastien Helders <eldroskan...@gmail.com> wrote:
>> >> > I already used error_reporting and set_time_limit, and adding
>> >> > ini_set('display_errors', 1); didn't display any more exceptions.
>> >> >
>> >> > However, the modification to the exec helped display STDERR, I think.
>> >> >
>> >> > 1) In the first scenario we have the following:
>> >> >
>> >> > <STDERR>
>> >> > zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
>> >> >
>> >> > zip error: Internal logic error (write error on zip file)
>> >> > </STDERR>
>> >> >
>> >> > The funny thing is that now it is throwing status 5: "a severe error
>> >> > in the zipfile format was detected. Processing probably failed
>> >> > immediately." Why it throws status 5 instead of status 14, I can't
>> >> > say.
>> >> >
>> >> > So that's using 'zip -gr'. When I drop the g option and call
>> >> > exec('zip -r ...') instead, I only get:
>> >> >
>> >> > <STDERR>
>> >> > zip error: Internal logic error (write error on zip file)
>> >> > </STDERR>
>> >> >
>> >> > 2) The error messages of the second scenario don't surprise me
>> >> > much:
>> >> >
>> >> > <STDERR>
>> >> > zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
>> >> > </STDERR>
>> >> >
>> >> > Which was already known, as the call of copy() on the old patch P14
>> >> > crops it and thus prevents any operation from being done on it.
>> >>
>> >> So, the error is in the execution of the exec.
>> >>
>> >> Can you run the exec twice, but to two different zip files?
>> >>
>> >> If the issue is that PHP is timing out, then the first error COULD be
>> >> due to the process being killed and if so, the second one won't start.
>> >>
>> >> But if the second one starts, then that pretty much rules out PHP
>> >> timeouts.
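>> >>
>> >> Something like this (untested; test_a.zip and test_b.zip are just
>> >> placeholder names):
>> >>
>> >> <?php
>> >> exec('zip -r test_a.zip build 2>&1', $out1, $rc1);
>> >> exec('zip -r test_b.zip build 2>&1', $out2, $rc2);
>> >> //If the first exec is killed by a timeout, the second one won't
>> >> //start; if both run to completion, PHP timeouts are ruled out.
>> >> var_dump($rc1, $rc2);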
>> >>
>> >> I assume you've checked disk space and read access to the files in
>> >> question? i.e. they aren't locked by another user?
>> >>
>> >>
>> >> --
>> >> -----
>> >> Richard Quadling
>> >> "Standing on the shoulders of some very clever giants!"
>> >> EE : http://www.experts-exchange.com/M_248814.html
>> >> EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
>> >> Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
>> >> ZOPA : http://uk.zopa.com/member/RQuadling
>> >
>> >
>> >
>> > --
>> > haXe - an open source web programming language
>> > http://haxe.org
>> >
>>
>> I'm not sure. What is the exact command you are using?
>>
>> I wonder if the zipArchive route would be easier.
>>
>>
>> According to the documentation, both Apache and IIS have similar
>> timeout values ...
>>
>> "Your web server can have other timeout configurations that may also
>> interrupt PHP execution. Apache has a Timeout directive and IIS has a
>> CGI timeout function. Both default to 300 seconds. See your web server
>> documentation for specific details."
>> (http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time)
>>
>> Can you run the command from the shell directly without any problems?
>> And run it repeatedly?
>>
>>
>> --
>> -----
>> Richard Quadling
>> "Standing on the shoulders of some very clever giants!"
>> EE : http://www.experts-exchange.com/M_248814.html
>> EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
>> Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
>> ZOPA : http://uk.zopa.com/member/RQuadling
>>
>
>
>
> --
> haXe - an open source web programming language
> http://haxe.org
>



-- 
haXe - an open source web programming language
http://haxe.org
