Hi Ashley,

No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);), and
execution stops long before that limit is reached.
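
To rule out the host silently overriding it, I could also log the limits PHP
actually applies at the start of the script, something like this (with HOUR
and MINUTE as plain constants defined at the top):

    define('MINUTE', 60);
    define('HOUR', 60 * MINUTE);
    set_time_limit(2*HOUR + 8*MINUTE);
    // what PHP reports after the call, in case the host overrides it
    error_log('max_execution_time = ' . ini_get('max_execution_time'));
    error_log('memory_limit = ' . ini_get('memory_limit'));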

It might be relevant that the web application is hosted on a Windows
machine.

I asked myself whether setting the "memory_limit" parameter in php.ini to a
higher value would help; it is currently set to 128M. But then, I have no
problem creating a zip archive of about 250M (~80 folders); the problem only
appears with archives roughly three times that size.
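
If memory did turn out to be the issue, I suppose I could raise the limit for
this request only and log the peak usage, something along these lines:

    // ask for a higher limit for this request only (the host may refuse it)
    ini_set('memory_limit', '512M');

    // ... gather the folders and build the archive ...

    // how much memory the script really needed, in MB
    error_log('peak memory: ' . round(memory_get_peak_usage(true) / 1048576) . 'M');

But since the zip itself is built by an external 'zip' process rather than in
PHP, I doubt the PHP memory_limit is what truncates the archive.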

Best Regards,
Bastien

2010/3/24 Ashley Sheridan <a...@ashleysheridan.co.uk>

>  On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
>
> Hi list,
>
> I've got this web app which, from a list of selected folders (with their
> content), creates a zip containing them, as well as a text file with
> information about the chosen folders and how to use them.
>
> To create the zip file I use exec('zip -gr ' .$zipname.' * >> mylog.log');
> in the temporary folder where I gathered all the data (using a ZipArchive
> object was more time-consuming). I then create the text file using fopen,
> many fwrites and a fclose.
>
> My problem is the following: normally it creates the archive and text file
> without any problem, but as soon as the number of selected folders gets
> high (let's say about 150 of them), I've got problems with the generated
> files: the zip archive doesn't contain all the folders, and there is an
> unexpected end of file in both the zip and the text file.
>
> My guess is that, as it takes too much time, the script goes on to the next
> operation and closes the streams uncleanly. But I can't be sure about that,
> and I don't know where to investigate.
>
> Regards,
> Bastien
>
>
> Is the script maybe running past the max_execution_time before the zip
> files are completed?
>
>
>   Thanks,
> Ash
> http://www.ashleysheridan.co.uk
>
>
>


