[PHP] Touch an entry of a zip archive.

2010-05-18 Thread Bastien Helders
Hello list,

I wanted to know: is it possible to change the modified time of a specific
entry in a ZipArchive? Or is it possible to set the modified time of a file
when packing it, so that each file preserves its modified time?
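For anyone reading this later: newer PHP versions expose per-entry timestamps directly on ZipArchive. A minimal sketch, assuming PHP >= 8.0 with ext-zip built against libzip >= 1.0 (this API did not exist yet when this thread was written):

```php
<?php
// Create an archive and "touch" the modification time of a single entry.
$zip = new ZipArchive();
$zip->open('demo.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('readme.txt', 'hello');

// Store an explicit Unix timestamp for entry 0.
$zip->setMtimeIndex(0, mktime(12, 0, 0, 5, 18, 2010));
$zip->close();

// Verify: statIndex() reports the stored mtime. Note zip stores times
// with 2-second DOS resolution, so expect up to a 2-second difference.
$zip->open('demo.zip');
var_dump($zip->statIndex(0)['mtime']);
$zip->close();
```

On older PHP, the usual workaround was to shell out to the `zip`/`touch` binaries, since ZipArchive itself rewrote entry timestamps on modification.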

Best Regards,
Bastien


[PHP] Beginner's question: How to run a PHP web application locally?

2010-04-08 Thread Bastien Helders
Hi List,

The other day, I read an article that mentioned a tool that would let you
simulate a web environment for PHP, so that testing could be done before
uploading pages to the server. Unfortunately, I can't seem to find the
article again.

So here I am with this question: what do I need to set up my test
environment? I'm working on Windows, but I'm all ears for solutions on Linux
systems as well.
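For anyone reading now: on Windows, an all-in-one stack such as XAMPP or WampServer (Apache + PHP + MySQL) is the usual answer. For quick local checks, later PHP versions also ship a built-in development server; a minimal sketch, assuming PHP >= 5.4 on the PATH (this option did not exist yet when this thread was written, and `public/` is a hypothetical document root):

```shell
# Serve the project at http://localhost:8080 for local testing only;
# the built-in server is single-threaded and not meant for production.
php -S localhost:8080 -t public/
```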

Best Regards,
Bastien


Re: [PHP]Zip and text files generated are corrupted

2010-03-30 Thread Bastien Helders
I've come to realize something, but I'm not sure if I'm right:

Maybe the instructions are interrupted because of a lack of virtual memory.
I mean, isn't there a limit to the memory the script can use? It would
explain why the script keeps going: when the instruction is interrupted, all
the memory it held is released.

I don't know if I'm being clear about what I'm trying to say...

2010/3/29 Bastien Helders eldroskan...@gmail.com

 I'm not sure. What is the exact command you are using?

 I'll show the code for the two scenarios; maybe it'll help. I've edited out
 the sensitive information, but I kept the essence of how it works.

 1) Copy the previous file and modify it

 <?php
 // This is the command that got interrupted, creating the unexpected
 // end-of-archive. Note that $previous_patch is retrieved from another
 // file server.
 copy($previous_patch, $zipname);

 // I go up in the file system so that build/$patchname doesn't appear
 // in the paths inside the zip archive.
 chdir('build/' . $patchname);

 // For each new folder, add it to the copied patch.
 foreach ($folders_added as $folder) {
     $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/' . $folder . '/* 2>&1';
     exec($command, $output, $status);
     // show output and status
 }

 // I go back down; it is no longer needed when deleting entries in a zip file.
 chdir('../..');

 // For each folder to be removed, remove it.
 foreach ($folders_removed as $folder) {
     $command = 'zip -d ' . $zipname . ' software/hfFolders/' . $folder . '\* 2>&1';
     exec($command, $output, $status);
     // show output and status
 }



 2) After all the needed files are gathered in a temporary folder, compress
 them all

 <?php
 // I go up in the file system so that build/$patchname doesn't appear
 // in the paths inside the zip archive.
 chdir('build/' . $patchname);
 $command = 'zip -r ../../' . $zipname . ' * 2>&1';
 // This is the command that times out in this case.
 exec($command, $output, $status);
 // show output and status

 // Do the rest of the operations


 I wonder if the zipArchive route would be easier.

 That's what I was using before, but it modifies the timestamps of the files
 that are already in the zip archive, and I can't have that.


 According to the documentation, both Apache and IIS have similar
 timeout values ...
 
 Your web server can have other timeout configurations that may also
 interrupt PHP execution. Apache has a Timeout directive and IIS has a
 CGI timeout function. Both default to 300 seconds. See your web server
 documentation for specific details.
 (
 http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time
 )

 Yeah, I found this config in the httpd-default.conf file of my Apache
 installation, but since I determined, using two consecutive calls to
 microtime(), that the interrupted instructions don't run longer than 200
 seconds, I don't see how it's relevant... (and again, after the instruction
 is interrupted, the script continues to run.)


 Can you run the command from the shell directly without any problems.
 And run it repeatedly.

 I take it that the equivalent of the PHP copy() function is the Windows copy
 command line.
 In this case, both copy on the big archive and zip -r on a big set of
 folders run in the shell without any problem, repeatedly.


 2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 15:20, Bastien Helders eldroskan...@gmail.com wrote:
   I have checked the rights on the file for the first scenario and no
 user as
  locked it, I can see it, read it and write into it. I could even delete
 it
  if I wanted.
 
  For the second scenario, it doesn't even apply, as the exec('zip') that
  timeout try to create a new file (naturally in a folder where the web
 app
  user has all the necessary rights)
 
  In both case, it is no PHP timeout, as after the copy() in the first
  scenario, and the exec('zip') in the second scenario, the script
 continue to
  execute the other instructions, although the manipulation of the big
 files
  fails.
 
  But if it is not a PHP timeout, what is it?
 
  2010/3/26 Richard Quadling rquadl...@googlemail.com
 
  On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com
 wrote:
   I already used error_reporting and set_time_limit and the use of
   ini_set('display_errors', 1); didn't display more exceptions.
  
   However the modification in the exec helped display STDERR I think.
  
   1) In the first scenario we have the following:
  
   STDERR
   zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
  
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   The funny thing is that now it is throwing status 5: a severe error in the
   zipfile format was detected. Processing probably failed immediately. Why it
   throws a status 5 instead of a status 14, I can't say.
  
   So that's using 'zip -gr', when I stop using the option g and then
 call
   exec('zip -r

Re: [PHP]Zip and text files generated are corrupted

2010-03-29 Thread Bastien Helders
I'm not sure. What is the exact command you are using?

I'll show the code for the two scenarios; maybe it'll help. I've edited out
the sensitive information, but I kept the essence of how it works.

1) Copy the previous file and modify it

<?php
// This is the command that got interrupted, creating the unexpected
// end-of-archive. Note that $previous_patch is retrieved from another
// file server.
copy($previous_patch, $zipname);

// I go up in the file system so that build/$patchname doesn't appear
// in the paths inside the zip archive.
chdir('build/' . $patchname);

// For each new folder, add it to the copied patch.
foreach ($folders_added as $folder) {
    $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/' . $folder . '/* 2>&1';
    exec($command, $output, $status);
    // show output and status
}

// I go back down; it is no longer needed when deleting entries in a zip file.
chdir('../..');

// For each folder to be removed, remove it.
foreach ($folders_removed as $folder) {
    $command = 'zip -d ' . $zipname . ' software/hfFolders/' . $folder . '\* 2>&1';
    exec($command, $output, $status);
    // show output and status
}



2) After all the needed files are gathered in a temporary folder, compress
them all

<?php
// I go up in the file system so that build/$patchname doesn't appear
// in the paths inside the zip archive.
chdir('build/' . $patchname);
$command = 'zip -r ../../' . $zipname . ' * 2>&1';
// This is the command that times out in this case.
exec($command, $output, $status);
// show output and status

// Do the rest of the operations

I wonder if the zipArchive route would be easier.

That's what I was using before, but it modifies the timestamps of the files
that are already in the zip archive, and I can't have that.

According to the documentation, both Apache and IIS have similar
timeout values ...

Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
(
http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time)

Yeah, I found this config in the httpd-default.conf file of my Apache
installation, but since I determined, using two consecutive calls to
microtime(), that the interrupted instructions don't run longer than 200
seconds, I don't see how it's relevant... (and again, after the instruction
is interrupted, the script continues to run.)

Can you run the command from the shell directly without any problems.
And run it repeatedly.

I take it that the equivalent of the PHP copy() function is the Windows copy
command line.
In this case, both copy on the big archive and zip -r on a big set of
folders run in the shell without any problem, repeatedly.

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 15:20, Bastien Helders eldroskan...@gmail.com wrote:
   I have checked the rights on the file for the first scenario and no user
 as
  locked it, I can see it, read it and write into it. I could even delete
 it
  if I wanted.
 
  For the second scenario, it doesn't even apply, as the exec('zip') that
  timeout try to create a new file (naturally in a folder where the web app
  user has all the necessary rights)
 
  In both case, it is no PHP timeout, as after the copy() in the first
  scenario, and the exec('zip') in the second scenario, the script continue
 to
  execute the other instructions, although the manipulation of the big
 files
  fails.
 
  But if it is not a PHP timeout, what is it?
 
  2010/3/26 Richard Quadling rquadl...@googlemail.com
 
  On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
   I already used error_reporting and set_time_limit and the use of
   ini_set('display_errors', 1); didn't display more exceptions.
  
   However the modification in the exec helped display STDERR I think.
  
   1) In the first scenario we have the following:
  
   STDERR
   zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
  
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   The funny thing is that now it is throwing status 5: a severe error in the
   zipfile format was detected. Processing probably failed immediately. Why it
   throws a status 5 instead of a status 14, I can't say.
  
   So that's using 'zip -gr', when I stop using the option g and then
 call
   exec('zip -r ...'), then I only get:
  
   STDERR
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   2) The error messages of the second scenario doesn't surprise me much:
  
   STDERR
   zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
   /STDERR
  
   Which was already known, as the call of copy() on the old patch P14
 crop
   it
   and thus prevent any operation to be done on it.
 
  So, the error is in the execution of the exec.
 
  Can you run the exec twice but to 2 different

Re: [PHP] Re: optimizing PHP for microseconds

2010-03-29 Thread Bastien Helders
I have a question as a relatively novice PHP developer.

Let's say you have this intranet web application that deals with the
generation of file bundles that can become quite large (say, around 800 MB)
after some kind of selection process. It should be available to many users on
the intranet, but shouldn't require any installation. Would this be a case
where optimizing for microseconds would be recommended? Or would PHP not be
the language of choice?

I'm not asking anyone to prove that there are corner cases where it could be
useful; I am genuinely interested, as I am developing such a project, and
increasing the performance of this web application is one of my goals.

2010/3/28 Nathan Rixham nrix...@gmail.com

 mngghh, okay, consider me baited.

 Daevid Vincent wrote:
  Per Jessen wrote:
  Tommy Pham wrote:
 
  (I remember a list member, not mentioning his name, does optimization
  of PHP coding for just microseconds.  Do you think how much more he'd
  benefit from this?)
  Anyone who optimizes PHP for microseconds has lost touch with reality -
  or at least forgotten that he or she is using an interpreted language.
  But sometimes it's just plain fun to do it here on the list with
  everyone further optimizing the last optimized snippet :)
 
  Cheers,
  Rob.
 
  Was that someone me? I do that. And if you don't, then you're the kind of
  person I would not hire (not saying that to sound mean). I use single
  quotes instead of double where applicable. I use -- instead of ++. I use
  $boolean = !$boolean to alternate (instead of mod() or other incrementing
  solutions). I use LIMIT 1 on select, update, delete where appropriate.
 I
  use the session to cache the user and even query results. I don't use
  bloated frameworks (like Symfony or Zend or Cake or whatever else tries
 to
  be one-size-fits-all). The list goes on.

 That's not optimization, at best it's just an awareness of PHP syntax
 and a vague awareness of how the syntax will ultimately be interpreted.

 Using LIMIT 1 is not optimizing it's just saying you only want one
 result returned, the SQL query could still take five hours to run if no
 indexes, a poorly normalised database, wrong datatypes, and joins all
 over the place.

 Using the session to cache the user is the only thing that comes
 anywhere near to application optimisation in all you've said; and
 frankly I would take to be pretty obvious and basic stuff (yet pointless
 in most scenario's where you have to cater for possible bans and
 de-authorisations) - storing query results in a session cache is only
 ever useful in one distinct scenario, when the results of that query are
 only valid for the owner of the session, and only for the duration of
 that session, nothing more, nothing less. This is a one in a million
 scenario.

 Bloated frameworks, most of the time they are not bloated, especially
 when you use them properly and only include what you need on a need to
 use basis; then the big framework can only be considered a class or two.
 Sure the codebase seems more bloated, but at runtime it's easily
 negated. You can use these frameworks for any size project, enterprise
 included, provided you appreciate the strengths and weaknesses of the
 full tech stack at your disposal. Further, especially on enterprise
 projects it makes sense to drop development time by using a common
 framework, and far more importantly, to have a code base developers know
 well and can hit the ground running with.

 Generally unless you have unlimited learning time and practically zero
 budget constraints frameworks like the ones you mentioned should always
 be used for large team enterprise applications, although perhaps
 something more modular like Zend is suited. They also cover your own
 back when you are the lead developer, because on the day when a more
 experienced developer than yourself joins the project and points out all
 your mistakes, you're going to feel pretty shite and odds are very high
 that the project will go sour, get fully re-written or you'll have to
 leave due to stress (of being wrong).

  I would counter and say that if you are NOT optimizing every little drop
 of
  performance from your scripts, then you're either not running a site
  sufficiently large enough to matter, or you're doing your customers a
  disservice.

 Or you have no grasp of the tech stack available and certainly aren't
 utilizing it properly; I'm not suggesting that knowing how to use your
 language of choice well is a bad thing, it's great; knock yourself out.
 However, suggesting that optimising a php script for microseconds will
 boost performance in large sites (nay, any site) shows such a loss of
 focus that it's hard to comprehend.

 By also considering other posts from yourself (in reply to this and
 other threads) I can firmly say the above is true of you.

 Optimisation comes down to running the least amount of code possible,
 and only when really needed. If you are running a script 

Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
I've already specified the outputs, and it doesn't change if I put it in a
file.

1) In the first scenario, where all the data is compressed together, the only
call to exec('zip') gives this output:

<OUTPUT>
adding: bin/ (stored 0%)
adding: bin/startHotFixInstaller.bat (deflated 41%)
adding: bin/startHotFixInstaller.sh (deflated 49%)
adding: software/ (stored 0%)
adding: software/hotfixes/ (stored 0%)
adding: software/hotfixes/hfFolder/ (stored 0%)
[snip]
adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/lib/julia.jar
(deflated 4%)
adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/software/
</OUTPUT>

I snipped the output because it is a lot of the same, but you'll notice that
in the last line the status of the file (in parentheses) is missing, which
leads me to think it was interrupted.

I've done a bit of research in between. Of note: the status with which it
exited. Status 14 for the zip command means "error writing to a file", but
it isn't always at the same files. Also, I upped the value of
max_input_time in php.ini from 60 to 600. Before the change, the exec
instructions took about 60 seconds before being interrupted; after, they take
about 180-200 seconds, and not 600 as expected.

2) In the second scenario, as said, I copy the previous patch (P14, which is
itself a behemoth of a zip archive that was manually assembled) and then add
and delete only a few folders, each with a call to exec('zip ...'). Each time
it ends with status 2, which means "unexpected end of zip file".

And there is no output from any of those commands.

As with the single exec('zip ...') in 1), the copy() of the previous patch
took about 60 seconds before the php.ini change and about 180-200 seconds
after. I take it that the copy() is interrupted, which explains the
unexpected end of zip file (I can open the original patch P14 without any
problem).
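The 60-second and 180-200 second measurements above can be reproduced with a small timing wrapper around the suspect call (a sketch; $previous_patch and $zipname stand in for the script's own variables):

```php
<?php
// Time a single suspect call with microtime(true), as described above.
$start = microtime(true);
copy($previous_patch, $zipname);   // the call suspected of being cut off
$elapsed = microtime(true) - $start;
printf("copy() ran for %.1f seconds before returning\n", $elapsed);
```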

I hope I made myself more clear on the details of my problem.

Best Regards,
Bastien


2010/3/25 Richard Quadling rquadl...@googlemail.com

 On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
  I'm really stumped, it seems that although the script is running under
 the
  time limit, if a single instruction such as exec(zip) in the first
 case,
  or copy() in the second case are timing out, because it takes too much
 time
  processing the big file.
 
  Is there any configuration in php.ini (or anywhere else) that I could
 change
  to permit copy() or exec(zip) to run through without being interrupted?
 
  Regards,
  Bastien
 

 What is the output of the exec when the command fails?

 Not the return value of exec() which is the last line, but the whole
 thing, which is returned in the second parameter.

 If you can't see it due to pushing the file as part of the script,
 then try something like ...


 exec('zip ', $Output);
 file_put_contents('./ZipResults.txt', $Output);



 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
I already used error_reporting and set_time_limit, and the use of
ini_set('display_errors', 1); didn't display more exceptions.

However, the modification to the exec helped display STDERR, I think.

1) In the first scenario we have the following:

<STDERR>
zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty

zip error: Internal logic error (write error on zip file)
</STDERR>

The funny thing is that now it is throwing status 5: "a severe error in the
zipfile format was detected. Processing probably failed immediately." Why it
throws a status 5 instead of a status 14, I can't say.

So that's using 'zip -gr'; when I drop the -g option and call
exec('zip -r ...'), then I only get:

<STDERR>
zip error: Internal logic error (write error on zip file)
</STDERR>

2) The error messages of the second scenario don't surprise me much:

<STDERR>
zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
</STDERR>

Which was already known, as the call to copy() on the old patch P14 truncates
it and thus prevents any operation from being done on it.

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 08:51, Bastien Helders eldroskan...@gmail.com wrote:
  I've already specified the outputs, and it doesn't change if I put it in
 a
  file.
 
  1)In the first scenario, where all the data are compressed together, the
  only call of exec('zip') give this output:
 
  OUTPUT
  adding: bin/ (stored 0%)
  adding: bin/startHotFixInstaller.bat (deflated 41%)
  adding: bin/startHotFixInstaller.sh (deflated 49%)
  adding: software/ (stored 0%)
  adding: software/hotfixes/ (stored 0%)
  adding: software/hotfixes/hfFolder/ (stored 0%)
  [snip]
  adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/lib/julia.jar
  (deflated 4%)
  adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/software/
  /OUTPUT
 
  I snipped the output because it is a lot of the same, but, you'll notice
  that in the last line, the status of the file between parenthesis is
  missing, which leads me to think it has been interrupted.
 
  I've made a few research in between.Of note, the status with which he
  exited. Status 14 for the zip command means error writing to a file.
 But
  it isn't always at the same files. Also, I upped the value of
  max_input_time in php.ini from 60 to 600. Before the change the exec
  instructions took about 60 seconds before interrupting, after it takes
 about
  180-200 seconds and not 600 as expected.
 
  2)In the second scenario, as said, I copy the previous patch (P14, which
  itself is a behemoth of a zip archive that was manually assembled) and
 then
  add and delete only a few folders, each calling the function
 exec('zip...').
  Each time it ends with status 2, which means unexpected end of zip
 files.
 
  And there is no output to each of those commands.
 
  As for the single exec('zip..') in 1), the copy() of the previous patch
 took
  about 60 seconds before the php.ini change and about 180-200 seconds
 after.
  I take it that the copy() is interrupted thus explaining the unexpected
 end
  of zip files (I can open the original patch P14 without any problem).
 
  I hope I made myself more clear on the details of my problem.
 
  Best Regards,
  Bastien
 
 
  2010/3/25 Richard Quadling rquadl...@googlemail.com
 
  On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
   I'm really stumped, it seems that although the script is running under
   the
   time limit, if a single instruction such as exec(zip) in the first
   case,
   or copy() in the second case are timing out, because it takes too much
   time
   processing the big file.
  
   Is there any configuration in php.ini (or anywhere else) that I could
   change
   to permit copy() or exec(zip) to run through without being
   interrupted?
  
   Regards,
   Bastien
  
 
  What is the output of the exec when the command fails?
 
  Not the return value of exec() which is the last line, but the whole
  thing, which is returned in the second parameter.
 
  If you can't see it due to pushing the file as part of the script,
  then try something like ...
 
 
  exec('zip ', $Output);
  file_put_contents('./ZipResults.txt', $Output);
 
 
 
  --
  -
  Richard Quadling
  Standing on the shoulders of some very clever giants!
  EE : http://www.experts-exchange.com/M_248814.html
  EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
  Zend Certified Engineer :
 http://zend.com/zce.php?c=ZEND002498&r=213474731
  ZOPA : http://uk.zopa.com/member/RQuadling
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 

 I _think_ that the $Output will only hold STDOUT and not STDERR.

 Can you try this ...

 exec('zip ... 2>&1', $Output);


 Also,

 error_reporting(-1); // Show ALL errors/warnings/notices.
 ini_set('display_errors', 1); // Display them.
 set_time_limit(0); // Allow run forever
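Putting Richard's pieces together, a minimal capture harness might look like this (a sketch; it assumes the `zip` binary is on the PATH, and the archive/path names are illustrative):

```php
<?php
error_reporting(-1);              // show ALL errors/warnings/notices
ini_set('display_errors', '1');   // display them
set_time_limit(0);                // no PHP time limit

// 2>&1 folds the command's STDERR into the stream exec() captures,
// so zip's warnings and errors land in $output alongside STDOUT.
exec('zip -r archive.zip somedir/ 2>&1', $output, $status);
printf("status=%d\n%s\n", $status, implode(PHP_EOL, $output));
```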


 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts

Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
I have checked the rights on the file for the first scenario, and no user has
locked it; I can see it, read it, and write to it. I could even delete it if
I wanted.

For the second scenario, it doesn't even apply, as the exec('zip') that times
out tries to create a new file (naturally in a folder where the web app user
has all the necessary rights).

In both cases, it is no PHP timeout: after the copy() in the first scenario,
and the exec('zip') in the second, the script continues to execute the other
instructions, although the manipulation of the big files fails.

But if it is not a PHP timeout, what is it?

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
  I already used error_reporting and set_time_limit and the use of
  ini_set('display_errors', 1); didn't display more exceptions.
 
  However the modification in the exec helped display STDERR I think.
 
  1) In the first scenario we have the following:
 
  STDERR
  zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
 
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  The funny thing is that now it is throwing status 5: a severe error in the
  zipfile format was detected. Processing probably failed immediately. Why it
  throws a status 5 instead of a status 14, I can't say.
 
  So that's using 'zip -gr', when I stop using the option g and then call
  exec('zip -r ...'), then I only get:
 
  STDERR
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  2) The error messages of the second scenario doesn't surprise me much:
 
  STDERR
  zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
  /STDERR
 
  Which was already known, as the call of copy() on the old patch P14 crop
 it
  and thus prevent any operation to be done on it.

 So, the error is in the execution of the exec.

 Can you run the exec twice but to 2 different zip files.

 If the issue is that PHP is timing out, then the first error COULD be
 due to the process being killed and if so, the second one won't start.

 But if the second one starts, then that pretty much rules out PHP timeouts.

 I assume you've checked disk space and read access to the files in
 question? i.e. they aren't locked by another user?


 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
So I tested two scenarios:

- First, I gather all the files selected for the patch and then compress them
together, and here is what is displayed:

[Begin display]
The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a status of
14 and the following output:
adding: bin/ (stored 0%)
adding: bin/startHotFixInstaller.bat (deflated 41%)
adding: bin/startHotFixInstaller.sh (deflated 49%)
adding: software/ (stored 0%)
adding: software/hotfixes/ (stored 0%)
[snip]

Warning:
rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip)
[function.rename]: No such file or directory
[End display]

I know why the rename didn't work: the zip command aborted and generated no
zip file.
There is no problem with the README text file.

- Second scenario: I take the previous patch, compare the list of folders in
the previous patch with the list of selected folders, add the folders that
are not in the previous patch, and finally remove the folders that weren't
selected but were in the previous patch.

In this case, all the commands, whether of the form
zip -gr ../../build/Patch-6-3-2_Q3P15.zip
software/hotfixes/hfFolder/HF-632Q3-152/* to add a folder, or zip -d
build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-127\* to
delete an unwanted folder, return with status 2 and no output.

2010/3/24 Richard Quadling rquadl...@googlemail.com

 On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com wrote:
  Hi Ashley,
 
  No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);),
 and
  the execution stops a long time before the time limit is reached.
 
  It might be relevant that the web application is hosted on a Windows
  machine.
 
  I asked myself, would setting the parameter memory_limit of the php.ini
  file to a higher value help? Actually it is set to 128M. But I actually
  don't have problems with creating a zip archive of about 250M (~80
 folders),
  it actually occurs with 3 times bigger archives.
 
  Best Regards,
  Bastien
 
  2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk
 
   On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
 
  Hi list,
 
  I've got this web app, which from a list of selected folders (with
 content)
  want to create a zip containing them as well as creating a text file
 with
  information about the chosen folders and how to use them.
 
  To create the zip file I use exec('zip -gr ' . $zipname . ' * > mylog.log');
  in the temporary folder where I gathered all the data (using a
 zipArchive
  object was more time consuming). I then create the text file using
 fopen,
  many fwrites and a fclose.
 
  My problem is the following, normally it creates the archive and text
 file
  without any problem, but as soon as the number of selected folder has an
  high value (let's say about 150 of them), I've got problems with the
  generated files: The zip archive doesn't contain all the folders and
 there
  is an unexpected end of file on both zip and text files.
 
  My guess is, as it takes too much time, the script goes on to the next
  operation and close the streams uncleanly. But I can't be sure about
 that,
  and I don't know where to investigate.
 
  Regards,
  Bastien
 
 
  Is the script maybe running past the max_execution_time before the zip
  files are completed?
 
 
Thanks,
  Ash
  http://www.ashleysheridan.co.uk
 
 
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 


 Make sure you have ...

 error_reporting(-1); // show ALL errors/warnings/notices/etc.

 and ...

 exec($Command, $Output, $Status); // Capture the output.
 echo "The $Command returned a status of $Status and the following output:",
     PHP_EOL, implode(PHP_EOL, $Output), PHP_EOL;

 sort of thing.

 The error may be in the zip.
 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
Forgot to say: it is the second scenario that generates the corrupted zip and
text files with unexpected end of file.

2010/3/25 Bastien Helders eldroskan...@gmail.com

 So I tested two scenario:

 - First, I gather all the files selected for the patch and then compress
 them together and here is what is displayed:

 [Begin display]
 The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a status
 of 14 and the following output:
 adding: bin/ (stored 0%)
 adding: bin/startHotFixInstaller.bat (deflated 41%)
 adding: bin/startHotFixInstaller.sh (deflated 49%)
 adding: software/ (stored 0%)
 adding: software/hotfixes/ (stored 0%)
 [snip]

 Warning:
 rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip)
 [function.rename]: No such file or directory
 [End display]

 I know why the rename didn't work: the zip command aborted and
 generated no zip file.
 There is no problem with the README text file.

 - Second scenario: I take the previous patch, compare its list of folders
 with the list of selected folders, add the folders missing from the previous
 patch, and remove any folders that weren't selected but were in the previous
 patch.

 In this case, all the commands, whether of the type
 "zip -gr ../../build/Patch-6-3-2_Q3P15.zip
 software/hotfixes/hfFolder/HF-632Q3-152/*" to add a folder or "zip -d
 build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-127\*" to
 delete an unwanted folder, return status 2 and no output.
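A sketch of how those calls could be wrapped so a failure is visible rather than silent (the paths are the ones from the commands above; the Info-ZIP binary is assumed to be on the PATH). For reference, the Info-ZIP manual lists exit status 2 as "unexpected end of zip file" and 14 as an error writing to a file, which fits the symptoms described here.

```php
<?php
// Sketch: run the same Info-ZIP commands, but log status and combined output.
function runZip($command)
{
    exec($command . ' 2>&1', $output, $status); // 2>&1 captures stderr too
    if ($status !== 0) {
        // Info-ZIP: 2 = unexpected end of zip file, 14 = error writing a file.
        error_log("[$command] exited with $status:\n" . implode("\n", $output));
    }
    return $status;
}

// Add one folder to the archive (run from the staging directory):
runZip('zip -gr ../../build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-152/*');

// Remove one folder; quoting the pattern lets zip, not the shell, expand it:
runZip('zip -d build/Patch-6-3-2_Q3P15.zip "software/hotfixes/hfFolder/HF-632Q3-127/*"');
```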


-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
I'm really stumped. It seems that although the script runs under the time
limit, a single instruction, such as exec('zip') in the first case or copy()
in the second, is interrupted because it takes too much time processing the
big file.

Is there any configuration in php.ini (or anywhere else) that I could change
to permit copy() or exec('zip') to run through without being interrupted?
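One thing worth checking: per the PHP manual, on Windows (unlike Unix) the wall-clock time spent inside system calls like exec() and inside stream operations does count toward max_execution_time, so a slow copy() or exec('zip') can hit the limit even though the script's own code barely ran. A minimal sketch, with placeholder paths:

```php
<?php
// Lift the limit for this request only; 0 means no limit. On Windows the
// real time of exec()/copy() counts against max_execution_time, so a big
// copy can trip a limit the script's own code never approaches.
set_time_limit(0);

$previous = 'P:/Path_For_Deposit/Previous_Patch.zip';  // placeholder source
$zipname  = 'build/Patch.zip';                         // placeholder target

if (!copy($previous, $zipname)) {
    $err = error_get_last();
    die('copy failed: ' . ($err ? $err['message'] : 'unknown error'));
}

set_time_limit(300);   // restore a sane limit afterwards
```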

Regards,
Bastien

2010/3/25 Bastien Helders eldroskan...@gmail.com

 Forgot to say: it is the second scenario that generates corrupted zip and
 text files with unexpected ends of file.

-- 
haXe - an open source web programming language
http://haxe.org


[PHP]Zip and text files generated are corrupted

2010-03-24 Thread Bastien Helders
Hi list,

I've got this web app which, from a list of selected folders (with content),
is meant to create a zip containing them, as well as a text file with
information about the chosen folders and how to use them.

To create the zip file I use exec('zip -gr ' . $zipname . ' * > mylog.log');
in the temporary folder where I gathered all the data (using a ZipArchive
object was more time-consuming). I then create the text file using fopen,
many fwrites and a fclose.
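The text-file step above can be sketched with explicit error checks, so a short write or a failed close (a likely cause of a truncated text file) is not silent. The path and content lines here are placeholders, not the app's real ones:

```php
<?php
// Write the README line by line, checking every call that can fail.
$readmePath = sys_get_temp_dir() . '/README.txt';                   // placeholder
$lines = array('Patch contents:', 'HF-632Q3-152 - description of the fix');

$fh = fopen($readmePath, 'wb');
if ($fh === false) {
    die("Cannot open $readmePath for writing");
}
foreach ($lines as $line) {
    if (fwrite($fh, $line . PHP_EOL) === false) {
        die("Short write on $readmePath");
    }
}
fflush($fh);               // push PHP's buffer to the OS before closing
if (!fclose($fh)) {
    die("Could not close $readmePath cleanly");
}
```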

My problem is the following: normally it creates the archive and text file
without any problem, but as soon as the number of selected folders reaches a
high value (say about 150 of them), I've got problems with the
generated files: the zip archive doesn't contain all the folders and there
is an unexpected end of file on both the zip and text files.

My guess is that, as it takes too much time, the script goes on to the next
operation and closes the streams uncleanly. But I can't be sure about that,
and I don't know where to investigate.

Regards,
Bastien


Re: [PHP]Zip and text files generated are corrupted

2010-03-24 Thread Bastien Helders
Hi Ashley,

No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);), and
the execution stops a long time before the time limit is reached.

It might be relevant that the web application is hosted on a Windows
machine.

I asked myself: would setting the memory_limit parameter of the php.ini
file to a higher value help? Currently it is set to 128M. But I don't have
problems creating a zip archive of about 250M (~80 folders); the problem
occurs with archives three times bigger.
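For what it's worth, memory_limit can be raised for a single script without editing php.ini, as in the sketch below (the 512M value is an arbitrary example). Note, though, that exec('zip ...') runs in a separate process and copy() streams data, so neither should normally be bounded by PHP's memory_limit in the first place.

```php
<?php
// Raise the limit for this script only; pick a value that fits the host.
// The zip child process and copy()'s streaming are not governed by this
// limit, so it is unlikely to be the culprit here.
ini_set('memory_limit', '512M');

echo 'memory_limit is now ', ini_get('memory_limit'), PHP_EOL;
echo 'peak usage so far: ', round(memory_get_peak_usage(true) / 1048576), ' MB', PHP_EOL;
```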

Best Regards,
Bastien

2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk

 Is the script maybe running past the max_execution_time before the zip
 files are completed?


   Thanks,
 Ash
 http://www.ashleysheridan.co.uk





-- 
haXe - an open source web programming language
http://haxe.org


[PHP]Executing a .jar from a php script

2008-10-27 Thread Bastien Helders
Hi,

I would like to execute a jar file using exec('java -jar JARNAME option'),
but so far my web application hasn't given me any hint that it did anything
(I tried to echo the result of the function, but got nothing), and in fact I
don't think anything was done.
The jar file is in the same folder as the PHP scripts, so I don't know what
I did wrong.
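A sketch of how this is usually debugged (the java path and jar name below are placeholders): the web server's user often has a different PATH than your interactive shell, and exec() discards stderr unless it is redirected, which makes failures look like "nothing happened".

```php
<?php
// Use absolute paths and capture stderr; $java and $jar are placeholders.
$java = 'C:/Program Files/Java/jre6/bin/java.exe';
$jar  = dirname(__FILE__) . '/MyTool.jar';   // jar next to this script

// Note: on some Windows/PHP combinations the whole command line needs an
// extra surrounding pair of quotes when the program path itself is quoted.
$cmd = '"' . $java . '" -jar "' . $jar . '" option 2>&1';
exec($cmd, $output, $status);

echo "exit status: $status\n";
echo implode("\n", $output), "\n";           // with 2>&1 this includes Java's errors
```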

Best Regards,
Bastien

-- 
haXe - an open source web programming language
http://haxe.org


[PHP]Keep the modification date of a file when archiving it.

2008-10-23 Thread Bastien Helders

Hi,

When I'm archiving files in a ZIP file using the ZipArchive class, the 
modification date is set to the time at which ZipArchive::close() is called. I would 
like to keep the original modification date. Is that even possible?
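For the archive's benefit: PHP's ZipArchive did not expose an entry-timestamp setter until PHP 8.0, which added ZipArchive::setMtimeName() and setMtimeIndex(); on older versions the close() time wins. A sketch for PHP 8.0+, with placeholder file names:

```php
<?php
// Preserve the source file's mtime on its archive entry (requires PHP >= 8.0).
$src = '/path/to/file.txt';   // placeholder source file

$zip = new ZipArchive();
if ($zip->open('archive.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('cannot create archive.zip');
}
$zip->addFile($src, 'file.txt');
$zip->setMtimeName('file.txt', filemtime($src));  // entry keeps the original time
$zip->close();
```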

Best Regards,
Bastien Helders
