Out of curiosity, are you setting the curl timeout?  If not, it's
worth setting CURLOPT_TIMEOUT explicitly.  Otherwise, the server
configuration may have the TimeOut directive set to 30 seconds.  There
are a few possible issues here, not necessarily just PHP alone.
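A minimal sketch of setting the curl timeouts explicitly (the URL and the timeout values below are placeholders, not from the original post):

```php
<?php
// Hypothetical sketch: set curl's timeouts explicitly instead of relying
// on defaults or the server configuration.
$ch = curl_init('https://example.com/page-to-scrape');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);    // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 120);          // max seconds for the whole transfer
$html = curl_exec($ch);
if ($html === false) {
    error_log('curl error: ' . curl_error($ch));
}
curl_close($ch);
```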

On Jul 23, 11:04 pm, Brad C <[EMAIL PROTECTED]> wrote:
> The default value provided by 1and1 is 50000. I am able to change it
> using ini_set(), and phpinfo() does report the modified values.
> I've changed the value, and it has no effect on the 30-second
> timeout I'm experiencing.
>
> I'll contact 1and1 and hope they can provide some insight into this.
> If I can't adjust this timeout, I'll just modify the program to break
> up the screen scraping into separate invocations.
>
> On Jul 23, 5:45 pm, francky06l <[EMAIL PROTECTED]> wrote:
>
>
>
> > Maybe your host does not allow you to change max_execution_time by
> > ini_set or whatever..
>
> > As a quick test, make a phpinfo() in a simple php file.
> > Make another php file with ini_set('max_execution_time', 3600); phpinfo();
> > and check whether max_execution_time did change ..
>
> > However, try changing to a lower value: if phpinfo() shows the
> > lower value, you can try calling ini_set() in your loop with a lower
> > value ... (since every time ini_set('max_execution_time', ...) is
> > called, it resets the script timer to 0) ... Just a hint, not sure
> > about the result..
> > hth.
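The quick check suggested above could look something like this as a standalone script (a sketch; the 3600 value is just an example):

```php
<?php
// Standalone check: does the host honour ini_set() for max_execution_time?
$before = ini_get('max_execution_time');
ini_set('max_execution_time', 3600);
$after = ini_get('max_execution_time');
echo "before: {$before}, after: {$after}\n";
// If $after is unchanged, the host has locked the setting (e.g. with
// php_admin_value in the Apache config) and ini_set() is silently ignored.
```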
>
> > On Jul 23, 9:54 pm, clemos <[EMAIL PROTECTED]> wrote:
>
> > > Hi Brad
>
> > > On Wed, Jul 23, 2008 at 8:22 PM, Brad C <[EMAIL PROTECTED]> wrote:
>
> > > > I thought about both solutions you mentioned, but was hoping to just
> > > > adjust the timeout. The script wouldn't take much longer than 60
> > > > seconds, unless there are timeouts to the webpages.
>
> > > > None of the responses to my original post have mentioned specifically
> > > > which value I can adjust to increase this timeout. The
> > > > 'max_execution_time' is set to 50000. Are there any other timeout
> > > > values for PHP scripts ?
>
> > > I don't think there are other timeout values (though there are other
> > > limits, like memory usage, etc.).
> > > Did you change "max_execution_time", or is 50000 the default value ?
>
> > > > I can override the PHP settings by using a php.ini file, or by calling
> > > > the ini_set() method in CakePHP before the framework is initialized.
> > > > But, none of those settings seemed to have an effect.
>
> > > > Is it possible that my hosting provider has capped this to 30 seconds
> > > > and won't allow me to override it ?
>
> > > Of course, it's possible; actually, it's like that on every shared
> > > host I've experienced so far.
>
> > > ++++++
> > > Clément
>
> > > > On Jul 23, 10:15 am, BrendonKoz <[EMAIL PROTECTED]> wrote:
> > > >> Others have already stated the PHP setting issue.  Since many
> > > >> hosting providers limit access to that setting (for good reason),
> > > >> chances are it would stay capped even if you tried to raise it to
> > > >> 60 seconds, and on a shared host the script may take longer than
> > > >> you'd expect anyway.  It'd be a better idea to loop the script
> > > >> after either a set number of execution seconds or a set number of
> > > >> records.
>
> > > >> "What do you mean by 'loop'?"
> > > >> Track program execution milliseconds within the script and use
> > > >> a header() redirect (to the same page with some extra query
> > > >> params), or do the same thing while tracking the number of record
> > > >> insertions.  You could save the scraped data to a temporary file, grab
> > > >> the first 150-200 records, process those records, update the file
> > > >> (removing those records), reload the page (using header()) and
> > > >> continue processing until there are no more records in the file - then
> > > >> delete the temporary file.
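The batching approach described above could be sketched roughly as follows. This is a hypothetical outline: `load_records()` and `insert_record()` are stand-ins for the real scrape/insert code, and the batch size is an arbitrary example.

```php
<?php
// Handle a fixed number of records per request, then redirect to the same
// script with an offset so each request stays under the 30-second limit.
const BATCH_SIZE = 200;

function load_records(): array {
    return range(1, 1100);       // stand-in for reading the temporary file
}

function insert_record($record): void {
    // stand-in for the real MySQL insert
}

// Process one batch; return the next offset, or null when finished.
function process_batch(array $records, int $offset, int $size): ?int {
    foreach (array_slice($records, $offset, $size) as $record) {
        insert_record($record);
    }
    $next = $offset + $size;
    return $next < count($records) ? $next : null;
}

$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$next   = process_batch(load_records(), $offset, BATCH_SIZE);

if ($next !== null) {
    // More rows remain: a fresh request restarts the execution-time clock.
    header('Location: ' . basename(__FILE__) . '?offset=' . $next);
    if (PHP_SAPI !== 'cli') {
        exit;
    }
}
// Otherwise all records are processed; delete the temporary file here.
```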
>
> > > >> There are plenty of ways to get around this problem that are probably
> > > >> more efficient for the server than increasing the PHP execution time
> > > >> setting.
>
> > > >> On Jul 22, 3:54 pm, Brad C <[EMAIL PROTECTED]> wrote:
>
> > > >> > I have an app that screen scrapes a website and then inserts the data
> > > >> > into a mysql database. For the main request that loads the database,
> > > >> > there will be about 1100 inserts and another 1100 saves to existing
> > > >> > records.
>
> > > >> > I can run this load without problem when using xampp on my computer
> > > >> > (it takes over 60 seconds). When I run the app at1and1.com, it always
> > > >> > times out after 30 seconds. I have timed it with a watch, and it is
> > > >> > always 30 seconds.
>
> > > >> > The cake controller that performs the load echoes out data after each
> > > >> > record is inserted. It inserts anywhere between 236 and 245 records on
> > > >> > each attempt, so it isn't a particular record causing the problem.
>
> > > >> > I have turned up the cakephp debug setting to 3, but no SQL is output
> > > >> > as the script terminates before cake can output any of the framework
> > > >> > debug. I have set the PHP settings for 'display_errors', 'log_errors',
> > > >> > and 'error_log', but no php error message is ever shown or logged. The
> > > >> > 'max_execution_time' variable is set to 50000. I have verified these
> > > >> > settings by having my cake app display the phpinfo() data.
>
> > > >> > The cake log in /app/tmp/logs/error.log only shows this message
>
> > > >> > 2008-07-22 15:30:55 Error: next=229
>
> > > >> > That message is displayed regardless of how many records are
> > > >> > successfully inserted into the database.
> > > >> > There is no other error output anywhere.
>
> > > >> > Any ideas on this issue or suggestions on other ways to troubleshoot
> > > >> > this ?
