Re: [Tutor] : breaking out of a function that takes too long
On Wed, Sep 16, 2009 at 6:07 AM, C or L Smith wrote:
>>> Serdar wrote:
>>> ...
>>> So again, is there a way to place a time limit on the execution of a
>>> function, after which you can break out of it and then retry it or
>>> move along with the rest of your program?
>
> At http://tinyurl.com/rbre9n you can find a recipe that tells you how to
> decorate a function so it will return after a given amount of time.
> According to the comments, this will leave the process still running, but
> at least the whole program won't hang. I've used it successfully from
> time to time.

All that recipe really does is tell you that the function has taken too
long. It doesn't stop it.

Kent

___
Tutor maillist - Tutor@python.org
To unsubscribe or change subscription options:
http://mail.python.org/mailman/listinfo/tutor
[Tutor] : breaking out of a function that takes too long
>> Serdar wrote:
>> ...
>> So again, is there a way to place a time limit on the execution of a
>> function, after which you can break out of it and then retry it or
>> move along with the rest of your program?

At http://tinyurl.com/rbre9n you can find a recipe that tells you how to
decorate a function so it will return after a given amount of time.
According to the comments, this will leave the process still running, but
at least the whole program won't hang. I've used it successfully from time
to time.

/c
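The recipe itself sits behind a shortened URL, so here is only a rough sketch of how a thread-based timeout decorator of this kind can look; the `timelimit` name and `default` parameter are illustrative, not necessarily what the recipe uses. It matches the behavior described in the thread: the caller gets a value back after the time limit, but the worker itself is not stopped.

```python
import functools
import threading

def timelimit(seconds, default=None):
    """Return `default` if the wrapped function does not finish in time.

    Note: the worker thread is NOT stopped on timeout; it keeps running
    to completion in the background (daemon=True merely lets the
    interpreter exit without waiting for it).
    """
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = [default]
            def worker():
                result[0] = func(*args, **kwargs)
            t = threading.Thread(target=worker, daemon=True)
            t.start()
            t.join(seconds)   # block for at most `seconds`
            return result[0]  # still `default` if the worker is late
        return wrapper
    return decorate
```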
Re: [Tutor] breaking out of a function that takes too long
Thanks to all for the responses. I've never used threading, but from some
initial googling it appears that it can indeed be tricky, and there seem to
be numerous recommendations against killing threads or processes. I'll
explore the socket request libraries mentioned by several of you to see if
that does the trick.

Many thanks!
Re: [Tutor] breaking out of a function that takes too long
Most socket request libraries have timeouts on connection attempts; have
you looked into that already?

On Tue, Sep 15, 2009 at 4:02 PM, Serdar Tumgoren wrote:
> Hey everyone,
> Is there a way to break out of a function if it exceeds a certain time
> limit for execution?
>
> There's a Website I'm scraping on a regular basis, and for some reason
> that I can't divine the site has radically varying response times. In
> some cases, I get a result very quickly; at other times, my entire
> program hangs because of this one call to an external source. Even more
> strange, if I quit the scrape when it appears to hang and then try it
> again a few moments later, it will sometimes work just fine ("sometimes"
> being the key word). I'm not certain, but I'm guessing this inconsistent
> behavior has something to do with the site that I'm scraping.
>
> So again, is there a way to place a time limit on the execution of a
> function, after which you can break out of it and then retry it or
> move along with the rest of your program?
>
> TIA,
> Serdar
Re: [Tutor] breaking out of a function that takes too long
On Tue, Sep 15, 2009 at 5:02 PM, Serdar Tumgoren wrote:
> Hey everyone,
> Is there a way to break out of a function if it exceeds a certain time
> limit for execution?

In general this is tricky. It usually involves setting up another thread
to run or monitor the function, then somehow terminating the thread if it
runs too long.

In your case, it sounds like a better solution is to set a socket timeout.
At the beginning of your program, before attempting any access to the
website, put

  import socket
  socket.setdefaulttimeout(10)  # argument is the timeout in seconds

This will cause the web site fetch to fail by raising socket.timeout if
the fetch blocks for longer than the timeout.

Kent
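Kent's suggestion, sketched end to end: set the process-wide default timeout, then wrap the fetch in a small retry loop so a slow response can be retried or skipped. The `fetch_with_retry` helper, the 10-second value, and the `opener` parameter are all illustrative, not from the thread; `opener` exists only so the retry logic can be exercised without hitting the network.

```python
import socket
import urllib.request

socket.setdefaulttimeout(10)  # seconds; applies to every new socket

def fetch_with_retry(url, retries=3, opener=urllib.request.urlopen):
    """Fetch `url`, retrying on socket.timeout.

    Returns the page body as bytes, or None if every attempt timed out.
    """
    for attempt in range(retries):
        try:
            with opener(url) as resp:
                return resp.read()
        except socket.timeout:
            continue  # slow response this time; try again
    return None
```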
Re: [Tutor] breaking out of a function that takes too long
"Serdar Tumgoren" wrote:
> Is there a way to break out of a function if it exceeds a certain time
> limit for execution?

Yes. You can put the call in a thread, then in the main thread start a
timer. If the timer expires before the thread returns, kill the thread.

> There's a Website I'm scraping on a regular basis, and for some reason
> that I can't divine the site has radically varying response times.

Could be that a lot of people have written scrapers and they all schedule
them to run at regular intervals... If the clocks all suddenly align you
will get a massive demand on the server (it's effectively like a denial of
service attack on the server!). Or it could even be real users, if it
happens when some significant change occurs; stock/share price web sites,
for example, often get big peaks at market open and close times.

> try it again a few moments later, it will sometimes work just fine
> ("sometimes" being the key word). I'm not certain, but I'm guessing
> this inconsistent behavior has something to do with the site that I'm
> scraping.

It could be anywhere in your network connection, including your ISP. If
it's a local operator there may be peak periods when people go online
based on local patterns of usage, especially if it's not a big company;
they might only have a couple of 2M pipes, say, to the core network.

HTH,

--
Alan Gauld
Author of the Learn to Program web site
http://www.alan-g.me.uk/
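The thread-plus-timer idea can be sketched with `Thread.join(timeout)`, where the timeout argument plays the role of the timer. The `scrape` function and the timeout values here are hypothetical stand-ins. One caveat: CPython provides no safe way to actually kill the thread when the timer expires, so in practice you abandon it and rely on `daemon=True` to keep it from blocking interpreter exit.

```python
import threading
import time

results = []

def scrape(delay):
    """Hypothetical stand-in for the real page-scraping call."""
    time.sleep(delay)           # simulate a slow remote site
    results.append("page contents")

# Run the call in its own thread, with join() acting as the timer:
worker = threading.Thread(target=scrape, args=(0.1,), daemon=True)
worker.start()
worker.join(timeout=2.0)        # wait at most 2 seconds
finished = not worker.is_alive()
# If `finished` is False the timer expired first; the thread cannot be
# killed, so the main program simply moves on (or retries) while the
# abandoned worker runs to its natural end in the background.
```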
[Tutor] breaking out of a function that takes too long
Hey everyone,
Is there a way to break out of a function if it exceeds a certain time
limit for execution?

There's a Website I'm scraping on a regular basis, and for some reason
that I can't divine the site has radically varying response times. In some
cases, I get a result very quickly; at other times, my entire program
hangs because of this one call to an external source. Even more strange,
if I quit the scrape when it appears to hang and then try it again a few
moments later, it will sometimes work just fine ("sometimes" being the key
word). I'm not certain, but I'm guessing this inconsistent behavior has
something to do with the site that I'm scraping.

So again, is there a way to place a time limit on the execution of a
function, after which you can break out of it and then retry it or move
along with the rest of your program?

TIA,
Serdar