Steven D'Aprano <steve+pyt...@pearwood.info> added the comment:

Sorry, this is for reporting bugs in the Python interpreter and standard 
library, not your own code. If you have a bug in your pyinator project, you 
should report it to yourself, not us :-)

If you think my analysis of the problem below is wrong, and that you have truly 
found a bug in the interpreter, please read this:

http://www.sscce.org/

and give us the smallest example of this bug you can; don't just link to your 
library and expect us to find the problem ourselves.

My analysis, based on a *brief* look at your project, is that every time you 
call GetPriceCheck, you increase the recursion limit by an extra 1000, and then 
keep making more and more recursive calls until you run out of memory and 
segfault. That's pretty much unavoidable.
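That failure mode can be sketched like this (the function name, arguments and 
structure here are hypothetical, invented purely for illustration, and the 
depth cap exists only so the sketch terminates instead of crashing):

```python
import sys

def get_price_check(url, depth=0, max_depth=3):
    # The anti-pattern: every call bumps the limit by another 1000,
    # so the safety net can never trigger and the C stack just grows
    # until the process runs out of memory and segfaults.
    sys.setrecursionlimit(sys.getrecursionlimit() + 1000)
    if depth >= max_depth:  # cap added only so this sketch terminates
        return sys.getrecursionlimit()
    return get_price_check(url, depth + 1, max_depth)

start = sys.getrecursionlimit()
end = get_price_check("https://example.com")
print(end - start)  # each level of recursion raised the limit by 1000
```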

To prevent that, you can:

- add more memory; 

- make fewer recursive calls;

- fix your code to use a better technique for scraping websites.

(It is rude to keep hitting a website over and over and over again, without any 
limit. Websites have limited bandwidth, which they pay for, and every time you 
hit the website, that makes it harder for somebody else. At the very least, you 
should back off exponentially, waiting longer between each attempt: 1 second, 2 
seconds, 4 seconds, 8 seconds, 16 seconds...)
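An exponential backoff wrapper is only a few lines; this is a minimal sketch 
(the function name, the retry count and the choice of which exception to 
catch are all assumptions, not anything from the project):

```python
import time

def fetch_with_backoff(fetch, max_attempts=5, base_delay=1.0):
    """Call fetch(), retrying on network errors with exponentially
    growing waits: 1s, 2s, 4s, 8s, ... between attempts."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except OSError:
            if attempt == max_attempts - 1:
                raise  # out of attempts, give up and re-raise
            time.sleep(base_delay * 2 ** attempt)
```

Note this is a plain loop, not recursion, so it cannot blow the stack no 
matter how many times the request fails.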

The recursion limit is designed to prevent segfaults by giving you a nice 
Python-level exception instead of a hard, OS-level segmentation fault. But if 
you set the recursion limit too high, you bypass that protection and you are 
responsible for not crashing the stack.
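For instance, with the default limit left alone, runaway recursion is caught 
safely as a catchable Python exception:

```python
import sys

def recurse(n):
    # Unbounded recursion: without the limit this would eventually
    # overflow the C stack and segfault the interpreter.
    return recurse(n + 1)

try:
    recurse(0)
except RecursionError:
    # The interpreter stopped us at the limit, long before the
    # C stack overflowed -- a clean exception, not a crash.
    print("stopped safely at limit", sys.getrecursionlimit())
```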

----------
nosy: +steven.daprano
resolution:  -> not a bug
stage:  -> resolved
status: open -> closed

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue34039>
_______________________________________