There are only a few kinds of errors that halt scrapy execution. Runtime spider errors should not prevent scrapy from continuing to run. Can you elaborate on the kind of errors you mean?
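In case it helps, here is a minimal sketch (the spider name, URL and selector below are made up for illustration) of the two hooks usually involved: an errback on the Request for download-level failures, and a try/except in the callback for parsing errors. An unhandled exception in a callback is logged by Scrapy and the crawl keeps making new requests anyway, so the try/except is only needed if you want to handle it yourself:

    import scrapy

    class ExampleSpider(scrapy.Spider):
        # Hypothetical spider, only to illustrate error handling.
        name = "example"

        def start_requests(self):
            # errback is called for download-level failures
            # (DNS errors, timeouts, connection problems, ...)
            yield scrapy.Request("http://example.com/",
                                 callback=self.parse,
                                 errback=self.on_error)

        def parse(self, response):
            try:
                # An unhandled exception here would be logged by Scrapy
                # and the crawl would move on to the next request.
                yield {"title": response.css("title::text").extract()}
            except Exception:
                self.logger.error("parse failed for %s", response.url)

        def on_error(self, failure):
            # failure is a twisted.python.failure.Failure
            self.logger.error("request failed: %s", failure)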
On Mon, Sep 15, 2014 at 10:42 AM, Pedro Henrique <[email protected]> wrote:
> How do I handle exceptions in scrapy so it does not stop execution? Is
> there any way to globally handle exceptions and scrapy keep making new
> requests?
