Hi, 

I am trying to stagger two Scrapy spiders so they run at different times in 
Python, but I am running into some issues. 

My logic is as follows: 

crawler = Crawler(settings)

crawler.signals.connect(reactor.stop, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(spider1)
crawler.start()
log.start(logstdout=False)
reactor.run()
...
Do some work after the first spider finishes crawling
...

crawler = Crawler(settings)
crawler.signals.connect(reactor.stop, signal=signals.spider_closed)
crawler.configure()
crawler.crawl(spider2)
crawler.start()
log.start(logstdout=False)
reactor.run() 

The problem is that once the reactor is stopped, it cannot be restarted. Is 
there any way I can stagger my spiders?
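One workaround I am considering is to run each spider in its own child process, so every run gets a fresh Twisted reactor. Below is a minimal self-contained sketch of that idea; the inline child script only stands in for a real `scrapy crawl <name>` invocation, and the spider names are placeholders:

```python
import subprocess
import sys

def run_spider(name):
    """Run one spider in a child process so it gets its own fresh reactor.

    In a real project this would be something like:
        subprocess.call(["scrapy", "crawl", name])
    The inline child script below only stands in for that command so the
    sketch can run on its own.
    """
    child = "import sys; print('finished crawling', sys.argv[1])"
    return subprocess.call([sys.executable, "-c", child, name])

rc1 = run_spider("spider1")   # first crawl, in its own process
# ... do some work after the first spider finishes ...
rc2 = run_spider("spider2")   # second crawl: fresh process, fresh reactor
print(rc1, rc2)
```

I have also seen chaining the crawls inside a single reactor run suggested (e.g. with `CrawlerRunner` and Twisted deferreds in newer Scrapy versions), but I am not sure which approach is preferred here.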

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
