You'd have to be more specific about your "crawler manager" — there are 
several channels of communication you could use. You should also check out 
scrapyd, which is our "crawler manager".
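
For the extension route mentioned below: Scrapy's built-in `spider_closed` 
signal fires when a crawl ends, and an extension can hook it to ping an 
external service. Here is a minimal sketch, assuming a hypothetical 
`NOTIFY_URL` setting and a plain HTTP GET as the notification channel 
(both are illustrative, not part of Scrapy itself):

```python
# A minimal sketch of a Scrapy extension that notifies an external manager
# when a crawl finishes. NOTIFY_URL and the HTTP GET are illustrative
# assumptions -- use whatever channel your manager actually listens on.
import urllib.parse
import urllib.request


class CrawlDoneNotifier:
    def __init__(self, notify_url):
        # Hypothetical endpoint on your crawler manager.
        self.notify_url = notify_url

    @classmethod
    def from_crawler(cls, crawler):
        # Scrapy imported lazily so the class itself has no hard dependency.
        from scrapy import signals

        ext = cls(crawler.settings.get("NOTIFY_URL"))
        # spider_closed fires once per spider when the crawl ends.
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        return ext

    def spider_closed(self, spider, reason):
        # `reason` is e.g. "finished", "shutdown", or "cancelled".
        params = urllib.parse.urlencode({"spider": spider.name, "reason": reason})
        urllib.request.urlopen(f"{self.notify_url}?{params}")
```

You would enable it through the `EXTENSIONS` setting in `settings.py`. If 
you go the scrapyd route instead, your manager can poll scrapyd's 
`listjobs.json` endpoint and shut the instance down once no jobs are 
running.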

On Wednesday, August 20, 2014 12:39:19 UTC-3, [email protected] 
wrote:
>
> I want to be able to notify my crawler manager (some software I'm building) 
> that the crawl has completed, so it can shut down the AWS instance it is 
> running on. I notice you can do this with an extension, but it seems a 
> little hacky. Is that the best way to do this currently? If so, should I 
> look into building something into the spider to do this and sending a pull 
> request?
>
> Lewis
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.