Hi. Thanks for the answer.

Basically this means I would need a small service listening for that 
callback, but I don't have one in this case.

Second, how can I easily hook that callback into the scrapyd core? Does it 
already offer something like this?


On Tuesday, December 17, 2013 3:42:29 PM UTC+2, Sidnei Pereira wrote:
>
> You can build something like a URL callback.
>
> When you request scrapyd to start crawling, pass a callback URL that your 
> spider will use to notify your system about changes in the spider's 
> state.
>
> On Monday, December 16, 2013 3:23:24 AM UTC-2, Gheorghe Chirica wrote:
>>
>> Hi.
>>
>> I want to hook some custom logic into scrapyd's spider state events 
>> (Running, Pending, Finished). I know there is an API method for that, but 
>> I don't want to call the API every time I need to find the spider state.
>>
>> I need the DB to be updated automatically when a spider running via 
>> scrapyd changes its status. 
>>
>> I was wondering how I can achieve this with the least overhead. From the 
>> code it seems to me that I need to create a custom launcher and poller to 
>> handle these states. 
>>
>> Is this a good approach?
>>
>> Another idea is to not touch the scrapyd code at all and work with scrapy 
>> spider signals:
>>     - spider_opened - running
>>     - spider_closed - finished
>>     - ?? - pending (we can't use spider_idle for pending here)
>>
>>
>>
>> Thx.
>>
>
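The signals idea from the original post could look something like the sketch below: a Scrapy extension that writes spider state straight to a DB, with no changes to scrapyd. Note that "pending" can't be observed from inside the spider process, since the process doesn't exist yet; a pending row would have to be written by whatever schedules the job. The sqlite store and table layout here are my assumptions, not anything scrapyd provides:

```python
# Sketch: a Scrapy extension that records spider state in a DB.
# Assumption: a sqlite table (spider TEXT, state TEXT) stands in for your real DB.
import sqlite3

class SpiderStateStore:
    """Tiny persistence layer; swap sqlite3 for your real database."""
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS spider_state (spider TEXT PRIMARY KEY, state TEXT)"
        )

    def set_state(self, spider, state):
        self.conn.execute(
            "INSERT OR REPLACE INTO spider_state VALUES (?, ?)", (spider, state)
        )
        self.conn.commit()

    def get_state(self, spider):
        row = self.conn.execute(
            "SELECT state FROM spider_state WHERE spider = ?", (spider,)
        ).fetchone()
        return row[0] if row else None

class SpiderStateExtension:
    """Enable via EXTENSIONS = {"myproject.extensions.SpiderStateExtension": 0}."""
    def __init__(self, store):
        self.store = store

    @classmethod
    def from_crawler(cls, crawler):
        # Deferred import so the store above is usable without Scrapy installed.
        from scrapy import signals
        ext = cls(SpiderStateStore())
        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        return ext

    def spider_opened(self, spider):
        self.store.set_state(spider.name, "running")

    def spider_closed(self, spider):
        self.store.set_state(spider.name, "finished")
```

Since the extension runs in every spider process scrapyd launches, this avoids both polling the API and patching the scrapyd launcher/poller.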
