Although it looks like it's designed to let you schedule spiders from multiple
versions, the reason for the parameter is quite simple: when you upload a new
version, there may still be spiders from an older version running, so Scrapyd
keeps the older eggs around. When you schedule a crawl, Scrapyd always uses
the latest version of the project.
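
You can check which version Scrapyd will use with listversions.json (the
project name here is just the one from your example). The versions come back
in order, and the last one in the list is the one schedule.json will run:

$ curl http://localhost:6800/listversions.json?project=my_spiders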

Do you need to be able to schedule a specific version?
It should be possible to build such a feature.
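
In the meantime, one possible workaround (just a sketch, and the
my_spiders_v1/my_spiders_v2 project names are hypothetical) is to upload each
version as its own Scrapyd project, so schedule.json can target a specific one:

# hypothetical: one Scrapyd project per version
$ curl http://localhost:6800/addversion.json -F project=my_spiders_v1 -F version=1.0.0 -F egg=@my_spiders-1.0.0-py2.7.egg
$ curl http://localhost:6800/addversion.json -F project=my_spiders_v2 -F version=2.0.0 -F egg=@my_spiders-2.0.0-py2.7.egg

# schedule the spider from a specific version by picking the project
$ curl http://localhost:6800/schedule.json -d project=my_spiders_v1 -d spider=MySpider

The trade-off is that you manage one project per version instead of one
project with many versions.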


On Sunday, 24 January 2016 21:52:03 UTC+2, Sergey Zhemzhitsky wrote:
>
> Hi there,
>
> I cannot figure out the purpose of the "version" parameter in the 
> "addversion.json" call to scrapyd.
> Although it is possible to add and delete versions with "addversion.json" 
> and "delversion.json", it is not possible to specify a version when 
> scheduling a crawl with "schedule.json", which seems like it would be 
> pretty natural.
>
> If, let's say, I create two versions "1.0.0" and "2.0.0", both of which 
> contain a spider with the same name "MySpider", and then schedule this 
> spider, which egg's spider is going to be scheduled?
>
> $ curl http://localhost:6800/addversion.json -F project=my_spiders -F 
> version=1.0.0 -F egg=@my_spiders-1.0.0-py2.7.egg
> $ curl http://localhost:6800/addversion.json -F project=my_spiders -F 
> version=2.0.0 -F egg=@my_spiders-2.0.0-py2.7.egg
>
> $ curl http://localhost:6800/schedule.json -d project=my_spiders -d 
> spider=MySpider
>
>
> Maybe I'm missing something? Could you please help? 
>
> Kind regards,
> Sergey
>
