On this line 
<https://gist.github.com/gbirke/abc10c81aca8242b880a#file-multifeedexporter-py-L62>, 
you could use the stats collector instead of counting the items yourself; the 
counts will also show up in the stats dict that gets printed at the end of the 
crawl.
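
For example, a minimal sketch of what that could look like (the class skeleton 
and the stat key 'multifeedexport/...' are illustrative, not the actual code 
from the gist):

    from scrapy import signals

    class MultiFeedExporter(object):
        """Sketch only: let the stats collector do the counting."""

        def __init__(self, crawler):
            self.crawler = crawler
            crawler.signals.connect(self.item_scraped, signals.item_scraped)

        @classmethod
        def from_crawler(cls, crawler):
            return cls(crawler)

        def item_scraped(self, item, spider):
            # No manual counter needed; this stat appears automatically
            # in the stats dump when the crawl finishes.
            self.crawler.stats.inc_value(
                'multifeedexport/%s' % type(item).__name__)
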
This 
<https://gist.github.com/gbirke/abc10c81aca8242b880a#file-multifeedexporter-py-L50>
deferred stuff is nice, but it isn't really needed unless you are doing async 
writes/uploads to the storage.
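
To illustrate (a sketch with made-up names, not the gist's code): a handler 
only needs to return a Deferred when it does asynchronous work; otherwise it 
can simply return nothing:

    from twisted.internet import defer

    class FeedExtension(object):
        """Sketch only: self.exporter, self.file and self._upload_feed
        stand in for whatever the real exporter uses."""

        def spider_closed(self, spider):
            # Synchronous storage (a local file): just finish and close,
            # no Deferred required.
            self.exporter.finish_exporting()
            self.file.close()

        def spider_closed_uploading(self, spider):
            # Asynchronous storage (e.g. an upload): return a Deferred
            # so the crawler waits for it before shutting down.
            return defer.maybeDeferred(self._upload_feed, spider)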

I really recommend opening a PR; the exporter is good overall, and you will get 
a lot more feedback about the code on GitHub.

On Thursday, September 4, 2014 at 07:39:14 UTC-3, Gabriel Birke wrote:
>
> Is this answer related to my question or did you mis-post?
>
> At which point in the MultiFeedExporter code should I use the stats collector?
>
> On Wednesday, September 3, 2014 at 19:44:17 UTC+2, Nicolás Alejandro 
> Ramírez Quiros wrote:
>>
>> May I suggest using the StatsCollector 
>> <http://doc.scrapy.org/en/latest/topics/api.html#scrapy.statscol.StatsCollector.inc_value>?
>> It is accessible as spider.crawler.stats, and opening a PR against Scrapy 
>> would be nice too.
>>
>> On Wednesday, August 6, 2014 at 05:31:51 UTC-3, Gabriel Birke wrote:
>>>
>>> I have a spider that returns different item types (books, authors and 
>>> publishers), and I would like to export each item type to its own file 
>>> while staying flexible about the format (CSV or JSON). My first approach 
>>> would be a pipeline class, but then I lose the easy command-line 
>>> configuration of the feed exporters. Here are my questions:
>>>
>>> 1) Is this the right approach or should I implement this differently?
>>> 2) How can I reuse the exporters in a pipeline?
>>> 3) How can I pass command line arguments to my pipeline class?
>>>
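
For reference, a minimal sketch of the pipeline approach from the quoted 
question, reusing the built-in exporters with one file per item class (all 
names are illustrative, and the import path is scrapy.exporters in newer 
Scrapy versions):

    from scrapy.contrib.exporter import CsvItemExporter

    class PerTypeExportPipeline(object):
        """Sketch only: route each item class to its own CSV exporter."""

        def open_spider(self, spider):
            self.files = {}
            self.exporters = {}

        def process_item(self, item, spider):
            key = type(item).__name__
            if key not in self.exporters:
                # Lazily open one file and one exporter per item class.
                f = open('%s.csv' % key.lower(), 'wb')
                exporter = CsvItemExporter(f)
                exporter.start_exporting()
                self.files[key] = f
                self.exporters[key] = exporter
            self.exporters[key].export_item(item)
            return item

        def close_spider(self, spider):
            for key, exporter in self.exporters.items():
                exporter.finish_exporting()
                self.files[key].close()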
>>
