Maybe output all the stats into a file or a database and have another 
script do the daily reporting by grabbing data from there. Sounds like you 
really have two separate needs: scraping and reporting.
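
A minimal sketch of that split (file path, field names, and helper names are my own, not part of any Scrapy API): the spider-side half appends one JSON line of stats per run, e.g. from a spider_closed signal handler with whatever crawler.stats.get_stats() returns, and the reporting half sums the numeric fields once a day before emailing them.

```python
import json
import tempfile
from collections import Counter
from pathlib import Path

def dump_stats(stats: dict, path: Path) -> None:
    """Append one crawl's stats as a JSON line.

    In Scrapy this could be called from a spider_closed handler,
    passing in crawler.stats.get_stats() (hypothetical wiring).
    """
    with path.open("a") as f:
        f.write(json.dumps(stats, default=str) + "\n")

def daily_summary(path: Path) -> Counter:
    """Sum every numeric stat across all runs recorded in the file."""
    totals = Counter()
    for line in path.read_text().splitlines():
        for key, value in json.loads(line).items():
            if isinstance(value, (int, float)):
                totals[key] += value
    return totals

# Demo: two 5-minute runs, summarised once at the end of the day.
stats_file = Path(tempfile.gettempdir()) / "crawl_stats.jl"
stats_file.write_text("")  # start fresh for the demo
dump_stats({"item_scraped_count": 10, "downloader/request_count": 42}, stats_file)
dump_stats({"item_scraped_count": 7, "downloader/request_count": 30}, stats_file)
print(daily_summary(stats_file))
```

The daily script can then format that Counter into a single email (via smtplib or similar) and truncate or rotate the file, so the per-run crawls never send mail themselves.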

Cheers

On Monday, August 3, 2015 at 3:09:40 PM UTC+2, Sirbito X wrote:
>
> Hi,
>
> I'm trying to implement a crawler which runs every 5 minutes.
> The problem is the spider sends an email every time it is closed and it is 
> such a disaster for me!
> How should I sum up all the stats relevant to a spider and send an email 
> at the end of the day?
>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.