I'm running Scrapy as a cron job, so all output sent to stdout gets emailed 
to me at the end of the day; that email is currently dozens of MB. Most of 
the log lines are INFO messages that I'm trying to suppress, but I still 
want WARNING, ERROR and CRITICAL to be printed to stdout so that those 
get emailed to me. 

I know about the logging settings, and am currently using:

```
LOG_LEVEL = 'WARNING'
LOG_FILE = '/path/to/scrapy.log'
LOG_STDOUT = False
```

in my `settings.py`. These settings seem to be doing the right thing for 
the log *file* (only the expected messages end up there), but I'm still 
seeing everything, including INFO, printed to stdout. I've also tried 
running the scraper with the `-L WARNING` flag 
(`scrapy crawl <spider> -L WARNING`), but I'm still seeing INFO messages 
on stdout.
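
If it helps narrow things down, I could drop something like this into the 
spider to see which handlers end up attached to the root logger and at what 
levels (purely a diagnostic sketch, nothing Scrapy-specific):

```
import logging

def dump_root_handlers():
    # Print every handler on the root logger along with its level,
    # to see which one is letting INFO through to stdout.
    root = logging.getLogger()
    for handler in root.handlers:
        print("%r -> level %s" % (handler, logging.getLevelName(handler.level)))
```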

Is there a setting I'm missing that controls what gets sent to stdout? I 
don't want to redirect it to /dev/null, since I still want WARNING and up 
to reach stdout, but I don't see a way to do this.
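
If there isn't a setting for this, my fallback would be to stop Scrapy from 
installing its root log handler and attach my own at WARNING level, roughly 
like the sketch below. This is based on the `configure_logging` helper 
described in the Scrapy docs; I'm not sure it exists in my version, or where 
the right place to call it is when launching via `scrapy crawl`, so treat it 
as an assumption:

```
import logging
import sys

from scrapy.utils.log import configure_logging

# Ask Scrapy not to install its own root handler (assumes a Scrapy
# version whose configure_logging accepts install_root_handler)...
configure_logging(install_root_handler=False)

# ...then attach a stdout handler that only passes WARNING and above.
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.WARNING)
logging.getLogger().addHandler(handler)
```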
