Hugo,

When you specify a log file on the command line, the messages are written
to that file instead of to standard out (what you normally see in the terminal).

One way around this is to open another terminal session and tail the file
in real time.  So if the log file you specify with the -s option is
/var/log/scrapy/logfile1, you can run scrapy with that option and, in
your other terminal window, type:

$ tail -f /var/log/scrapy/logfile1

That will display the file to your terminal as it's written to disk.
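
For example, assuming your spider is called MY_SPIDER (as in your command
below) and the log file path above, the full two-terminal workflow would be
something like this.  In the first terminal, start the crawl with the log
file setting:

$ scrapy crawl MY_SPIDER -s LOG_FILE=/var/log/scrapy/logfile1

Then, in the second terminal, follow that file:

$ tail -f /var/log/scrapy/logfile1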

On Wed, Oct 22, 2014 at 11:12 PM, Hugo Maugey <[email protected]> wrote:

> Hi,
>
> I'm wondering if it's possible to display logs in terminal as well as
> saving them into a file.
>
> To save it to a file I do :
> scrapy crawl MY_SPIDER -s LOG_FILE=scrapy.log
>
> But then I don't have the shell display any more ... and when the logs are
> very long I can't see all of them in the terminal, hence my question!
>
> Thanks
>
