I've always wondered how to access spider settings in spider.py, pipelines.py, 
extensions, and middlewares. I've read the docs on how to access settings 
<http://doc.scrapy.org/en/latest/topics/settings.html#how-to-access-settings>, 
but there's no description of how to achieve the same in pipelines.py and 
spiders.py, so here's what I've done:

# spiders.py
from myproject import settings
# or I can do this
from myproject.settings import SPECIFIC
# deprecated
from scrapy.conf import settings

#------------------#

# pipelines.py
from myproject import settings


The problem is: if I import settings.py as a plain module, there's no 
consistency, and if I import settings via `from scrapy.conf import settings`, I 
get a deprecation warning. So how exactly can I import my crawler 
settings as a scrapy.settings.Settings object from inside my pipelines and 
spiders?
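For reference, what I'd *expect* to work in a pipeline, based on the 
`from_crawler` pattern I've seen mentioned for extensions (so this is a guess 
on my part, not something I've confirmed works), is something like this:

```python
# pipelines.py -- the pattern I'm hoping exists (unconfirmed sketch)
class MyPipeline(object):
    def __init__(self, settings):
        # keep a reference to what should be a scrapy.settings.Settings object
        self.settings = settings

    @classmethod
    def from_crawler(cls, crawler):
        # assumption: crawler.settings is the fully populated Settings instance
        return cls(crawler.settings)
```

And in a spider I'd expect something like `self.settings` to be the analogue, 
but I can't find either of these spelled out in the docs.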

P.S. I've read this Stack Overflow question 
<http://stackoverflow.com/questions/14075941/how-to-access-scrapy-settings-from-item-pipeline>
which is more or less the same, but none of the answers look right; you'll see 
once you get there.

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/scrapy-users.
For more options, visit https://groups.google.com/d/optout.
