For a couple of weeks now I've been searching the web for an answer, but I 
never found it. Maybe I'm searching badly, but this is my first time asking 
on a developer site (I'm quite desperate…)

I have to crawl a website, but I need to pass a cookie to bypass the first 
page (which is a kind of login page, where you only choose your location).

I read on the web that you need to do this with a base Spider (not a 
CrawlSpider), but I need to use a CrawlSpider to do my crawling, so what do 
I need to do?

Run a base Spider first, then launch my CrawlSpider? But I don't know 
whether the cookie will be passed between them, or how to do that. How do 
you launch a spider from another spider?

How do I handle the cookie? I tried this:

def start_requests(self):
    yield Request(url='http://www.auchandrive.fr/drive/St-Quentin-985/',
                  cookies={'auchanCook': '"985|"'})

But it's not working.

The answer should be here, but it's really evasive and I don't know what to 
do: https://groups.google.com/forum/#!topic/scrapy-users/pNYSlOn_aWQ

I'm really lost on this, so if anyone knows, please help :)

Thanks!

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/scrapy-users.
For more options, visit https://groups.google.com/d/optout.
