Is there a way to pass the cookies to the spider_closed signal? I need to 
make a request to the logout URL of a web page when the spider finishes 
crawling, and for the logout to work I have to send a specific 
PHPSESSID. I even managed to pass the cookie in an "ugly" way with 
Request(url=url, cookies={'PHPSESSID': '86nuu91s4aenh48il4najrpea5'}), 
but it does not work.

I have COOKIES_DEBUG = True, I can see the cookie being passed after the 
login, and I CAN scrape the secure pages. The problem is that I cannot 
log out, so the session gets stuck in the site's database and I have to 
wait an hour before I can log in again. That is why I need to log out 
properly after I scrape.

Just as info, I'm ABLE to do:

curl --data "login=login&pass=pass" --cookie-jar ./somefile http://www.domain.com/login/

and then:

curl --cookie ./somefile http://www.domain.com/logout/
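
What I am essentially after is reproducing that curl sequence from inside
Scrapy. One alternative I have been wondering about is to leave
spider_closed alone and instead schedule the logout as the very last
request from the spider_idle signal, so that the cookies middleware
resends the same PHPSESSID it shows under COOKIES_DEBUG and nothing has
to be passed around by hand. Again only a sketch with made-up names; only
the logout URL is the real one:

import scrapy
from scrapy import signals
from scrapy.exceptions import DontCloseSpider


class LogoutOnIdleSpider(scrapy.Spider):
    name = 'logout_on_idle'   # made-up name
    logged_out = False

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        spider = super().from_crawler(crawler, *args, **kwargs)
        crawler.signals.connect(spider.spider_idle,
                                signal=signals.spider_idle)
        return spider

    def spider_idle(self, spider):
        # Called when no requests are left; inject one final logout request.
        if not self.logged_out:
            self.logged_out = True
            self.crawler.engine.crawl(
                scrapy.Request('http://www.domain.com/logout/',
                               callback=self.after_logout),
                spider,
            )
            # Keep the engine alive long enough for that last request.
            raise DontCloseSpider

    def after_logout(self, response):
        self.logger.info('Logged out, session released')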

Thanks!
