I have the same issue here. Does anyone know how to do this? Rosa, if you managed to solve this problem, could you share the answer? :) Thanks
On Saturday, May 19, 2012 13:03:50 UTC+2, Mahmoud Abdel-Fattah wrote:
>
> Hi Rosa, did you manage to solve this problem?
>
> On Friday, January 28, 2011 7:07:43 PM UTC+2, Rosa Luna wrote:
>>
>> Hi Pablo,
>>
>> Thanks for your answer. Sorry if it's a dumb question, but I'm a novice
>> at this...
>>
>> Do I need to add this in my spider file? Whereabouts?
>>
>> Here is my spider (I use the generic one):
>>
>> import re
>>
>> from scrapy.selector import HtmlXPathSelector
>> from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor
>> from scrapy.contrib.spiders import CrawlSpider, Rule
>> from anzse.items import AnzseItem
>>
>> class ExampleSpider(CrawlSpider):
>>     name = 'example'
>>     allowed_domains = ['example.com']
>>     start_urls = ['http://www.example.com/']
>>
>>     rules = (
>>         Rule(SgmlLinkExtractor(allow=r'Items/'), callback='parse_item',
>>              follow=True),
>>     )
>>
>>     def parse_item(self, response):
>>         hxs = HtmlXPathSelector(response)
>>         i = AnzseItem()
>>         #i['domain_id'] = hxs.select('//input[@id="sid"]/@value').extract()
>>         #i['name'] = hxs.select('//div[@id="name"]').extract()
>>         #i['description'] = hxs.select('//div[@id="description"]').extract()
>>         return i
>>
>> Pablo Hoffman wrote:
>> > Hi Rosa,
>> >
>> > If you want to set cookies from your spider you can set the Request.cookies
>> > attribute on the request objects you're returning.
>> >
>> > For example:
>> >
>> >     request.cookies['code_pays'] = '2'
>> >     request.cookies['code_region'] = '0'
>> >
>> > On Fri, Jan 28, 2011 at 03:46:14PM +0100, Rosa (Anuncios) wrote:
>> >
>> >> Hi,
>> >>
>> >> I'm trying to scrape some data from a site that uses a cookie for the
>> >> language of the site.
>> >>
>> >> How do I pass this cookie value for the language to my spider:
>> >>
>> >>     Cookie: code_pays=2; code_region=0;
>> >>
>> >> In the spider?
>> >>
>> >> I don't know where to set up the CookiesMiddleware shown here:
>> >> http://doc.scrapy.org/topics/downloader-middleware.html
>> >>
>> >> Thanks for your help to a new scrapy "lost" user ;-)
>> >>
>> >> Scrapy is awesome!
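For anyone else landing on this thread: Pablo's point is that the cookies go on the Request objects the spider generates, not into the middleware settings. Below is a minimal sketch of one way to do that in a generic CrawlSpider like Rosa's, by overriding start_requests() and passing the cookies to the first Request. It assumes the site only needs the cookies set once per session; the default CookiesMiddleware then resends them on the follow-up requests the rules generate. (It uses the same old-style imports as the spider above; in newer Scrapy versions the link extractor and spider classes live under scrapy.linkextractors and scrapy.spiders.)

from scrapy.http import Request
from scrapy.selector import HtmlXPathSelector
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor
from scrapy.contrib.spiders import CrawlSpider, Rule
from anzse.items import AnzseItem


class ExampleSpider(CrawlSpider):
    name = 'example'
    allowed_domains = ['example.com']
    start_urls = ['http://www.example.com/']

    rules = (
        Rule(SgmlLinkExtractor(allow=r'Items/'), callback='parse_item',
             follow=True),
    )

    def start_requests(self):
        # Attach the language cookies to the very first request; the
        # default CookiesMiddleware keeps them for the rest of the crawl.
        for url in self.start_urls:
            yield Request(url, cookies={'code_pays': '2', 'code_region': '0'})

    def parse_item(self, response):
        hxs = HtmlXPathSelector(response)
        i = AnzseItem()
        i['name'] = hxs.select('//div[@id="name"]').extract()
        return i

Alternatively, the same cookies argument can be passed on each Request a callback yields, which is closer in spirit to Pablo's request.cookies example.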
