I need to crawl a site and purge every URL on it. I can purge an individual
URL with something like:

curl -X PURGE -H "X-Purge-Auth:blahblah" http://www.about.com/whatever.htm

I would like to automate this process with Scrapy. It was relatively
trivial to make Scrapy follow the links and output a nice list of URLs, and
I could feed that list into a separate script to purge them, but I would
prefer to have Scrapy do it directly. How can I send the PURGE command for
every matching URL that Scrapy finds?
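
In case it helps frame the question, here is roughly what I am picturing:
an untested sketch that assumes a CrawlSpider, and that Scrapy will pass a
custom PURGE method through to the downloader. The domain, header value,
and callback names are just placeholders based on my curl example.

import scrapy
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule


class PurgeSpider(CrawlSpider):
    """Crawl the site and send a PURGE request for every page found."""

    name = "purge"
    allowed_domains = ["www.about.com"]   # placeholder domain
    start_urls = ["http://www.about.com/"]

    # Follow every internal link; each fetched page also gets purged.
    rules = (Rule(LinkExtractor(), callback="parse_page", follow=True),)

    def parse_page(self, response):
        # Re-request the same URL with the PURGE verb. dont_filter=True is
        # needed because the dupefilter has already seen this URL from the
        # GET that fetched it. handle_httpstatus_all lets non-200 replies
        # from the cache reach the callback instead of being dropped.
        yield scrapy.Request(
            response.url,
            method="PURGE",
            headers={"X-Purge-Auth": "blahblah"},  # placeholder auth token
            dont_filter=True,
            meta={"handle_httpstatus_all": True},
            callback=self.after_purge,
        )

    def after_purge(self, response):
        # Log the cache server's reply so failed purges are visible.
        self.logger.info("Purged %s -> %s", response.url, response.status)

Is that roughly the right approach, or is there a better way to issue the
PURGE requests from within the spider?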
