I've done this for my project using the urllib and urllib2 Python modules 
to call a PHP FTP API.

You first want to serialize the item to JSON by looping through its fields 
and building key-value pairs, then using the json.dumps() method for JSON 
serialization.
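
For example, a minimal sketch of that step (the helper name here is just 
made up, and this assumes a standard Scrapy item):

    import json

    def item_to_json(item):
        # Build an ordinary dict of key-value pairs from the item's fields
        data = dict((key, item[key]) for key in item.keys())
        # Serialize the dict to a JSON string
        return json.dumps(data)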

I am sending my output file to a PHP FTP API, which takes the file name 
and appends it to the file path, var/www/foo/bar/file_name.  The API then 
PUTs this file to another location for database import using PHP's ftp_put: 
http://php.net/manual/en/function.ftp-put.php.

My code looks like:

    import urllib
    import urllib2

    def ftp_init(self, file_name):
        # Pass the output file name to the PHP FTP API as a query parameter
        myParameters = {"file": file_name}
        ftp_url = "http://127.0.0.1/spider/send?%s" % urllib.urlencode(myParameters)
        self.log_info("Attempting to FTP to this url: " + ftp_url)
        # Hit the API; it picks up the file and ftp_puts it onwards
        urllib2.urlopen(ftp_url).read()
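
If your target is a plain REST endpoint rather than a PHP FTP bridge like mine, 
you could also POST the JSON body directly with urllib2. A rough sketch (the 
endpoint URL below is made up):

    import json
    import urllib2

    def post_item(self, item):
        # Serialize the item and POST it as a JSON body (endpoint URL is hypothetical)
        body = json.dumps(dict(item))
        request = urllib2.Request(
            "http://127.0.0.1/api/items",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        response = urllib2.urlopen(request)
        self.log_info("REST API responded with: " + response.read())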




I hope that helps a bit!
On Saturday, November 1, 2014 12:32:41 PM UTC-4, Cen Wang wrote:
>
> After scraping, I have an Item. How can I serialize it and send it over 
> HTTP to a REST API endpoint?
>
