Hi, my project involves a lot of I/O over the network. One part of it is an HTTP server that listens on a port for many clients. The server fetches an image from the web and sends it to each client, and many clients request the server concurrently. To serve clients concurrently I used a threaded HTTP server like this:
    class HTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
        pass

    class RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
        def do_GET(self):
            print "received connection from: ", self.client_address
            # image_retreive() fetches the image from the web and works fine
            image_data = image_retreive()
            self.send_response(200)
            self.send_header("Content-Type", format)  # format holds the MIME type
            self.send_header("Content-Length", len(image_data))
            self.end_headers()
            self.request.sendall(image_data)

    httpd = HTTPServer(('', port_number), RequestHandler)
    httpd.serve_forever()

This code works, but its performance is very bad. It is fine when clients request small or medium-size images, since the server responds immediately. It is also fine when one client requests a large image (the server obviously takes time to respond, since fetching the image from the web takes time) while other clients concurrently request small or medium images: those clients are served immediately even though the first one is still waiting. The problem crops up when two clients concurrently request large images. While those two clients are waiting for their responses, the server doesn't accept any other client. I can see this because I print the address of every client that connects in the first line of do_GET: when two clients concurrently request large images, only those two addresses get printed, which means only two clients are connected to the server even though others are requesting it at the same time, and the others are served only after those two get their responses or release the connection. In other words, the server serves only two clients at a time. This is very undesirable: even if a third client requests a very small image while two clients are waiting on large ones, the third client won't receive a response until those two are served.
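One way to check how many clients the server really handles at once is a small timing harness. Below is a sketch (not my actual code): a ThreadingMixIn server whose handler sleeps to stand in for the slow image fetch, hit by five clients at the same time. If the server is truly concurrent, the total wall time stays close to one request's time rather than five.

```python
import threading, time

try:  # Python 2 module names, matching the post
    from SocketServer import ThreadingMixIn
    from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler
    from urllib2 import urlopen
except ImportError:  # Python 3 names
    from socketserver import ThreadingMixIn
    from http.server import HTTPServer, BaseHTTPRequestHandler
    from urllib.request import urlopen

class SlowHandler(BaseHTTPRequestHandler):
    # The sleep imitates a slow image_retreive() call.
    def do_GET(self):
        time.sleep(0.5)
        body = b"fake image bytes"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # keep the console quiet
        pass

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    daemon_threads = True

server = ThreadedHTTPServer(("127.0.0.1", 0), SlowHandler)  # port 0 = any free port
port = server.server_address[1]
t = threading.Thread(target=server.serve_forever)
t.daemon = True
t.start()

def fetch():
    urlopen("http://127.0.0.1:%d/" % port).read()

start = time.time()
clients = [threading.Thread(target=fetch) for _ in range(5)]
for c in clients:
    c.start()
for c in clients:
    c.join()
elapsed = time.time() - start
server.shutdown()
# Fully serialized handling would take about 2.5 s; concurrent handling
# should finish in well under that.
print("5 concurrent requests took %.2f s" % elapsed)
```

This makes it easy to tell whether the bottleneck is the server itself or the clients doing the requesting.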
To make things worse, my server should serve 10 to 15 clients concurrently. To solve this I did some searching, found out about CherryPy and Twisted, and also implemented my server in CherryPy like this:

    from cherrypy import wsgiserver

    def image_httpserver_app(environ, start_response):
        print >>sys.stdout, "received connection from: (%s : %s)\nthe image url is: %s" % \
            (environ["REMOTE_ADDR"], environ["REMOTE_PORT"], environ["QUERY_STRING"])
        image_data = image_retreive()
        status = '200 OK'
        response_headers = [('Content-Type', format),
                            ("Content-Length", str(len(image_data)))]
        start_response(status, response_headers)
        return [image_data]

    mappings = [('/', image_httpserver_app)]
    server = wsgiserver.CherryPyWSGIServer(('localhost', 8888), mappings,
                                           server_name='localhost',
                                           numthreads=20)

    if __name__ == '__main__':
        try:
            server.start()
        except KeyboardInterrupt:
            server.stop()

This didn't solve the problem at all; the same thing happens and only two clients are served at a time, even with numthreads set to 20. I have done a lot of searching and reading and am hoping to find a solution. Can anyone make this easier for me? I have heard of Twisted's Deferred objects; will they solve the problem? If not, please suggest an alternative.
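For what it's worth, the WSGI handler logic can be exercised on its own, without any server or network, by calling the app with a fake environ. The sketch below does that; the stubbed image_retreive, the "image/png" type, and the environ values are placeholders I made up, not part of the original code.

```python
def image_retreive():
    # Stub standing in for the real web fetch in the post.
    return b"\x89PNG fake image bytes"

def image_httpserver_app(environ, start_response):
    print("received connection from: (%s : %s)\nthe image url is: %s"
          % (environ.get("REMOTE_ADDR", "?"),
             environ.get("REMOTE_PORT", "?"),
             environ.get("QUERY_STRING", "")))
    image_data = image_retreive()
    start_response("200 OK",
                   [("Content-Type", "image/png"),
                    ("Content-Length", str(len(image_data)))])
    return [image_data]

# Fake start_response that just records what the app sent.
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

body = b"".join(image_httpserver_app(
    {"REMOTE_ADDR": "127.0.0.1", "REMOTE_PORT": "5000",
     "QUERY_STRING": "url=http://example.com/img.png"},
    fake_start_response))
print(captured["status"], len(body))
```

Separating the handler logic from the server this way makes it easier to tell whether a problem lives in the app or in the server's connection handling.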
-- http://mail.python.org/mailman/listinfo/python-list