Solr is a good option, as mentioned in another reply.

GrassyKnoll, also mentioned in another reply, sounds good too.

If you want to stay in Python, you can also try a Python FastCGI
setup behind a front-end web server such as lighttpd. It scales
by running multiple Python FastCGI server processes, with the web
server load-balancing requests across them.
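
A rough sketch of what one such backend process might look like,
assuming you use flup for the FastCGI side (search_index() here is
just a hypothetical placeholder for your PyLucene query code):

    # one FastCGI backend process; start several of these on
    # different ports and point lighttpd's mod_fastcgi at all of them
    from flup.server.fcgi import WSGIServer

    def search_index(query):
        # placeholder: open the pre-built index with PyLucene and
        # return the matching hits as a string
        return "results for %r" % query

    def app(environ, start_response):
        # very small WSGI app: treat the query string as the search
        query = environ.get('QUERY_STRING', '')
        body = search_index(query)
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [body]

    if __name__ == '__main__':
        # each instance binds its own address/port
        WSGIServer(app, bindAddress=('127.0.0.1', 9000)).run()

Start as many instances as you have CPUs and list them all as
backends in the lighttpd FastCGI configuration.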

If you rely on the OS file cache rather than loading the index into
each process (i.e. FSDirectory rather than RAMDirectory), all the
Python processes share the same cached copy of the index, which
saves memory.

-- 
Best regards,
Jack

Monday, March 19, 2007, 4:24:28 PM, you wrote:

> I need to build a lucene search engine that can handle very high loads
> (hundreds of requests per second) via a web interface.  It will be deployed
> on one or more multi-proc servers, with the index pre-generated and
> available via an NFS partition.  The index is small enough to fit into RAM,
> so assume linux will cache the whole thing (it seems to be now - there's no
> performance difference for me between FSDirectory and RAMDirectory).  I'm
> new to both Python and Lucene, so I have little experience with what the
> best solutions are.



