I asked something similar earlier and was pointed to MogileFS by Danga (danga.com).


It looks messy and scary, but it seems to work: it uses a daemon coupled with replicated MySQL to keep content in sync across servers.



On Feb 20, 2005, at 8:49 PM, Mike OK wrote:

Hi

I am looking for some opinions on how best to handle this situation. I am willing to do all of the research about each topic, but would like some points of view on how others would handle this. Thanks, Mike

- I have developed a search engine that caches its results
- Currently I save the results in a text file on the local machine that runs Apache
- This works well for the test system, but I need a central storage system for these requests
- I plan on having over 250, and possibly thousands of, servers saving and retrieving information from this central system
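The post doesn't describe the on-disk layout of those per-query text files, but a common approach is to key each cache file by a hash of the query. A minimal sketch in Python (the hashing scheme, two-level directory fan-out, and `CACHE_DIR` location are my assumptions, not details from the post):

```python
import hashlib
import os

CACHE_DIR = "/var/cache/search"  # assumed location, not from the post

def cache_path(query: str, base_dir: str = CACHE_DIR) -> str:
    """Map a query string to a cache file path.

    Hashing gives fixed-length, filesystem-safe names; the two-level
    fan-out keeps any single directory from growing huge (less of a
    concern on ReiserFS, but cheap insurance).
    """
    h = hashlib.sha1(query.encode("utf-8")).hexdigest()
    return os.path.join(base_dir, h[:2], h[2:4], h)

def get_cached(query: str, base_dir: str = CACHE_DIR):
    """Return cached results for a query, or None on a cache miss."""
    try:
        with open(cache_path(query, base_dir), encoding="utf-8") as f:
            return f.read()
    except FileNotFoundError:
        return None

def put_cached(query: str, results: str, base_dir: str = CACHE_DIR) -> None:
    """Write results atomically so readers never see a partial file."""
    path = cache_path(query, base_dir)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    tmp = path + ".tmp"
    with open(tmp, "w", encoding="utf-8") as f:
        f.write(results)
    os.replace(tmp, path)  # atomic rename on POSIX filesystems
```

The atomic rename matters once many Apache children read the cache concurrently: a reader either sees the old file or the complete new one, never a half-written result.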

- I currently have a Linux box with ReiserFS installed, ready for this central system, because it handles lots and lots of small files very well
- My thought is that the filesystem would be quicker for lookups than a MySQL database (I use MySQL in other areas of the system)

- I have looked on CPAN and found the module File::Remote that I could use, but I suspect there is a more efficient solution

- Since this is all happening on a local network, I could assign each server its own port to save connection overhead
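The post doesn't specify a wire protocol for the central store, so here is one hypothetical, minimal line-based protocol sketched in Python: `GET <key>` returns the stored value (nothing on a miss), `PUT <key>` followed by the value stores it. The in-memory dict stands in for the ReiserFS-backed files; the handler, commands, and function names are all my own invention for illustration:

```python
import socket
import socketserver

STORE = {}  # in-memory stand-in for the ReiserFS-backed file store

class CacheHandler(socketserver.StreamRequestHandler):
    """One request per connection: 'GET <key>\\n' or 'PUT <key>\\n<value>'."""
    def handle(self):
        line = self.rfile.readline().decode().strip()
        cmd, _, key = line.partition(" ")
        if cmd == "GET":
            # Send the value (or nothing on a miss); closing ends the reply.
            self.wfile.write(STORE.get(key, b""))
        elif cmd == "PUT":
            # Everything after the first line, until EOF, is the value.
            STORE[key] = self.rfile.read()
            self.wfile.write(b"OK\n")

def remote_put(addr, key: str, value: bytes) -> bool:
    with socket.create_connection(addr) as s:
        s.sendall(f"PUT {key}\n".encode() + value)
        s.shutdown(socket.SHUT_WR)       # server reads the value until EOF
        return s.recv(16) == b"OK\n"     # wait for the ack

def remote_get(addr, key: str) -> bytes:
    with socket.create_connection(addr) as s:
        s.sendall(f"GET {key}\n".encode())
        s.shutdown(socket.SHUT_WR)       # signal end of request
        chunks = []
        while chunk := s.recv(4096):
            chunks.append(chunk)
        return b"".join(chunks)
```

A plain TCP protocol like this avoids per-request filesystem-mount or NFS chatter, though at this scale something proven such as MogileFS (mentioned above) or memcached is likely the safer bet than a hand-rolled daemon.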

Any thoughts on this subject would be greatly appreciated. Thanks again.



