Hi Everyone

I thought I'd post this one here and get a general feel for what people tend to do in this scenario.

I have some money to spend on servers and carte blanche to spend it as I see fit.

This particular group uses statistical genetics software that essentially crunches through numbers and outputs a result at the end. Having looked at a few of the programs, each one pins a processor at 100% once it kicks off; on the multicore machines a second job simply takes over another idle core and likewise runs at 100% until the calculations are done.
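A quick way to confirm that the jobs really are CPU-bound and to see how many cores are already saturated is to compare the load average against the core count. A minimal sketch (assuming Linux with /proc, as on RHEL; the nproc fallback is for older coreutils that lack it):

```shell
# Count cores, falling back to /proc/cpuinfo on older systems.
cores=$(nproc 2>/dev/null || grep -c ^processor /proc/cpuinfo)
# First field of /proc/loadavg is the 1-minute load average.
load=$(cut -d' ' -f1 /proc/loadavg)
# If load is at or near the core count, every core is busy.
echo "cores=$cores load=$load"
```

When the load average roughly equals the core count, the box is full and a new job will start competing for CPU rather than taking an idle core.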

Typically users will log into a free machine and set off an analysis process - the process has been known to last a couple of weeks on the bigger chromosomes and datasets before returning all the data they require.
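One operational point regardless of which way you go: with runs lasting weeks, it's worth launching the job detached from the login session (e.g. with nohup or screen) so a dropped SSH connection doesn't kill it. A minimal sketch, with "sleep 30" standing in for the real analysis binary and its input:

```shell
# Start the job detached from the terminal; redirect output to a file
# so it survives logout. "sleep 30" is a placeholder for the analysis.
nohup sleep 30 > job.log 2>&1 &
echo $! > job.pid            # record the PID so it can be checked later
kill -0 "$(cat job.pid)" && echo "job running"
```

Users can then log out and check progress later by tailing the log file or testing the recorded PID, rather than keeping a terminal open for a fortnight.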

So on to my question - would I be better off linking the machines as a cluster, which is all new to me, or just using separate machines as we currently do?

The budget will buy six Dell PE2970 machines with quad-core Opterons, going by my first quote back from Dell.

I'm more than happy to read and learn, so if anyone has some good links I'd be grateful; similarly, if you run something like this, I'd be glad to hear of your experience.

Thanks

Bryan

_______________________________________________
rhelv5-list mailing list
[email protected]
https://www.redhat.com/mailman/listinfo/rhelv5-list
