Hi Reuti.
On 31.05.2017 11:53, Reuti wrote:
On 30.05.2017 at 23:49, Roberto Nunnari <[email protected]> wrote:
Dear Reuti.
Thank you very much for your help. I very much appreciate it. :-)
After some trouble (getting the right architecture and the man pages) I managed
to build it all (I hope).
I guess "sh scripts/bootstrap.sh && ./aimk" will build all with berkeleydb and
not -spool-classic, right?
Yes. But unless you have a really large cluster, there is no big difference.
Some even prefer classic spooling, as you can read all the spool files as plain
text without any extra tools.
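Good to know. If I ever want to switch, I guess rebuilding with classic spooling
would look roughly like this (just my reading of the aimk switches, not tested
here):

    # rebuild with classic (plain text) spooling instead of berkeleydb
    sh scripts/bootstrap.sh
    ./aimk -spool-classic
    # build the man pages as well
    ./aimk -man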
Could you please tell me which files will be plain text with classic spooling
and binary with berkeleydb?
I'll let you know tomorrow how it goes.
For now it's OK: sge_qmaster on the frontend and sge_execd on the
execute nodes work.
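To convince myself that the daemons really talk to each other, I just ran the
usual checks (if I read the man pages correctly):

    # qhost should list every execute node reporting to the qmaster
    qhost
    # qstat -f shows all queue instances and their states
    qstat -f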
Now I need to export/import the config from old to the new server.
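My rough plan is to dump the old setup with qconf and load it on the new
qmaster, along these lines (untested sketch; the file names are just
placeholders):

    # on the old qmaster: dump global config, queues and execution hosts
    qconf -sconf > global.conf
    for q in $(qconf -sql); do qconf -sq "$q" > "queue_$q.txt"; done
    for h in $(qconf -sel); do qconf -se "$h" > "exechost_$h.txt"; done

    # on the new qmaster: load everything back
    qconf -Mconf global.conf
    for f in queue_*.txt; do qconf -Aq "$f"; done
    for f in exechost_*.txt; do qconf -Ae "$f"; done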
I'm also new to ARCo, and that will be a bit of a challenge. :-)
Can I add it later, or do I need to set it up together with SoGE?
You mean ARCo for the accounting? As I wrote in the other email: we never used
it, as we never had a need for it. There was the idea to use it, but
essentially our pending queue is so short, and the phases where jobs actually
wait are so brief, that we skipped it.
Yes, I mean ARCo for the accounting, and I hope it can help me understand
usage statistics in a better way than the raw accounting file
(/opt/sge/default/common/accounting) does.
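In the meantime I can at least pull rough per-user numbers out of that file
with qacct, something like (my reading of the qacct man page; the job id below
is just an example):

    # per-owner summary (wallclock, cpu, memory) for the last 30 days
    qacct -o -d 30
    # full record of a single job by its id
    qacct -j 424242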
Roberto