I will admit I have little direct knowledge of LTSP so far; 
however, I do know from my own machine that the GNU/Linux shared memory 
system is very effective, and it should benefit Linux terminal servers well.  
So I am developing what I hope is a more accurate formula for calculating 
the true minimum memory.
        This is based on a few assumptions, which I will state up front; 
if any of these are grossly inaccurate for your situation, please let 
me know.
        -The server requires 47 megs from the start; I looked at my total 
memory plus swap in use and subtracted the cached buffers (a rough sketch 
of that measurement follows this list).
        -The (Red Hat 7.2) default KDE2 desktop requires about 14 megs more 
(same process as above, then minus the 47 megs).
        -GNU/Linux shared memory saves about one third of running memory 
for each additional instance of most programs.
        -Users are unlikely to all be running your largest app at the same 
time.
        -An imaginary largest app of 55 megs, which is pretty fat.
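
Here is a rough Python sketch of the measurement mentioned in the first 
assumption, in case anyone wants to reproduce it. It reads the standard 
field names out of /proc/meminfo; the script and its names are my own, 
not anything shipped with LTSP.

#!/usr/bin/env python
# Rough sketch of the measurement in the first assumption above:
# RAM in use plus swap in use, minus buffers and page cache.
# The field names are the kernel's own; everything else is mine.

def used_memory_megs():
    info = {}
    with open('/proc/meminfo') as meminfo:
        for line in meminfo:
            parts = line.split()
            # Typical line: "MemTotal:   515748 kB"
            if len(parts) >= 2 and parts[0].endswith(':') and parts[1].isdigit():
                info[parts[0][:-1]] = int(parts[1])   # values in kB
    ram_used = info['MemTotal'] - info['MemFree']
    swap_used = info['SwapTotal'] - info['SwapFree']
    cache = info.get('Buffers', 0) + info.get('Cached', 0)
    return (ram_used + swap_used - cache) / 1024.0

print('%.1f megs in real use' % used_memory_megs())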
So my formula is as follows, worked through for an example of 30 nodes...

Min_Server_Memory = ((Total_Nodes - 1) * .66 + 1)
                    * (Environment_Memory_Footprint + Fattest_Application * .7)
                    + Server_Memory_Footprint

((30 - 1) * .66 + 1) * (14 + 55 * .7) + 47
= 20.14 * 52.5 + 47
= 1104.35 megs
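
To make it easier to plug in your own numbers, here is the same formula 
as a small Python function. The parameter names are just my labels for 
the terms above.

def min_server_memory(total_nodes, env_footprint, fattest_app,
                      server_footprint, shared=.66, in_use=.7):
    # The first desktop instance costs full price; each additional one
    # costs about two thirds, thanks to shared memory.
    effective_instances = (total_nodes - 1) * shared + 1
    # Per-instance cost: the environment plus a discounted largest app,
    # since users are unlikely to all run it at once.
    per_instance = env_footprint + fattest_app * in_use
    return effective_instances * per_instance + server_footprint

print(min_server_memory(30, 14, 55, 47))   # ~1104.35 megs

The .66 and .7 factors are left as parameters so they can be tuned as 
real numbers come in from your networks.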

        Based on this, I come up with a much lower requirement than 
the 50-60 megs per node rule that has been touted here.  Would the admins 
on this list please compare my formula against the actual peak memory 
usage reported by top on your LTSP networks?  That way I can adjust the 
formula, and add variables as needed, to create an accurate formula for 
true minimums.
        Then I will tackle CPU minimums.



