The problem I've run into more often than memory is having the system CPU
time get out of control.  My guess is that the threshold for what counts as
"overloaded" will depend on your system setup, what you're running on it,
and what bounds your jobs.
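For what it's worth, a quick way to eyeball both numbers on a Linux node is
to read /proc/stat and /proc/meminfo directly.  Rough sketch below; the 30%
system-CPU and 90% memory cutoffs are just placeholders I made up, not
recommendations -- you'd have to tune them for your own hardware and job mix.

#!/usr/bin/env python
# Sample /proc/stat and /proc/meminfo on a Linux node and flag the node as
# "overloaded" when system CPU time or memory use crosses a threshold.
import time

SYSTEM_CPU_THRESHOLD = 30.0   # percent of CPU time spent in the kernel (assumed cutoff)
MEMORY_THRESHOLD = 90.0       # percent of physical memory in use (assumed cutoff)

def read_cpu_times():
    """Return the aggregate 'cpu' line of /proc/stat as a list of tick counters."""
    with open("/proc/stat") as f:
        fields = f.readline().split()
    return [int(v) for v in fields[1:]]

def system_cpu_percent(interval=1.0):
    """Percentage of CPU ticks spent in system (kernel) mode over `interval` seconds."""
    before = read_cpu_times()
    time.sleep(interval)
    after = read_cpu_times()
    deltas = [a - b for a, b in zip(after, before)]
    total = sum(deltas)
    # /proc/stat column order: user, nice, system, idle, iowait, ...
    return 100.0 * deltas[2] / total if total else 0.0

def memory_percent():
    """Percentage of MemTotal currently in use, based on /proc/meminfo."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # values are reported in kB
    used = info["MemTotal"] - info.get("MemAvailable", info["MemFree"])
    return 100.0 * used / info["MemTotal"]

if __name__ == "__main__":
    sys_cpu = system_cpu_percent()
    mem = memory_percent()
    overloaded = sys_cpu > SYSTEM_CPU_THRESHOLD or mem > MEMORY_THRESHOLD
    print("system cpu: %.1f%%  memory: %.1f%%  overloaded: %s"
          % (sys_cpu, mem, overloaded))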


On Tue, Jan 17, 2012 at 22:06, ArunKumar <arunk...@gmail.com> wrote:
>
>
> Guys!
>
> So can I say that if memory usage is more than, say, 90%, the node is
> overloaded?
> If so, what can that threshold percent value be, or how can we find it?
>
>
>
> Arun
>
>
>
