Freenet needs more memory than JVMs allocate by default. Start it with
-Xmx255M and it will be happy as a clam.
--B
On Sun, 11/09/03 at 01:07:08 -0600, Tom Kaitchuck wrote:
> Here is an out of memory error from 5031. It keeps recurring, even though the
> environment page says this:
>
> Maximum memory the JVM will allocate: 192 MiB
> Memory currently allocated by the JVM: 130,112 KiB
> Memory in use: 131,983,552 Bytes
> Estimated memory used by logger: None
> Unused allocated memory:
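For what it's worth, those environment-page figures presumably come from the standard java.lang.Runtime calls, and the 192 MiB ceiling presumably reflects whatever -Xmx the node was started with. A minimal standalone sketch (not Freenet code) that prints the same kind of numbers:

    // Minimal standalone sketch, not Freenet code: report the JVM's memory
    // figures using the standard java.lang.Runtime API.
    public class MemoryReport {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long max = rt.maxMemory();               // ceiling the JVM will grow to (capped by -Xmx)
            long allocated = rt.totalMemory();       // heap currently allocated by the JVM
            long used = allocated - rt.freeMemory(); // memory actually in use
            System.out.println("Maximum memory the JVM will allocate: " + (max / (1024 * 1024)) + " MiB");
            System.out.println("Memory currently allocated by the JVM: " + (allocated / 1024) + " KiB");
            System.out.println("Memory in use: " + used + " Bytes");
            System.out.println("Unused allocated memory: " + (allocated - used) + " Bytes");
        }
    }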
On Thu, Sep 05, 2002 at 01:54:07PM -0700, Ian Clarke wrote:
> So - question - do OutputStreams/Writers to Servlets (as obtained from
> HttpServletResponse objects) need to be closed before the doGet() method
> of the servlet terminates?
resp.flushBuffer() should be called.
--
Oskar Sandberg
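As a concrete illustration of that answer, here is a minimal servlet sketch (plain javax.servlet, with a hypothetical ExampleServlet rather than anything in Freenet): write to the response's Writer, call resp.flushBuffer() before doGet() returns, and leave closing the underlying stream to the container.

    // Hypothetical servlet, not Freenet code: flush the response rather than
    // closing the container-owned stream.
    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ExampleServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.println("<html><body>hello</body></html>");
            // Push any buffered output to the client; the servlet container
            // stays responsible for the stream itself.
            resp.flushBuffer();
        }
    }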
Some component is leaking memory. I have been suspecting that for some
time. What we need is for somebody to run some heap profiles on long
running nodes.
See: http://www.javaworld.com/javaworld/jw-12-2001/jw-1207-hprof.html
On Thu, Sep 05, 2002 at 04:05:42PM -0400, Dan Merillat wrote:
>
> Either we're reporting it now, or something is using a cubic fuckton
> more RAM. I've got the VM limit set at 96 meg, and I'm sorry, but that's
> "More Than Enough" to do the job. I've only got a half gig on this
> box, so running to 128 meg is going to start hurting (especially if there's
> a significant
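To make "some component is leaking memory" concrete: in Java a leak usually means objects kept reachable forever, for example from a static collection that only ever grows, so raising -Xmx merely postpones the OutOfMemoryError. A purely illustrative sketch of the pattern a heap profile would surface (LeakyCache is hypothetical, not a real Freenet class):

    // Purely illustrative: a static cache that is never evicted keeps every
    // entry reachable, so the heap grows until the -Xmx limit is hit.
    import java.util.HashMap;
    import java.util.Map;

    public class LeakyCache {
        private static final Map<String, byte[]> CACHE = new HashMap<String, byte[]>();

        public static byte[] fetch(String key) {
            byte[] value = CACHE.get(key);
            if (value == null) {
                value = new byte[16 * 1024];
                CACHE.put(key, value); // never removed: each new key leaks 16 KiB
            }
            return value;
        }
    }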
Given that this just started to happen with Hawk after I upgraded to the
new GUI - I suspect that this could be where the problem lies.

One thing I have noticed is that whenever I access one of these pages, even
after it has finished loading - Mozilla still says "Transferring data from
..." in its status bar.