RE: memory issues with live redeploy

2005-06-28 Thread Scott Stewart
We ran into the same problem up through version 5.5.7 - the only workaround
was to use Stop/Start rather than Reload/Redeploy.  We've since updated to
Tomcat 5.5.9, which seems to have corrected the problem.

Thanks,

Scott Stewart


-Original Message-
From: George Finklang [mailto:[EMAIL PROTECTED]
Sent: Tuesday, June 28, 2005 5:34 PM
To: tomcat-user@jakarta.apache.org
Subject: memory issues with live redeploy


Working with a couple of different Tomcat 5.0.x versions, I'm having
issues with Tomcat's memory footprint increasing when I live redeploy.
I do this 2 or 3 times and the server runs out of memory, though
normally it can run for weeks without problems.

I can't seem to find direct references to this in the online docs.  Is
it jsp compilation or session state usage (or other things I can
affect)?  Or is it just a known limitation in the container?

--George

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Memory Issues

2004-04-05 Thread Shapira, Yoav

Hi,
Note, as an aside, that the JVM will use as much memory as you allow it
within the tuning parameters; i.e., if you give it a 1500MB max (and don't
change other defaults), it won't start doing full GCs until it has most
of that allocated, even if most of it could be reclaimed.
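If you want to see that behaviour directly, here is a small stand-alone
class (my own illustration, not something from this thread) that prints the
figures java.lang.Runtime reports. Run it with an -Xmx of your choice and
watch totalMemory(), the committed heap, climb toward maxMemory(), the -Xmx
ceiling, long before much of anything is given back.

// HeapWatch.java -- illustrative only; Runtime.maxMemory() needs JDK 1.4+.
// Run with e.g. "java -Xmx256m HeapWatch" and watch totalMemory() (the
// committed heap) grow toward maxMemory() (the -Xmx ceiling), even though
// most of the allocations become garbage right away.
import java.util.ArrayList;
import java.util.List;

public class HeapWatch {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        List live = new ArrayList();
        for (int i = 0; i < 100; i++) {
            live.add(new byte[1024 * 1024]);            // ~1MB kept alive
            byte[] garbage = new byte[4 * 1024 * 1024]; // ~4MB dropped next pass
            System.out.println("max=" + (rt.maxMemory() >> 20) + "MB"
                + " committed=" + (rt.totalMemory() >> 20) + "MB"
                + " free=" + (rt.freeMemory() >> 20) + "MB");
            Thread.sleep(200);
        }
    }
}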

Yoav Shapira
Millennium Research Informatics


>-Original Message-
>From: shyam [mailto:[EMAIL PROTECTED]
>Sent: Saturday, April 03, 2004 1:37 PM
>To: 'Tomcat Users List'
>Subject: Memory Issues
>
>Hi All,
>
>I have an application running on Tomcat 4.1.24. I have allocated 1500 MB
>to the Java VM. What happens is that in 3-4 days my total memory reaches
>the 1500 MB and the free memory is low. I force the GC to run so that the
>memory is freed. What I don't understand is why the total memory is at the
>maximum. I want Tomcat to deallocate and resize the heap so that the total
>memory doesn't max out, and I also need the GC to run more aggressively.
>
>My JAVA_OPTS are java -noclassgc -jmx1500 -jms1500.
>
>Any help with this issue would be appreciated.
>
>Thanks
>shyam
>
>
>
>-
>To unsubscribe, e-mail: [EMAIL PROTECTED]
>For additional commands, e-mail: [EMAIL PROTECTED]






-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Memory Issues

2004-04-03 Thread shyam
Hi,
Thanks a lot again. I just checked the release notes of Java 1.4.2: the
memory leak problem is fixed in 1.4.2, so we will upgrade our VM. I guess
this is the problem, because I am using StringBuffer extensively. I will
also run the profiler to check for any other memory leaks.
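For reference, the StringBuffer behaviour usually meant here is that in
pre-1.5 Sun JDKs StringBuffer.toString() returned a String that shared the
buffer's internal char[], so a small result could keep a much larger buffer
alive. Whether that is exactly the 1.4.1 issue the release notes describe I
can't say from this thread, so treat the sketch below as illustrative only;
the new String(...) copy was the usual workaround.

// Illustrative only (pre-Java-5 behaviour): the String returned by
// StringBuffer.toString() shared the buffer's backing char[], so a short
// result could pin an array that had grown very large. new String(...)
// trims that baggage by copying just the characters in use.
public class BufferBaggage {
    public static void main(String[] args) {
        StringBuffer sb = new StringBuffer(1024 * 1024); // ~1M-char backing array
        sb.append("small result");
        String pinsBigArray = sb.toString();         // may retain the 1M char[]
        String compact = new String(sb.toString());  // copies only the used chars
        System.out.println(pinsBigArray + " / " + compact);
    }
}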

Thanks
shyam








RE: Memory Issues

2004-04-03 Thread SH Solutions
Hi

> Thanks a lot for the information. I am using Java 1.4.1_02.

I am not sure, maybe someone else can jump in, but I think this was one of
those VM releases I mentioned here:

> It was said that there were StringBuffer memory leaks in at least one
> of the recent versions.

So this might be the first thing to update right away.

Regards,
  Steffen


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Memory Issues

2004-04-03 Thread shyam


Hi,
Thanks a lot for the information. I am using Java 1.4.1_02.

> I force the GC to run so that the memory is freed.

No. You cannot. You can tell the system that it should. You cannot
enforce it.

I ask the system to run the GC. Sorry for the wrong interpretation.

We are moving to Tomcat 5.0.19 pretty soon, but it will take some time.
Until then I don't want to see OutOfMemory errors. I guess I should ask
the system to run the GC.

Regarding memory leaks, I realize that I need to use a profiler. I will
do it.
Thanks again
shyam



RE: Memory Issues

2004-04-03 Thread SH Solutions
Hi


> I have an application running on tomcat4.1.24.

Upgrade. 4.1.30 is the latest in 4.1.x.
I can even recommend 5.0.19, which I am running without problems. For
some, it has even proven to be faster.


> I force the GC to run so that the memory is freed.

No. You cannot. You can tell the system that it should. You cannot
enforce it.
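To illustrate the point (this example is mine, not Steffen's): System.gc()
is only a request, and the VM is free to ignore it; on HotSpot VMs that
support it, the -XX:+DisableExplicitGC flag turns the call into a no-op.

// GcHint.java -- illustrative only. System.gc() is a hint, not a command;
// free memory may or may not go up afterwards.
public class GcHint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        byte[][] junk = new byte[32][];
        for (int i = 0; i < junk.length; i++) {
            junk[i] = new byte[1024 * 1024];   // ~32MB of soon-to-be garbage
        }
        junk = null;                           // make it all unreachable
        long before = rt.freeMemory();
        System.gc();                           // request a collection
        long after = rt.freeMemory();
        System.out.println("free before=" + (before >> 20) + "MB"
            + ", free after=" + (after >> 20) + "MB");
    }
}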


> What I don't understand is why the total memory is at the maximum.

USE A PROFILER. This is an important step. You may be able to adjust some
parameters to make the OoME happen less frequently, but it will NOT solve
your memory leaks.


> I want Tomcat to deallocate and resize the heap so that the total memory
> doesn't max out.

Tomcat cannot deallocate anything, since Tomcat is also only a Java program.
If you do have memory leaks, no Java code can change anything. Solve your
leaks.


> And also I need the GC to run more aggressively.

There are some VM options that allow you to control the way GC works. Read
the VM's release notes.

BTW, which VM are you using? Upgrade to the latest. It was said that there
were StringBuffer memory leaks in at least one of the recent versions.


Regards,
  Steffen


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Memory issues with large query storage

2003-03-07 Thread Peter Lin

Here is another approach that might work.

1. create a class that implements ServletContextListener and add it to
your web.xml.

2. have this new class be application-wide and persistent.

3. whenever a JSP page needs data, it doesn't go to the database; it
uses the persistent app.

4. make it so the persistent application uses another thread to check
the database for updates (see the sketch after this list). If you were
using Oracle 8i or 9i, you could use Java triggers to push updates out
to the web server. Since you're using MS SQL Server, I would write a
stored procedure to insert the updates into a separate table. This way,
when your persistent app checks for updates, it simply looks at the
update table and not the table containing all the data.

5. another option to polling the database is to have all updates go
through Tomcat, but at that point you start writing your own EJB. Keep
in mind that full-blown EJB is made for transactional requests, so if
you implement EJB, use stateless session beans. I would highly
recommend doing quite a bit of research about EJBs before you try to
implement it.
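Here is a rough sketch of steps 1-4. Every name in it (CacheBootstrap,
DataCache, the update check, the one-minute interval) is a placeholder of
mine rather than anything from this thread; in web.xml the class is
registered with a <listener> element, as step 1 says, and JSPs read the
cache from application scope (step 3) instead of querying the database.

// CacheBootstrap.java -- sketch of an application-wide cache refreshed by a
// background thread (steps 1-4 above); adapt the placeholders to your schema.
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class CacheBootstrap implements ServletContextListener, Runnable {

    /** Hypothetical application-wide cache; fill it from your own queries. */
    public static class DataCache {
        private volatile java.util.List rows = java.util.Collections.EMPTY_LIST;
        public java.util.List getRows() { return rows; }
        public void reload() {
            // run the real SELECT here and swap the list in atomically
            rows = new java.util.ArrayList();
        }
        public boolean updatesPending() {
            // check the small update table (step 4), not the big data table
            return false;
        }
    }

    private final DataCache cache = new DataCache();
    private volatile boolean running = true;

    public void contextInitialized(ServletContextEvent sce) {
        cache.reload();                                            // initial load
        sce.getServletContext().setAttribute("dataCache", cache);  // steps 2-3
        Thread poller = new Thread(this, "db-update-poller");
        poller.setDaemon(true);
        poller.start();                                            // step 4
    }

    public void contextDestroyed(ServletContextEvent sce) { running = false; }

    public void run() {
        while (running) {
            try { Thread.sleep(60 * 1000); } catch (InterruptedException e) { return; }
            if (cache.updatesPending()) cache.reload();
        }
    }
}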

hope that helps.


peter





--- "Shapira, Yoav" <[EMAIL PROTECTED]> wrote:
> 
> Howdy,
> Where's the EJB running?
> 
> If it's in the same VM as tomcat, one thing you can
> do is junk this
> design and redesign with EJBs at all.
> 
> Another idea is to retrieve data in batches from the
> DB, extract only
> the information you need for graphing into a
> temporary object, get the
> next batch (not keeping the previous in memory),
> etc, until the object
> holds all and only the data you need to graph.  Then
> graph it and
> discard the object.  The idea here is to keep as
> little in memory as
> possible.  Execution speed may be sacrificed a
> little bit, but if you
> design it right and use a connection pool, batch
> querying (use a
> PreparedStatement as well maybe) can be fast enough
> for your needs.
> 
> Yet another idea is that there are frameworks out
> there to implement
> iterating over a database query result for you, e.g.
> TICL.
> 
> Finally, what's wrong with increasing -Xmx?  If
> you've tuned everything
> you can in your app, and it still needs some memory,
> then give it the
> memory ;)  Your environment may dictate otherwise,
> but usually physical
> memory is cheap and can only benefit you.
> 
> Yoav Shapira
> Millennium ChemInformatics
> 
> 
> >-Original Message-
> >From: Andrew Latham [mailto:[EMAIL PROTECTED]
> >Sent: Thursday, March 06, 2003 10:34 PM
> >To: [EMAIL PROTECTED]
> >Subject: Memory issues with large query storage
> >
> >I hope someone can assist here.
> >
> >Basically I have been writing a 3 Tier Web app
> using Tomcat and trying
> >to adhere to the Struts framework. The application
> connects to an
> >existing MSSQL database and creates graphs of the
> data. The way I've
> >built it is I define an entity bean to store a row
> of returned data
> from
> >the select query. Each entity bean is appended to a
> Collection and then
> >made available to the JSP for me to access using
> . This
> >all works fine, until the Collection.size() reaches
> numbers like
> 600,000
> >and then I hit memory issues. I know there is not
> much I can do but
> >increase -Xmx and page it or buy more physical
> memory.
> >
> >The question I have relates to displaying the data.
> Some of the output
> I
> >want in a table. Showing all results returned is
> not good if its more
> >than 100, so I created a "google" type < 10, next 10>> to
> cycle
> >through them, so for this to happen I need the
> Collection of what can
> be
> >100,000 beans to sit in memory as having to go back
> and get the next 10
> >from the SQL database is about 2 minutes each time.
> When the Collection
> >is returned it has a scope of "Request". To make
> this available to
> >another JSP page (for the next 10) I rescoped the
> Collection to
> >"session". This introduces a memory problem where
> it sits around until
> >the session ends and the GC picks it up. Is there a
> better way to pass
> >the Collection from one JSP to another without
> re-scoping it or what I
> >can see happen is that if a user runs many queries
> they all sit around
> >until the session ends utilising memory.
> >
> >I've thought I've setting the Collection to NULL
> when a new query is
> >run. Is this an acceptable method?
> >
> >Appreciate any advice
> >
> >Thanks
> >
> >Andrew Latham
> >
> >
> >
> >*--
> >This message and any attachment(s) is intended only
> for the use of the
> >addressee(s) and may contain information that is
> PRIVILEGED and
> >CONFIDENTIAL. If you are not the intended
> addressee(s), you are hereby
> >notified that any use, distribution, disclosure or
> copying of this
> >communication is strictly prohibited. If you have
> received this
> >communication in error, please erase all copies of
> the message and its
> >attachment(s) and notify the sender or Kanbay
> postmaster immediately.
> >
> >Any views expressed in this message are those of
> the individual sender
> and
> >not of Ka

RE: Memory issues with large query storage

2003-03-07 Thread Shapira, Yoav

Howdy,
Where's the EJB running?

If it's in the same VM as Tomcat, one thing you can do is junk this
design and redesign without EJBs at all.

Another idea is to retrieve data in batches from the DB, extract only
the information you need for graphing into a temporary object, get the
next batch (not keeping the previous in memory), etc, until the object
holds all and only the data you need to graph.  Then graph it and
discard the object.  The idea here is to keep as little in memory as
possible.  Execution speed may be sacrificed a little bit, but if you
design it right and use a connection pool, batch querying (use a
PreparedStatement as well maybe) can be fast enough for your needs.
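As a concrete sketch of that batching idea (the table, the columns and the
GraphData accumulator below are invented for illustration): only the running
aggregate ever stays in memory, never the full result set.

// BatchGrapher.java -- fetch rows in keyed batches and fold them into a small
// accumulator; each batch is discarded before the next one is read.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class BatchGrapher {

    /** Hypothetical accumulator holding just what the chart needs. */
    public static class GraphData {
        private double sum; private int count;
        public void add(double v) { sum += v; count++; }
        public double average() { return count == 0 ? 0 : sum / count; }
    }

    public GraphData load(Connection con, int batchSize) throws SQLException {
        GraphData graph = new GraphData();
        PreparedStatement ps = con.prepareStatement(
            "SELECT id, val FROM measurements WHERE id > ? ORDER BY id");
        ps.setMaxRows(batchSize);                   // cap each batch
        long lastId = 0;
        while (true) {
            ps.setLong(1, lastId);
            ResultSet rs = ps.executeQuery();
            int rows = 0;
            while (rs.next()) {
                lastId = rs.getLong("id");
                graph.add(rs.getDouble("val"));     // keep the aggregate, drop the row
                rows++;
            }
            rs.close();
            if (rows < batchSize) break;            // last batch reached
        }
        ps.close();
        return graph;
    }
}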

Yet another idea is that there are frameworks out there to implement
iterating over a database query result for you, e.g. TICL.

Finally, what's wrong with increasing -Xmx?  If you've tuned everything
you can in your app, and it still needs some memory, then give it the
memory ;)  Your environment may dictate otherwise, but usually physical
memory is cheap and can only benefit you.

Yoav Shapira
Millennium ChemInformatics


>-Original Message-
>From: Andrew Latham [mailto:[EMAIL PROTECTED]
>Sent: Thursday, March 06, 2003 10:34 PM
>To: [EMAIL PROTECTED]
>Subject: Memory issues with large query storage
>
>I hope someone can assist here.
>
>Basically I have been writing a 3-tier web app using Tomcat and trying
>to adhere to the Struts framework. The application connects to an
>existing MSSQL database and creates graphs of the data. The way I've
>built it is I define an entity bean to store a row of returned data from
>the select query. Each entity bean is appended to a Collection and then
>made available to the JSP for me to access using . This all works fine,
>until the Collection.size() reaches numbers like 600,000 and then I hit
>memory issues. I know there is not much I can do but increase -Xmx and
>page it or buy more physical memory.
>
>The question I have relates to displaying the data. Some of the output I
>want in a table. Showing all results returned is not good if it's more
>than 100, so I created a "google" type <<previous 10, next 10>> control
>to cycle through them. For this to happen I need the Collection of what
>can be 100,000 beans to sit in memory, as having to go back and get the
>next 10 from the SQL database is about 2 minutes each time. When the
>Collection is returned it has a scope of "request". To make this
>available to another JSP page (for the next 10) I rescoped the
>Collection to "session". This introduces a memory problem where it sits
>around until the session ends and the GC picks it up. Is there a better
>way to pass the Collection from one JSP to another without re-scoping
>it? What I can see happening is that if a user runs many queries they
>all sit around until the session ends, utilising memory.
>
>I've thought of setting the Collection to null when a new query is
>run. Is this an acceptable method?
>
>Appreciate any advice
>
>Thanks
>
>Andrew Latham
>
>
>






-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: Memory issues with large query storage

2003-03-06 Thread Lloyd A. Duke
Saving 600K rows of data in memory for a user is quite extreme.
There are several alternatives here. Why save the entire row?
Typically I see (and do) the saving of ids instead of the entire row.
Subsequent calls then use the id for the lookup.  Caching can help the
performance of the trip to the database... although that's kind of what
your problem is here (not enough memory).

Paging (your google-style previous/next) is typically done using only the
records you display, and perhaps the *next* and *previous* records and
the number span you are viewing. A good example of paging can be found
in Sun's Pet Store sample application (http://java.sun.com/blueprints/code/).

A good look over that should save you lots of pain.
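For what it's worth, a minimal sketch of the id-based approach is below
(table and column names are placeholders of mine): only the list of ids
lives in the session, and each "next 10" click fetches just the rows for
the page being shown.

// IdPager.java -- keep ids in the session instead of whole rows, and load
// one page of rows at a time.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class IdPager {

    /** Run once per query; put the returned ids in the session, not the rows. */
    public List loadIds(Connection con) throws SQLException {
        List ids = new ArrayList();
        PreparedStatement ps =
            con.prepareStatement("SELECT id FROM measurements ORDER BY id");
        ResultSet rs = ps.executeQuery();
        while (rs.next()) ids.add(new Long(rs.getLong(1)));
        rs.close();
        ps.close();
        return ids;
    }

    /** Fetch only the rows for the page being displayed (e.g. 10 at a time). */
    public List loadPage(Connection con, List ids, int page, int pageSize)
            throws SQLException {
        List rows = new ArrayList();
        PreparedStatement ps =
            con.prepareStatement("SELECT id, val FROM measurements WHERE id = ?");
        int from = page * pageSize;
        int to = Math.min(from + pageSize, ids.size());
        for (int i = from; i < to; i++) {
            ps.setLong(1, ((Long) ids.get(i)).longValue());
            ResultSet rs = ps.executeQuery();
            if (rs.next()) rows.add(rs.getString("id") + "=" + rs.getDouble("val"));
            rs.close();
        }
        ps.close();
        return rows;
    }
}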

regards,
Lloyd
Andrew Latham wrote:

I hope someone can assist here.

Basically I have been writing a 3-tier web app using Tomcat and trying
to adhere to the Struts framework. The application connects to an
existing MSSQL database and creates graphs of the data. The way I've
built it is I define an entity bean to store a row of returned data from
the select query. Each entity bean is appended to a Collection and then
made available to the JSP for me to access using . This all works fine,
until the Collection.size() reaches numbers like 600,000 and then I hit
memory issues. I know there is not much I can do but increase -Xmx and
page it or buy more physical memory.

The question I have relates to displaying the data. Some of the output I
want in a table. Showing all results returned is not good if it's more
than 100, so I created a "google" type <<previous 10, next 10>> control
to cycle through them. For this to happen I need the Collection of what
can be 100,000 beans to sit in memory, as having to go back and get the
next 10 from the SQL database is about 2 minutes each time. When the
Collection is returned it has a scope of "request". To make this
available to another JSP page (for the next 10) I rescoped the
Collection to "session". This introduces a memory problem where it sits
around until the session ends and the GC picks it up. Is there a better
way to pass the Collection from one JSP to another without re-scoping
it? What I can see happening is that if a user runs many queries they
all sit around until the session ends, utilising memory.

I've thought of setting the Collection to null when a new query is
run. Is this an acceptable method?
Appreciate any advice

Thanks

Andrew Latham




 





-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


RE: Memory issues

2002-12-12 Thread Leo Przybylski
On Thu, 2002-12-12 at 10:03, Shapira, Yoav wrote:
> Hi,
> What OS (and version) and JDK version you're using?
> 
> It could be that your OptimizeIt version is too old (it's Borland, not
> Inprise, for a while now) to run with the JDK you're using?  I like
> OptimizeIt, been using it since 4.0, and it's a good tool.
The system is Red Hat 7.2 with the default SMP kernel. I'm using JDK
1.3.1_06 (Borland states this should be fine with respect to JDK 1.4.1).
I am only using the trial version for profiling right now; we are
evaluating it and may purchase the full version.

> 
> If you're using linux, the thread memory usage is cumulative not
> separate, despite the display of the "top" command.  So you don't have
> 57*64, but 57 threads sharing 64MB.  This has been discussed on this
> list many times: search the archives for more info on linux memory
> usage.
> 
> >of memory" error. Perhaps that is what I am noticing. I still do not
> >know why I am getting it though.
> 
> Your profiler should tell you where memory is allocated.
> 
> >> Could it be that you simply need more than the default 64MB memory?
> >I don't know. The machine has 1GB of physical and 2GB virtual memory.
> >With the total of java processes at 64MB each and 57 of them running,
> my
> >machine is almost depleated.
> 
> See above regarding interpretation of thread memory usage.
> 
> The heap is shared among all threads in the JVM, so what you specify as
> -Xmx is not per-thread and your machine physical memory will not be
> depleted.

Thank you. I will read up on this. I had no idea. I had been using
vmstat, top, and /proc to get information about how much memory I am
using. All this time, I did not realize they were misleading because
they do not distinguish shared memory.

When gtop reported there was 1GB resident for java, it made our system
administrators drop a pantload and they flamed me.

-Leo
> 
> Yoav Shapira
> Millennium ChemInformatics
> 




RE: Memory issues

2002-12-12 Thread Shapira, Yoav
Hi,
What OS (and version) and JDK version are you using?

Could it be that your OptimizeIt version is too old (it's Borland, not
Inprise, for a while now) to run with the JDK you're using?  I like
OptimizeIt; I've been using it since 4.0, and it's a good tool.

If you're using linux, the thread memory usage is cumulative not
separate, despite the display of the "top" command.  So you don't have
57*64, but 57 threads sharing 64MB.  This has been discussed on this
list many times: search the archives for more info on linux memory
usage.

>of memory" error. Perhaps that is what I am noticing. I still do not
>know why I am getting it though.

Your profiler should tell you where memory is allocated.

>> Could it be that you simply need more than the default 64MB memory?
>I don't know. The machine has 1GB of physical and 2GB virtual memory.
>With the total of java processes at 64MB each and 57 of them running,
>my machine is almost depleted.

See above regarding interpretation of thread memory usage.

The heap is shared among all threads in the JVM, so what you specify as
-Xmx is not per-thread and your machine physical memory will not be
depleted.
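A quick way to see this for yourself (an illustration of mine, not something
from the thread): every thread reports the same java.lang.Runtime figures,
because there is one heap per JVM process and -Xmx caps the process, not
each thread; the per-thread numbers in the old Linux "top" display are the
whole process repeated.

// OneHeap.java -- all threads share a single heap, so they all print the
// same maxMemory()/totalMemory() values.
public class OneHeap {
    public static void main(String[] args) {
        Runnable report = new Runnable() {
            public void run() {
                Runtime rt = Runtime.getRuntime();
                System.out.println(Thread.currentThread().getName()
                    + ": max=" + (rt.maxMemory() >> 20) + "MB"
                    + " committed=" + (rt.totalMemory() >> 20) + "MB");
            }
        };
        for (int i = 0; i < 5; i++) {
            new Thread(report, "worker-" + i).start();  // all print the same max
        }
    }
}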

Yoav Shapira
Millennium ChemInformatics





RE: Memory issues

2002-12-12 Thread Leo Przybylski
On Thu, 2002-12-12 at 07:07, Shapira, Yoav wrote:
> Howdy,
> Are you saying garbage collection never happens?  Or that it never
> de-allocates anything?  
Actually, it seems to do both, according to my profiling software (using
Inprise OptimizeIt). I can see when the garbage collector runs, and when
it does, all the objects are reclaimed and it deallocates, but my
resident memory keeps rising.

> The overall resident process size will never
> decrease, only increase up to the limit you (sort of) specify using
> parameters like -Xmx.  Java will allocate memory as needed up to that
> limit, and release it back to the heap (but not to the OS) during
> garbage collection.  
Ahhh...this helps. I believe my limit is 128m. I noticed that each of my
java processes (individual java threads) comes near that before the "out
of memory" error. Perhaps that is what I am noticing. I still do not
know why I am getting it though.

> 
> Could it be that you simply need more than the default 64MB memory?
I don't know. The machine has 1GB of physical and 2GB virtual memory.
With the total of java processes at 64MB each and 57 of them running, my
machine is almost depleted.

> 
> Yoav Shapira
> Millennium ChemInformatics
> 
> 




RE: Memory issues

2002-12-12 Thread Shapira, Yoav
Howdy,
Are you saying garbage collection never happens?  Or that it never
de-allocates anything?  The overall resident process size will never
decrease, only increase up to the limit you (sort of) specify using
parameters like -Xmx.  Java will allocate memory as needed up to that
limit, and release it back to the heap (but not to the OS) during
garbage collection.  

Could it be that you simply need more than the default 64MB memory?

Yoav Shapira
Millennium ChemInformatics


>-Original Message-
>From: Leo Przybylski [mailto:[EMAIL PROTECTED]]
>Sent: Wednesday, December 11, 2002 7:18 PM
>To: [EMAIL PROTECTED]
>Subject: Memory issues
>
>Hello,
>
>I'm having memory issues. I don't suppose they come from tomcat, but
>maybe it has to do with something I am using or the way I am doing it.
>
>It seems whenever I execute some (any) functionality of my system in
>tomcat, memory gets used but never returned. Eventually, I get an "Out
>of memory" error. I have run code profiling tools on my system and can
>see that all my code is getting cleaned up. Somehow my operating system
>is reporting I am using more and more each time. I wonder if it is the
>way I am implementing some third-party tools or if it might be the JVM I
>am using.
>
>Has anybody had issues like this with jdk1.3.1_03,
>jakarta-tomcat-4.1.12, log4j, and lucene-1.2?
>
>Thanks in advance,
>-Leo
>