Thanks!
On Thursday, September 12, 2013 9:39:16 AM UTC-7, Thomas Mueller wrote:
> [quoted text trimmed]
Hi,

For memory problems:

What I usually use first is "jmap -histo <pid>" to get an overview. It
usually works with recent versions of Java; sometimes you have to try a few
times, and sometimes you have to use -F. Rarely you have to use -histo:live.

Afterwards, if I need to analyze further, I use
-XX:+[...]
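When jmap won't attach (the "try a few times" / -F cases above), the same dump can be produced from inside the JVM via the HotSpot diagnostic MXBean, and the resulting .hprof file opened in jhat or MAT. A minimal sketch, assuming a HotSpot JVM; the class name and file name here are my own, not from the thread:

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;
import java.nio.file.Files;
import java.nio.file.Path;

public class SelfDump {
    // Writes a heap dump much like "jmap -dump" would, but from inside the
    // process, so it works even when attaching externally is flaky.
    static Path dump() throws Exception {
        HotSpotDiagnosticMXBean bean =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // dumpHeap refuses to overwrite an existing file, so use a fresh name.
        Path file = Path.of("heap-" + System.nanoTime() + ".hprof");
        bean.dumpHeap(file.toString(), true); // true = live objects only
        return file;
    }

    public static void main(String[] args) throws Exception {
        Path file = dump();
        System.out.println(Files.size(file) + " bytes written to " + file);
    }
}
```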
On 2013-09-12 06:03, Brian Craft wrote:
Are there any better tools for jvm memory debugging?
I use the Eclipse Memory Analyzer (MAT) as my first port of call;
it's pretty good.
It even has a nice summary report.
Good job tracking it down.
Regarding tools for debugging: no ideas from me, sorry. I need to get
better at using those kinds of things.
On 12/09/2013 12:03 PM, Brian Craft wrote:
Following up on this: after resolving the memory problems in my code, h2
handles my 1G transaction.

Tracking down the memory problems was enormously painful. jmap/jhat
provided better information than visualvm, but they were unreliable: jhat
would randomly refuse to read the heap dumps generated [...]
I do it with a bunch of inserts. The tables already exist. It happens even
if I use UNDO_LOG=0, which I think disables the undo log.
I'll try jmap.
On Tuesday, September 10, 2013 11:50:43 AM UTC-7, Thomas Mueller wrote:
> [quoted text trimmed]
Well, that was way more informative than visualvm. The top three items,
amounting to 2G:

num       #instances         #bytes  class name
-----------------------------------------------
  1:           70466     1104237992  [C
  2:        45338856     1088132544  java.lang.Float
  3:         1544435  [...]
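A quick sanity check on those numbers (my arithmetic, not from the thread): [C is char[], so row 1 is ~70k character arrays averaging roughly 15 KB each, and row 2 works out to exactly 24 bytes per boxed java.lang.Float, versus 4 bytes per element in a primitive float[]:

```java
public class BoxingCost {
    // Average bytes per instance for one histogram row.
    static long bytesPerInstance(long totalBytes, long instances) {
        return totalBytes / instances;
    }

    public static void main(String[] args) {
        long instances = 45_338_856L;  // java.lang.Float count from the histogram
        long bytes = 1_088_132_544L;   // bytes attributed to that row
        System.out.println(bytesPerInstance(bytes, instances)); // 24 bytes per boxed Float
        System.out.println(instances * 4L); // ~173 MB if held as a primitive float[]
    }
}
```

Boxed floats costing 6x their primitive size is a common culprit in exactly this kind of import-time OOM.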
Hi,

It depends on how you do the import. Could you get a heap histogram for a
large import (jmap -histo:live <pid>), and could you tell us how exactly you
do the import?

If you use "create table ... as select" then it can't run out of memory (as
far as I know), because no undo log is needed.

If you e[...]
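Thomas's "create table ... as select" suggestion might look like the following for a file import; the table name and CSV file are hypothetical, and CSVREAD is H2's built-in CSV table function:

```sql
-- Loads the data in a single statement, so (per the note above) H2 needs
-- no undo log for it. The file name is a placeholder.
CREATE TABLE probes AS
    SELECT * FROM CSVREAD('probes.csv');
```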
Hm, I have no idea. Does h2 hold a whole transaction in memory? If you've
had success with 1G transactions, perhaps there's something different about
my schema (larger indexes?), or perhaps I have a leak of some sort that is
resolved by breaking it into smaller transactions. I could try again w/o [...]

I've ended up using a boolean "loaded" column, which basically works: it is
set to true when all transactions have succeeded.
On Sep 10, 2013 1:32 AM, "Noel Grandin" wrote:
> [quoted text trimmed]
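The boolean "loaded" column pattern described above could be sketched like this; all names are made up for illustration:

```sql
-- Rows default to loaded = FALSE while the import runs as many small commits.
CREATE TABLE samples (
    id     INT PRIMARY KEY,
    value  REAL,
    loaded BOOLEAN DEFAULT FALSE
);

-- ... many small INSERT batches, each committed separately ...

-- One final, tiny transaction publishes the data.
UPDATE samples SET loaded = TRUE;

-- Readers ignore half-loaded data.
SELECT id, value FROM samples WHERE loaded;
```

Readers then see either none or all of the import, which recovers the coherence the single giant transaction was meant to provide.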
How come it is getting OOM on import? It shouldn't do that, should it?
I've imported several GB of data before. I did get OOM, but increasing the
heap size to around 1GB worked for me. I didn't need to go to crazy sizes...
On 10/09/2013 4:32 PM, Noel Grandin wrote:
On 2013-09-06 20:15, Brian Craft wrote:
I need to load about 1G of data into an existing db, while maintaining data
coherence. Wrapping the inserts in one transaction results in out-of-memory
problems in the jvm. I increased the max heap size to 8g w/o improvement. I
can split it into a bunch of smaller commits, which works fine, but [...]
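The "bunch of smaller commits" workaround can be sketched in plain Java. The transaction handle and the row source here are stand-ins (my assumptions, not the poster's code); the point is only the commit cadence, which bounds how much undo information any single transaction holds:

```java
public class ChunkedLoad {
    static final int BATCH = 10_000; // rows per commit; tune to taste

    // Stand-in for a JDBC Connection's commit(); lets the cadence be shown
    // without a real database.
    interface Tx { void commit(); }

    // Inserts totalRows rows, committing every BATCH rows, and returns the
    // number of commits issued.
    static int load(int totalRows, Tx tx) {
        int commits = 0;
        for (int i = 1; i <= totalRows; i++) {
            // insertRow(i);  // placeholder for the actual INSERT
            if (i % BATCH == 0) { tx.commit(); commits++; }
        }
        if (totalRows % BATCH != 0) { tx.commit(); commits++; } // trailing partial batch
        return commits;
    }

    public static void main(String[] args) {
        System.out.println(load(1_000_000, () -> {})); // 100
    }
}
```

With a real JDBC connection this is `conn.setAutoCommit(false)` plus a `conn.commit()` at the same points; the "loaded" flag discussed earlier in the thread restores atomicity across the chunks.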