Scott Carey wrote:
> How much longer do the tests take with that setting?
It doesn't seem markedly longer.
> Also, if destroying the JVM makes it work, then there is probably some sort
> of leak in certain tests.
Yes, something seems to be leaking. I'm fairly sure that running out of file
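If tearing down the JVM between tests is what hides the leak, the junit Ant task can be told to fork a fresh JVM per test class, so leaked descriptors and memory are reclaimed between tests. A minimal sketch (`fork` and `forkmode` are standard junit-task attributes; the `test.classpath` refid is an assumption about this build, not taken from it):

```xml
<!-- sketch: fork="yes" runs tests in a child JVM; forkmode="perTest"
     starts and destroys one JVM per test class, so anything a test
     leaks dies with its JVM -->
<junit fork="yes" forkmode="perTest" printsummary="yes">
  <classpath refid="test.classpath"/> <!-- assumed refid -->
  <formatter type="plain"/>
</junit>
```

The trade-off is startup cost: one JVM per test class is slower than `forkmode="once"`, which is why the question above about test duration matters.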
Scott Carey wrote:
> Yes, I see the same thing. Adding maxmemory=128m to the junit ant task
> fixed it for me.
I wonder if what we're actually seeing is an exhaustion of file descriptors
due to some leak. As I recall, this also shows up as an OOM. Another way to
fix this, rather than adding
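One way to confirm the descriptor-exhaustion theory is to watch the open-fd count from inside the test JVM. A sketch, assuming a Sun/OpenJDK runtime on Unix (the `UnixOperatingSystemMXBean` cast is JDK-vendor-specific, and dropping this into a test is my suggestion, not something from the build):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class FdCheck {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // On Sun/OpenJDK Unix JVMs the OS bean exposes fd counts;
        // on other runtimes the cast simply won't apply.
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            System.out.println("open fds: " + unix.getOpenFileDescriptorCount()
                + " / max: " + unix.getMaxFileDescriptorCount());
        }
    }
}
```

Printing this before and after a suspect test (or in a tearDown) would show whether the count climbs monotonically, which would point at a leak rather than genuine heap pressure.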
Anyone else seeing OutOfMemory errors when trying to run the Hadoop tests?
Specifically:
[junit] Running org.apache.avro.mapred.TestWordCountGeneric
[junit] Apr 3, 2010 5:57:56 PM org.apache.hadoop.metrics.jvm.JvmMetrics init
[junit] INFO: Initializing JVM Metrics with
Yes, I see the same thing. Adding maxmemory=128m to the junit ant task fixed
it for me. Alternatively, we may be able to turn down some of the Hadoop
memory-usage parameters, such as sort and in-memory fs space.
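The fix described amounts to something like this in build.xml (a sketch; `maxmemory` is a real junit-task attribute but is ignored unless the task forks, and the classpath/fileset names here are illustrative assumptions):

```xml
<!-- sketch: maxmemory caps the forked test JVM's heap at 128 MB;
     it has no effect when fork="no" -->
<junit fork="yes" maxmemory="128m" printsummary="yes">
  <classpath refid="test.classpath"/>          <!-- assumed refid -->
  <batchtest todir="${test.build.dir}">        <!-- assumed property -->
    <fileset dir="${test.src.dir}" includes="**/Test*.java"/>
  </batchtest>
</junit>
```

Capping the heap works whether the errors are real heap pressure or fd exhaustion surfacing as an OOM, which is why it masks the underlying question of whether something is leaking.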
I see some other errors. RAT fails with:
Unapproved licenses: