@Ted: thanks for the suggestion.

Maybe I should have worded my question differently. I am interested in the
actual amount of memory available on the Hadoop QA machines, because I see
out-of-memory errors in native memory allocation (i.e. outside the Java heap)
that only occur in Hadoop QA runs.
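
If it helps, one way to check would be to have a small one-off class (or a
test) log both the JVM heap limit and the physical memory of the box. A rough
sketch, assuming a HotSpot JVM (it casts to the com.sun.management extension
of OperatingSystemMXBean, which is not guaranteed on every JVM):

import java.lang.management.ManagementFactory;

public class MemoryCheck {
  public static void main(String[] args) {
    // Heap ceiling as seen by this JVM (reflects -Xmx).
    Runtime rt = Runtime.getRuntime();
    System.out.printf("JVM max heap: %d MB%n", rt.maxMemory() / (1024 * 1024));

    // Physical memory and swap on the box; com.sun.management is HotSpot-specific.
    com.sun.management.OperatingSystemMXBean os =
        (com.sun.management.OperatingSystemMXBean)
            ManagementFactory.getOperatingSystemMXBean();
    System.out.printf("Total physical memory: %d MB%n",
        os.getTotalPhysicalMemorySize() / (1024 * 1024));
    System.out.printf("Free physical memory:  %d MB%n",
        os.getFreePhysicalMemorySize() / (1024 * 1024));
    System.out.printf("Total swap space:      %d MB%n",
        os.getTotalSwapSpaceSize() / (1024 * 1024));
  }
}

Running something like this once on a Hadoop QA slave would tell us whether
the native-allocation failures line up with the box simply being short on
physical memory or swap.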

Perhaps we should define a "reference configuration" for the HBase test suite.
E.g., do we expect all unit tests to pass on a 2 GB box, a 4 GB box, etc.?

Thanks,
--Mikhail

On Fri, Feb 10, 2012 at 12:50 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> This should do:
>
> Index: pom.xml
> ===================================================================
> --- pom.xml    (revision 1242915)
> +++ pom.xml    (working copy)
> @@ -350,7 +350,7 @@
>
>           <configuration>
>
> <forkedProcessTimeoutInSeconds>900</forkedProcessTimeoutInSeconds>
> -            <argLine>-enableassertions -Xmx1900m -Djava.security.egd=file:/dev/./urandom</argLine>
> +            <argLine>-d32 -enableassertions -Xmx2300m -Djava.security.egd=file:/dev/./urandom</argLine>
>             <redirectTestOutputToFile>true</redirectTestOutputToFile>
>           </configuration>
>         </plugin>
>
> On Fri, Feb 10, 2012 at 12:48 PM, Mikhail Bautin <
> bautin.mailing.li...@gmail.com> wrote:
>
> > Hello,
> >
> > Does anyone know how to increase heap allocation for Hadoop QA runs, or
> > at least check the available amount of memory?
> >
> > Thanks,
> > --Mikhail
> >
>
