There is some static overhead which may already exceed your heap
size, no matter how simple your script is. Try increasing the heap
size. Here are the common scenarios:

1. ant: export ANT_OPTS=-Xmx1024m
2. pig script: export PIG_HEAPSIZE=1024
3. java: add "-Xmx1024m" to command line
4. maven: export MAVEN_OPTS="-Xmx1024m"
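For the Pig case above, a minimal shell sketch (the 1024 is megabytes, and the value is just an example; tune it for your machine):

```shell
# Give Pig's JVM a 1 GB heap before launching the script.
# PIG_HEAPSIZE is read by the pig wrapper script and passed as -Xmx.
export PIG_HEAPSIZE=1024

# Confirm the variable is set for the current shell session.
echo "PIG_HEAPSIZE=${PIG_HEAPSIZE}"
```

The same setting carries over to PigUnit tests launched from that shell, since they run Pig in local mode inside the same JVM limits.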

Daniel

On Thu, Aug 11, 2011 at 1:52 AM, Sotiris Matzanas
<[email protected]> wrote:
> Hi all
>
> first, sorry if this has been asked before, but I could not find any
> reference in the list archives.
>
> I have tried to run the PigUnit example (top_queries.pig) provided on
> http://pig.apache.org/docs/r0.8.1/pigunit.html on my 64-bit MacOS 10.6.
>
> I have built and deployed the 0.9.0 jars via maven but I get:
>
> ...
> 11/08/11 10:43:10 INFO mapred.MapTask: io.sort.mb = 100
> 11/08/11 10:43:10 INFO mapReduceLayer.MapReduceLauncher: HadoopJobId:
> job_local_0001
> 11/08/11 10:43:10 INFO mapReduceLayer.MapReduceLauncher: 0% complete
> 11/08/11 10:43:10 WARN mapred.LocalJobRunner: job_local_0001
> java.lang.OutOfMemoryError: Java heap space
> at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:781)
>  at
> org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:524)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:613)
>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
> 11/08/11 10:43:15 INFO mapReduceLayer.MapReduceLauncher: job job_local_0001
> has failed! Stop running all dependent jobs
> 11/08/11 10:43:15 INFO mapReduceLayer.MapReduceLauncher: 100% complete
> 11/08/11 10:43:15 ERROR pigstats.PigStatsUtil: 1 map reduce job(s) failed!
> 11/08/11 10:43:15 INFO pigstats.SimplePigStats: Detected Local mode. Stats
> reported below may be incomplete
> 11/08/11 10:43:15 INFO pigstats.SimplePigStats: Script Statistics:
>
> I find it very hard to believe that I have run out of heap space processing
> 10 lines of input. I have searched and searched online and there was one
> entry that seemed to indicate that the issue "may" be MacOS specific but no
> solution or work around given.
> Any input on this is highly appreciated.
>
> Cheers
> Sotos
>
