Hi,

I'm trying to create indexes in Hive, and I've switched
to using CDH-4. The index creation is failing, and it's
pretty obvious that the reducers are running out of heap
space. In the "Hadoop reduce task list" page of the web
interface I can find this entry:

Error: Java heap space
Error: GC overhead limit exceeded
org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException: EEXIST: File exists
        at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:178)
        at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:303)
        at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:376)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
        at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: EEXIST: File exists
        at org.apache.hadoop.io.nativeio.NativeIO.open(Native Method)
        at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:172)
        ... 7 more

Error: GC overhead limit exceeded
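
For context, the index creation I'm attempting looks
roughly like the following (the table, column and index
names here are just placeholders, not the real ones):

    -- placeholder names; the real table/column/index differ
    CREATE INDEX idx_example ON TABLE example_table (example_col)
      AS 'org.apache.hadoop.hive.ql.index.compact.CompactIndexHandler'
      WITH DEFERRED REBUILD;
    -- this is the step that launches the MapReduce job
    -- whose reducers run out of heap
    ALTER INDEX idx_example ON example_table REBUILD;

I assume I can try giving the reducers more heap with
something like

    SET mapred.child.java.opts=-Xmx2048m;

before the REBUILD, but I'm not sure whether that's the
right setting on CDH-4 or whether there's a better way.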

If this e-mail belongs on a Cloudera mailing list rather
than here, please redirect me.

Thanks in advance.

Peter Marron
Trillium Software UK Limited

Tel : +44 (0) 118 940 7609
Fax : +44 (0) 118 940 7699
E: peter.mar...@trilliumsoftware.com
