The problem is that your garbage collector has maxed out. One way I got 
around this was to reduce the size of the datasets in the query you're running. 
Increasing memory limits is only a temporary fix; eventually the new limit will be hit as well.
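
For example (the table, column, and partition names below are just placeholders, 
not from your query), you can prune partitions or pre-filter the small side of the 
join before Hive loads it into memory for the local map-join task:

    -- Hypothetical tables: shrink the small table up front so the
    -- local map-join task has less to hold in memory.
    SELECT t.id, t.amount, d.label
    FROM transactions t
    JOIN (
        SELECT id, label
        FROM dim_accounts
        WHERE ds = '2016-02-19'   -- partition pruning on the small side
    ) d
    ON t.id = d.id;

If you can't shrink the data, another option that targets this exact failure (it's 
the local task building the in-memory hash table that ran out of heap) is to stop 
Hive from converting the join into a map join at all:

    SET hive.auto.convert.join=false;

That avoids the local in-memory hash table, at the cost of a slower shuffle join.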

Thanks,
Gazza
From: Daniel Lopes [mailto:dan...@bankfacil.com.br]
Sent: Tuesday, February 23, 2016 9:32 AM
To: user@hive.apache.org
Subject: hive memory error: GC overhead limit exceeded

Hi,

Does anyone know this error? It happens when running on Amazon EMR.

2016-02-19 10:32:34 Starting to launch local task to process map join; maximum memory = 932184064
#
# java.lang.OutOfMemoryError: GC overhead limit exceeded
# -XX:OnOutOfMemoryError="kill -9 %p
kill -9 %p"
#   Executing /bin/sh -c "kill -9 15759
kill -9 15759"...
Execution failed with exit status: 137
Obtaining error information
Task failed!
Task ID:
  Stage-35
Logs:
/var/log/hive/user/hadoop/hive.log


Best,

--
Daniel Lopes, B.Eng
Data Scientist - BankFacil
CREA/SP 5069410560
Mob +55 (18) 99764-2733
Ph +55 (11) 3522-8009
http://about.me/dannyeuu

Av. Nova Independência, 956, São Paulo, SP
Bairro Brooklin Paulista
CEP 04570-001
https://www.bankfacil.com.br
