Yuanbo Liu created MAPREDUCE-7220:
-------------------------------------

             Summary: Mapreduce jobhistory summary error if job name is very long
                 Key: MAPREDUCE-7220
                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-7220
             Project: Hadoop Map/Reduce
          Issue Type: Improvement
            Reporter: Yuanbo Liu


From JobHistoryEventHandler.java we can see that MapReduce uses writeUTF to write the summary done file to HDFS. The code is here:
{quote}summaryFileOut = doneDirFS.create(qualifiedSummaryDoneFile, true);
summaryFileOut.writeUTF(mi.getJobSummary().getJobSummaryString());
summaryFileOut.close();
{quote}
writeUTF uses the first two bytes to record the string length, so the summary string cannot exceed 65535 bytes. In the case of a Hive job, however, the SQL string is part of the job name, and it is quite common for the SQL to be longer than 65535 bytes; the summary done file then cannot be written successfully. When that happens, the Hive client sometimes treats such a MapReduce job as having ended in a failed state.
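The limit is easy to reproduce outside of Hadoop, since FSDataOutputStream inherits writeUTF from DataOutputStream. Below is a minimal sketch (class and variable names are made up for illustration) that shows writeUTF rejecting an over-long summary string with UTFDataFormatException, and one possible workaround of writing the raw UTF-8 bytes without the 2-byte length prefix; this is only an illustration of the failure mode, not the fix proposed here:
{quote}import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.UTFDataFormatException;
import java.nio.charset.StandardCharsets;

public class WriteUtfLimitDemo {
    public static void main(String[] args) throws Exception {
        // Build a string longer than 65535 bytes, similar to a Hive job name
        // that embeds a very long SQL statement (hypothetical example).
        StringBuilder sb = new StringBuilder();
        while (sb.length() <= 65535) {
            sb.append("SELECT col FROM some_table WHERE id = 42 ");
        }
        String summary = sb.toString();

        ByteArrayOutputStream prefixed = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(prefixed)) {
            // writeUTF prefixes the data with a 2-byte length field, so an
            // encoded string longer than 65535 bytes is rejected.
            out.writeUTF(summary);
        } catch (UTFDataFormatException e) {
            System.out.println("writeUTF failed: " + e.getMessage());
        }

        // Possible workaround sketch: write the raw UTF-8 bytes with no
        // length prefix, which has no 65535-byte limit.
        ByteArrayOutputStream raw = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(raw)) {
            out.write(summary.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Raw write succeeded, " + raw.size() + " bytes");
    }
}
{quote}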


