1. Looking at IFile$Reader#nextRawValue, it is not clear why we allocate the
valBytes array with size 2 * currentValueLength when only currentValueLength
bytes are read into it.
If there is no reason for the over-allocation, fixing this should fix the problem.
public void nextRawValue(DataInputBuffer value) throws IOException { ... }
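To illustrate the allocation pattern being questioned, here is a minimal, self-contained sketch. The helper name `ensureCapacity` and the buffer sizes are hypothetical; only the sizing rule (allocate `needed << 1`, i.e. 2 * currentValueLength, when the existing buffer is too small, otherwise reuse it) is taken from the snippet above.

```java
public class ValueBufferSketch {
    // Hypothetical stand-in for the valBytes sizing logic in nextRawValue:
    // when the current buffer cannot hold the value, allocate twice the
    // required size; otherwise reuse the existing array.
    static byte[] ensureCapacity(byte[] buf, int needed) {
        return (buf.length < needed) ? new byte[needed << 1] : buf;
    }

    public static void main(String[] args) {
        byte[] buf = new byte[4];
        // Asking for 10 bytes allocates 20, although only 10 will be read.
        byte[] grown = ensureCapacity(buf, 10);
        System.out.println(grown.length);
        // A later, smaller request reuses the oversized array.
        System.out.println(ensureCapacity(grown, 8) == grown);
    }
}
```

The doubling trades memory for fewer reallocations across successive values; the question in this issue is whether that trade-off is intentional here, since each call reads exactly currentValueLength bytes.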
I ran into an issue and am struggling to find a way around it. I have a job
failing with the following output (Hadoop 2.7.0):
2019-09-04 13:20:30,026 DEBUG [main]
org.apache.hadoop.mapred.MapRFsOutputBuffer:
MapId=attempt_1567541971569_2612_m_003447_0 Reducer=133 Spill