You need to add a call to MultipleOutputs.close() in your reducer's cleanup:
@Override
protected void cleanup(Context context) throws IOException, InterruptedException {
    mos.close();
    ...
}
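For context, here is a minimal sketch of the whole pattern: open MultipleOutputs in setup(), write to a per-key named output in reduce(), and close it in cleanup(). The class name, field name, and key/value types below are illustrative assumptions, not taken from the original post.

```java
import java.io.IOException;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

// Hypothetical reducer that splits input into files named after the key.
public class SplitReducer extends Reducer<Text, Text, NullWritable, Text> {

    private MultipleOutputs<NullWritable, Text> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<NullWritable, Text>(context);
    }

    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        for (Text value : values) {
            // The third argument is the base output path, derived here from
            // the key, so each key's records land in their own file.
            mos.write(NullWritable.get(), value, key.toString());
        }
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        // Without this close(), buffered output is never flushed and the
        // named files can come out empty or missing.
        mos.close();
    }
}
```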
On Fri, May 6, 2011 at 1:55 PM, Geoffry Roberts wrote:
All,
I am attempting to take a large file and split it up into a series of
smaller files. I want the smaller files to be named based on values taken
from the large file. I am using
org.apache.hadoop.mapreduce.lib.output.MultipleOutputs to do this.
The job runs without error and produces a set o