Combiners are supposed to emit the same key/value types they receive,
since their output goes right back into the shuffle. Hadoop should
check what the combiner emits and fail fast.
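The mismatch can be reproduced without Hadoop at all. A minimal sketch in plain Java (all class names here are illustrative stand-ins, not real Hadoop Writable types):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch (plain Java, no Hadoop; class names are illustrative) of
// why a combiner must emit the same value type the mapper emits: combiner
// output is fed back into the same typed shuffle the reducer reads from.
public class CombinerTypeContract {
    // Stand-in for the job's declared map output value type.
    static class IntArrayValue {
        final int[] ints;
        IntArrayValue(int[] ints) { this.ints = ints; }
    }

    // Stand-in for Text, the type the broken combiner actually emits.
    static class TextValue {
        final String s;
        TextValue(String s) { this.s = s; }
    }

    static String simulate() {
        // A reducer reused as a combiner writes TextValue instead of the
        // declared IntArrayValue, as in the snippet in the thread.
        List<Object> combinerOutput = new ArrayList<>();
        combinerOutput.add(new TextValue("some_stuff"));

        // The reduce side still expects IntArrayValue, so the cast fails
        // at runtime, mirroring the error Hadoop reports.
        try {
            for (Object v : combinerOutput) {
                IntArrayValue unused = (IntArrayValue) v;
            }
            return "no error";
        } catch (ClassCastException e) {
            return "ClassCastException: combiner changed the value type";
        }
    }

    public static void main(String[] args) {
        System.out.println(simulate());
    }
}
```

The usual fix is to give the job a separate combiner class whose output value type matches the map output type, and keep the Text-emitting reduce logic only in the reducer proper.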

On Thu, Aug 18, 2011 at 9:32 PM, Harsh J <ha...@cloudera.com> wrote:
> Yep, as John points out, your trouble is related to what your
> reducer/combiner is emitting, from your snippet below:
>
> On Fri, Aug 19, 2011 at 2:14 AM, vipul sharma <sharmavipulw...@gmail.com> 
> wrote:
>>     @Override
>>     public void reduce(Text key, Iterable<IntArrayWritable> values, Context
>> context) throws IOException, InterruptedException {
>>            context.write(key, new Text(some_stuff));
>
> Since these emits would occur on the map side, you lose the
> ArrayWritable typing there because your value has now become Text.
>
> --
> Harsh J
>



-- 
Lance Norskog
goks...@gmail.com
