Oh, thank you Ioan Eugen Stan and Doug. I had not noticed that, as I had
been working with programs where they had to be null.
Thank you.
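
For the archives, this is roughly what the call looks like after applying the
suggestion below; only a sketch, assuming the mapper keeps emitting
ImmutableBytesWritable keys and Text values as SIMapper declares:

    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.io.Text;

    // Inside run(): pass the mapper's real output key/value classes instead of
    // null, so the job does not fall back to Hadoop's default map output key
    // class (LongWritable), which is what caused the type-mismatch error.
    TableMapReduceUtil.initTableMapperJob(
            args[0],                      // input table name
            s,                            // Scan instance
            SIMapper.class,               // mapper
            ImmutableBytesWritable.class, // mapper output key class
            Text.class,                   // mapper output value class
            job);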

On Sat, Jan 28, 2012 at 7:05 PM, Doug Meil <doug.m...@explorysmedical.com> wrote:

>
> In addition, see...
>
> http://hbase.apache.org/book.html#mapreduce.example
>
>
>
>
>
> On 1/28/12 6:43 AM, "Ioan Eugen Stan" <stan.ieu...@gmail.com> wrote:
>
> >2012/1/28 Vamshi Krishna <vamshi2...@gmail.com>:
> >> Hi, here I am trying to read rows from a table and put them into a file
> >> as-is. For that, my mapper class and run method are as shown below
> >> (correct me if anything is wrong).
> >>
> >> public static class SIMapper
> >>         extends TableMapper<ImmutableBytesWritable, Text> {
> >>
> >>     Configuration config = HBaseConfiguration.create();
> >>
> >>     private Text TABLE = new Text("HS3");
> >>
> >>     public void map(ImmutableBytesWritable row, Text value,
> >>             Context context) throws IOException {
> >>         try {
> >>             context.write(row, TABLE);
> >>         } catch (InterruptedException e) {
> >>             throw new IOException(e);
> >>         }
> >>     }
> >> }
> >>
> >>
> >> Run method:
> >>
> >> public int run(String[] args) throws Exception {
> >>
> >>     Job job = new Job(getConf());
> >>     job.setJobName("Job-1");
> >>     job.setJarByClass(setjar.class);
> >>
> >>     Scan s = new Scan();
> >>     s.setCacheBlocks(false);
> >>     s.setCaching(1000);
> >>
> >>     // args[0] is the table name, which is the input table for the mapper.
> >>     TableMapReduceUtil.initTableMapperJob(args[0], s, SIMapper.class,
> >>             null, null, job);
> >>     TableMapReduceUtil.addDependencyJars(job);
> >>
> >>     FileOutputFormat.setOutputPath(job, new Path(args[1]));
> >>
> >>     return job.waitForCompletion(true) ? 0 : 1;
> >> }
> >>
> >> When I try to execute the job, I get the following error. I don't know
> >> what the mistake is, or why a LongWritable is expected.
> >>
> >> 12/01/28 11:57:23 INFO mapred.JobClient:  map 0% reduce 0%
> >> 12/01/28 11:57:34 INFO mapred.JobClient: Task Id :
> >> attempt_201201281010_0004_m_000000_0, Status : FAILED
> >> java.io.IOException: Type mismatch in key from map: expected
> >> org.apache.hadoop.io.LongWritable, recieved
> >> org.apache.hadoop.hbase.io.ImmutableBytesWritable
> >>    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:845)
> >>    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:541)
> >>    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >>    at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
> >>    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >>    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
> >>    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
> >>    at org.apache.hadoop.mapred.Child.main(Child.java:170)
> >>
> >> Please, somebody help.
> >>
> >> --
> >> Regards,
> >> Vamshi Krishna
> >
> >You have to replace the two null values passed to initTableMapperJob with
> >the classes that you use for the key and value. Please see the javadoc for
> >the TableMapReduceUtil class [1].
> >
> >[1] http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
> >--
> >Ioan Eugen Stan
> >http://ieugen.blogspot.com/
> >
>
>
>


-- 
Regards,
Vamshi Krishna
