That was it! Thank you.
On Friday, June 6, 2014 3:12:39 PM UTC-7, Costin Leau wrote:
>
> By the way, quickly looking at your class, it's likely because your
> mapper/reducer are defined as inner classes, yet they are not static and
> thus cannot be used without their enclosing class. In other words, declare
> them as 'static'.
>
> On Sat, Jun 7, 2014 at 1:00 AM, Costin Leau <costi...@gmail.com> wrote:
>
>> "Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodException:
>> com.edcast.cards.MapReduceHelloWorld$SomeMapper.<init>()
>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
>>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
>>     ... 15 more
>> Caused by: java.lang.NoSuchMethodException:
>> com.edcast.cards.MapReduceHelloWorld$SomeMapper.<init>()
>>     at java.lang.Class.getConstructor0(Class.java:2810)
>>     at java.lang.Class.getDeclaredConstructor(Class.java:2053)
>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)"
>>
>> In other words, your Mapper has no default constructor defined for it.
>> That is, Hadoop cannot instantiate your mapper since your class definition
>> is incorrect.
>>
>> On Sat, Jun 7, 2014 at 12:52 AM, bharath bhat <bharat...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to get a simple MapReduce job working with ES-Hadoop. I
>>> followed the docs to set up a job, but I keep getting 'Error in configuring
>>> object' when I try to run it on Hadoop in pseudo-distributed mode. I am
>>> using the old API with Hadoop 2.4.0.
>>>
>>> Here's the code I am using:
>>>
>>> public class SomeMapper extends MapReduceBase
>>>         implements Mapper<Object, Object, Text, MapWritable> {
>>>
>>>     // Dummy
>>>     public void map(Object key, Object value,
>>>             OutputCollector<Text, MapWritable> output, Reporter reporter)
>>>             throws IOException {
>>>         Text docId = (Text) key;
>>>         MapWritable doc = (MapWritable) value;
>>>         output.collect(docId, doc);
>>>     }
>>> }
>>>
>>> public class SomeReducer extends MapReduceBase
>>>         implements Reducer<Text, MapWritable, Text, MapWritable> {
>>>
>>>     // Dummy
>>>     public void reduce(Text key, Iterator<MapWritable> values,
>>>             OutputCollector<Text, MapWritable> output, Reporter reporter)
>>>             throws IOException {
>>>         Text docId = (Text) key;
>>>         while (values.hasNext()) {
>>>             MapWritable out = (MapWritable) values.next();
>>>             output.collect(docId, out);
>>>         }
>>>     }
>>> }
>>>
>>> public static void main(String[] args) throws Exception {
>>>
>>>     JobConf conf = new JobConf();
>>>     conf.setJobName("elastic search hello world");
>>>     conf.setSpeculativeExecution(false);
>>>
>>>     conf.set("es.nodes", "localhost:9200");
>>>     conf.set("es.resource", "answers_development/answer");
>>>     conf.set("es.resource.read", "answers_development/answer");
>>>     conf.set("es.resource.write", "questions_development_20140603205720870/question");
>>>     conf.set("es.query", "{}");
>>>
>>>     conf.setOutputFormat(EsOutputFormat.class);
>>>     conf.setInputFormat(EsInputFormat.class);
>>>
>>>     conf.setMapOutputKeyClass(Text.class);
>>>     conf.setMapOutputValueClass(MapWritable.class);
>>>
>>>     conf.setMapperClass(SomeMapper.class);
>>>     conf.setReducerClass(SomeReducer.class);
>>>
>>>     JobClient.runJob(conf);
>>> }
>>>
>>> Here's the relevant portion of the stack trace:
>>>
>>> 14/06/06 21:35:27 INFO mapred.MapTask: Map output collector class =
>>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer
>>> 14/06/06 21:35:28 INFO mapred.LocalJobRunner: map task executor complete.
>>> 14/06/06 21:35:28 WARN mapred.LocalJobRunner: job_local141518769_0001
>>> java.lang.Exception: java.lang.RuntimeException: Error in configuring object
>>>     at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
>>>     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
>>> Caused by: java.lang.RuntimeException: Error in configuring object
>>>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
>>>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
>>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>     at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>     at java.lang.Thread.run(Thread.java:744)
>>> Caused by: java.lang.reflect.InvocationTargetException
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
>>>     ... 10 more
>>> Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodException:
>>> com.edcast.cards.MapReduceHelloWorld$SomeMapper.<init>()
>>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
>>>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
>>>     ... 15 more
>>> Caused by: java.lang.NoSuchMethodException:
>>> com.edcast.cards.MapReduceHelloWorld$SomeMapper.<init>()
>>>     at java.lang.Class.getConstructor0(Class.java:2810)
>>>     at java.lang.Class.getDeclaredConstructor(Class.java:2053)
>>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
>>>     ... 16 more
>>> 14/06/06 21:35:28 INFO mapreduce.Job: Job job_local141518769_0001 failed
>>> with state FAILED due to: NA
>>> 14/06/06 21:35:28 INFO mapreduce.Job: Counters: 0
>>> Exception in thread "main" java.io.IOException: Job failed!
>>>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
>>>     at com.edcast.cards.MapReduceHelloWorld.main(MapReduceHelloWorld.java:79)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>>>
>>> I am able to run the Hadoop MR example jars, so I'm guessing there's
>>> something wrong with the ES integration. I'm new to ES-Hadoop, apologies if
>>> I've missed something basic. Any help would be great. Thanks!
>>>
>>> Bharath
>>>
>>> --
>>> You received this message because you are subscribed to the Google
>>> Groups "elasticsearch" group.
>>> To unsubscribe from this group and stop receiving emails from it, send
>>> an email to elasticsearc...@googlegroups.com.
>>> To view this discussion on the web visit
>>> https://groups.google.com/d/msgid/elasticsearch/b76494ce-20d0-496d-a500-181cd4b75537%40googlegroups.com.
>>> For more options, visit https://groups.google.com/d/optout.

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email to elasticsearch+unsubscr...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/78c31177-66d9-4133-aa3d-d06f66d19cc2%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
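[Editor's note] The root cause in this thread can be reproduced outside Hadoop entirely. A non-static inner class has no true no-arg constructor: the compiler gives its constructor a hidden parameter for the enclosing instance, so the reflective zero-argument lookup that Hadoop's ReflectionUtils performs throws NoSuchMethodException, while a 'static' nested class works. A minimal sketch using only the JDK; the class names here are illustrative, not from the thread:

```java
// Demonstrates why a non-static inner class cannot be instantiated
// reflectively the way Hadoop instantiates mappers and reducers.
public class NestedCtorDemo {

    class InnerMapper {}          // non-static: only ctor is InnerMapper(NestedCtorDemo)

    static class NestedMapper {}  // static: has a genuine no-arg constructor

    public static void main(String[] args) throws Exception {
        // The static nested class instantiates fine via reflection,
        // mirroring how ReflectionUtils.newInstance creates task classes.
        Object ok = NestedMapper.class.getDeclaredConstructor().newInstance();
        System.out.println("NestedMapper: " + ok.getClass().getSimpleName());

        // Asking the inner class for a zero-argument constructor throws,
        // matching the NoSuchMethodException ... <init>() in the trace above.
        try {
            InnerMapper.class.getDeclaredConstructor().newInstance();
            System.out.println("InnerMapper: instantiated");
        } catch (NoSuchMethodException e) {
            System.out.println("InnerMapper: NoSuchMethodException");
        }
    }
}
```

Running this prints "NestedMapper: NestedMapper" followed by "InnerMapper: NoSuchMethodException". Accordingly, declaring SomeMapper and SomeReducer as static nested classes (or moving them to top level) gives them the no-arg constructor Hadoop needs, which is the fix the thread arrives at.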