Hi,
      You could also use Apache Commons Logging to write logs in your
map/reduce functions; those logs will show up in the JobTracker UI.
That's how we did our debugging :)
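A minimal sketch of what that could look like with the 0.18-era API (the class name and log messages here are illustrative, not from this thread, and the Hadoop and commons-logging jars must be on the classpath):

```java
import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Illustrative mapper: messages written through Commons Logging end up in
// the per-task log files, which are viewable from the JobTracker web UI.
public class DebugMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, LongWritable> {

  private static final Log LOG = LogFactory.getLog(DebugMapper.class);

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, LongWritable> output,
                  Reporter reporter) throws IOException {
    // Log the raw input record so you can see exactly what the mapper gets.
    LOG.info("map() called with key=" + key + " value=[" + value + "]");
    // ... actual map logic goes here ...
  }
}
```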

Hope it helps
Regards,
Raakhi


On Tue, Jun 16, 2009 at 7:29 PM, jason hadoop <jason.had...@gmail.com>wrote:

> When you are running in local mode you have two basic choices if you want
> to interact with a debugger.
> You can launch from within Eclipse or another IDE, or you can set up a Java
> debugger transport as part of the mapred.child.java.opts variable and
> attach to the running JVM.
> By far the simplest is launching via Eclipse.
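A hedged sketch of the second approach (the port number and heap setting are illustrative, not from the original message); each task's child JVM then listens for a debugger connection:

```xml
<!-- hadoop-site.xml: have each child JVM open a JDWP debug socket.
     suspend=y makes the JVM wait until a debugger attaches; use suspend=n
     if you only want the option of attaching. Port 8000 is arbitrary. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx200m -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000</value>
</property>
```

Note that if several task JVMs run concurrently they will all try to bind the same port, so this is most practical with a single task slot per node.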
>
> Your other alternative is to inform the framework to retain the job files
> via keep.failed.task.files (be careful here: you will fill your disk with
> old, dead data) and then debug with the IsolationRunner.
>
> Examples in my book :)
>
>
> On Mon, Jun 15, 2009 at 6:49 PM, bharath vissapragada <
> bharathvissapragada1...@gmail.com> wrote:
>
> > I am running in local mode. Can you tell me how to set those breakpoints,
> > or how to access those files, so that I can debug the program?
> >
> > The program is generating => java.lang.NumberFormatException: For input
> > string: ""
> >
> > But that particular string is the one that is the input to the map class,
> > so I think it is not reading my input correctly. But when I try to print
> > it, nothing appears on STDOUT.
> > I am using the FileInputFormat class:
> >
> >   FileInputFormat.addInputPath(conf,
> >       new Path("/home/rip/Desktop/hadoop-0.18.3/input"));
> >   FileOutputFormat.setOutputPath(conf,
> >       new Path("/home/rip/Desktop/hadoop-0.18.3/output"));
> >
> > input and output are the folders for input and output, respectively.
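As an aside, `java.lang.NumberFormatException: For input string: ""` is what `Integer.parseInt` throws when handed an empty string, which usually means the mapper is splitting a blank line or an empty field. A small self-contained sketch of a defensive guard (the class and helper names are made up for illustration):

```java
// Guard against blank lines / empty fields before parsing, instead of
// letting Integer.parseInt("") throw NumberFormatException.
public class SafeParse {
  /** Returns the parsed value, or the fallback for null/empty/blank input. */
  public static int parseOrDefault(String s, int fallback) {
    if (s == null || s.trim().length() == 0) {
      return fallback;
    }
    return Integer.parseInt(s.trim());
  }

  public static void main(String[] args) {
    System.out.println(parseOrDefault("42", -1));   // 42
    System.out.println(parseOrDefault("", -1));     // -1
    System.out.println(parseOrDefault("  7 ", -1)); // 7
  }
}
```

Logging the offending record (or returning a sentinel like the fallback above) makes it obvious which input line the job is choking on.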
> >
> > It is generating these warnings also
> >
> > 09/06/16 12:38:32 WARN fs.FileSystem: "local" is a deprecated filesystem
> > name. Use "file:///" instead.
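That warning suggests fs.default.name (or an equivalent setting) is still set to the old-style value "local"; assuming that is the cause, the fix is to switch to the URI form, as the message itself says:

```xml
<!-- hadoop-site.xml: "file:///" replaces the deprecated name "local" -->
<property>
  <name>fs.default.name</name>
  <value>file:///</value>
</property>
```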
> >
> > Thanks in advance
> >
> >
> > On Tue, Jun 16, 2009 at 3:50 AM, Aaron Kimball <aa...@cloudera.com>
> wrote:
> >
> > > On Mon, Jun 15, 2009 at 10:01 AM, bharath vissapragada <
> > > bhara...@students.iiit.ac.in> wrote:
> > >
> > > > Hi all ,
> > > >
> > > > When running Hadoop in local mode, can we use "print" statements to
> > > > print something to the terminal?
> > >
> > >
> > > Yes. In distributed mode, each task will write its stdout/stderr to
> > > files which you can access through the web-based interface.
> > >
> > >
> > > >
> > > > Also, I am not sure whether the program is reading my input files
> > > > at all; if I keep print statements, it isn't displaying any. Can
> > > > anyone tell me how to solve this problem?
> > >
> > >
> > > Is it generating exceptions? Are the files present? If you're running
> > > in local mode, you can use a debugger; set a breakpoint in your map()
> > > method and see if it gets there. How are you configuring the input
> > > files for your job?
> > >
> > >
> > > >
> > > >
> > > > Thanks in advance,
> > > >
> > >
> >
>
>
>
> --
> Pro Hadoop, a book to guide you from beginner to hadoop mastery,
> http://www.amazon.com/dp/1430219424?tag=jewlerymall
> www.prohadoopbook.com a community for Hadoop Professionals
>
