I think you also need to stop or kill the process that submits the
RecommenderJob to Hadoop.
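Something along these lines might work as a sketch (untested; it assumes
Mahout's org.apache.mahout.cf.taste.hadoop.item.RecommenderJob and plain
java.util.concurrent, and the class name is only illustrative):

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.mahout.cf.taste.hadoop.item.RecommenderJob;

    public class CancellableRecommenderDriver {
      public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        final String[] jobArgs = args; // --input, --output, etc.
        ExecutorService executor = Executors.newSingleThreadExecutor();
        Future<Integer> result = executor.submit(new Callable<Integer>() {
          public Integer call() throws Exception {
            RecommenderJob job = new RecommenderJob();
            job.setConf(conf);
            // Blocks until every internal Hadoop job has finished.
            return job.run(jobArgs);
          }
        });
        // Later, when you want to stop the submitting side, interrupt
        // the worker thread. Note: this only stops the driver; any
        // MapReduce job that was already submitted keeps running on the
        // cluster until you kill it there too (see the kill-by-ID
        // snippet further down).
        result.cancel(true);
        executor.shutdown();
      }
    }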
Regards,
Ramon

> Date: Mon, 2 Apr 2012 19:05:27 +0100
> Subject: Re: Cancel running distributed RecommenderJob
> From: sro...@gmail.com
> To: user@mahout.apache.org
> 
> You can use the Hadoop interface itself (e.g. the command-line hadoop
> tool) to kill a job by its ID. If you kill one MapReduce job, the
> entire process should halt after that.
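The commands are "hadoop job -list" to find the ID and "hadoop job -kill
<job-id>" to stop it. To do the same from Java, something like this
should work (untested sketch against the old org.apache.hadoop.mapred
API of Hadoop 0.20/1.x; the job ID below is only a placeholder):

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.RunningJob;

    public class KillJobById {
      public static void main(String[] args) throws Exception {
        // Connects to the JobTracker configured on the classpath.
        JobClient client = new JobClient(new JobConf());
        // Substitute the real ID as reported by "hadoop job -list".
        RunningJob running =
            client.getJob(JobID.forName("job_201204021234_0001"));
        if (running != null && !running.isComplete()) {
          running.killJob(); // equivalent to "hadoop job -kill <job-id>"
        }
      }
    }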
> 
> On Mon, Apr 2, 2012 at 6:44 PM, Sören Brunk <soren.br...@deri.org> wrote:
> > Hi,
> >
> > I'm using the distributed RecommenderJob from within a Java program.
> > For that, in a separate thread, I create a RecommenderJob object, call
> > setConf() with the Hadoop configuration, and then call run() with the
> > job parameters.
> > This works fine for me, but now I would like to be able to stop a
> > running job.
> > I'm not sure whether that's possible at all, since RecommenderJob
> > encapsulates several Hadoop jobs (or even other Mahout jobs that call
> > Hadoop in turn) and runs them in a blocking way.
> >
> > This would be interesting for other Mahout jobs as well.
> > Any ideas?
> >
> > Thanks,
> >
> > --
> > Sören Brunk
> > Research Assistant
> > Data Intensive Infrastructures Unit (DI2)
> > Digital Enterprise Research Institute
> > National University of Ireland Galway
> >