I don't believe it's actually cleaned out then. Hadoop thinks the temp
directory exists from a previous run, which perhaps failed. Make sure it is
deleted in HDFS. This is, at least, what the error is trying to tell you.
Are you running two jobs that might both want this directory?
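A minimal sketch of both points in this thread: deleting the leftover temp directory before re-running, and launching RecommenderJob from Java with an args array instead of the CLI. The class names and options come from the thread itself; the cluster addresses are placeholders (assumptions) and must match your own setup.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.util.ToolRunner;
import org.apache.mahout.cf.taste.hadoop.item.RecommenderJob;

public class LaunchRecommender {
  public static void main(String[] ignored) throws Exception {
    Configuration conf = new Configuration();
    // Point at the cluster rather than the local runner. These addresses
    // are placeholders (assumption) -- use your own namenode/jobtracker.
    conf.set("fs.default.name", "hdfs://namenode:9000");
    conf.set("mapred.job.tracker", "jobtracker:9001");

    // Remove the temp directory left behind by a previous (failed) run;
    // otherwise the job dies with FileAlreadyExistsException on
    // temp/itemIDIndex, as seen above.
    FileSystem fs = FileSystem.get(conf);
    fs.delete(new Path("temp"), true); // true = recursive

    // The same arguments as on the command line, already split into an array.
    String[] args = {
        "--input", "mb-recouser-input/input.csv",
        "--output", "mb-recouser-output",
        "--numRecommendations", "3",
        "--booleanData", "true",
        "--similarityClassname", "SIMILARITY_EUCLIDEAN_DISTANCE"
    };
    // RecommenderJob extends AbstractJob, which implements Tool, so
    // ToolRunner handles the -D/generic options for you.
    ToolRunner.run(conf, new RecommenderJob(), args);
  }
}
```

This requires no cluster-side setup beyond the Mahout and Hadoop jars on the classpath; running it as-is obviously needs a reachable cluster, so treat it as a template rather than a drop-in class.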

2011/8/10 Clément Notin <clement.no...@gmail.com>

> Yes I agree it's ugly ;)
>
> I tried with the params
> "org.apache.mahout.cf.taste.hadoop.item.RecommenderJob
> -Dmapred.input.dir=mb-recouser-input/input.csv
> -Dmapred.output.dir=mb-recouser-output/reco.csv --numRecommendations 3
> --booleanData true --similarityClassname SIMILARITY_EUCLIDEAN_DISTANCE" (of
> course I split them).
>
> But I'm getting an error:
>  INFO [2011-08-10 14:52:05,195] (JobClient.java:871) - Cleaning up the
> staging area
>
> file:/tmp/hadoop-clement/mapred/staging/clement1957523084/.staging/job_local_0001
> org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
> temp/itemIDIndex already exists
>
> Even if I clean out the /tmp/hadoop-clement/ folder beforehand...
> And it doesn't seem to run on the cluster.
>
> 2011/8/10 Sean Owen <sro...@gmail.com>
>
> > You could just run the main() method with an array of the same arguments
> > you passed on the command line. It's a little ugly, but it works.
> >
> > 2011/8/10 Clément Notin <clement.no...@gmail.com>
> >
> > > Hello,
> > >
> > > I've managed to run a recommender on Hadoop using the command line
> > > /bin/mahout org.apache.mahout.cf.taste.hadoop.item.RecommenderJob
> > > --input .....
> > > I'm happy with it but now I want to launch this using Java.
> > >
> > > What is the easiest way to do this? I tried to run the MahoutDriver,
> > > but it runs locally, whereas I want to launch the job on a Hadoop
> > > cluster.
> > >
> > > Regards.
> > >
> > > --
> > > *Clément **Notin*
> > >
> >
>
>
>
> --
> *Clément **Notin*
>
