Re: issue map/reduce job to linux hadoop cluster from MS Windows, Eclipse

2008-12-14 Thread Aaron Kimball
Songting,

If you set mapred.job.tracker to jobtrackeraddr:9001 and fs.default.name to
hdfs://hdfsservername:9000/ in the conf, it will connect to the remote
server and run the job there. The trick is that you'll need to do this from
a jar file, not the loose .class files that Eclipse generates by default.
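As a minimal sketch, the two properties above can go in the client's hadoop-site.xml so every job picks them up (hostnames here are the placeholder names from this thread, not real servers):

```
<!-- hadoop-site.xml on the Windows/Eclipse client (placeholder hostnames) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hdfsservername:9000/</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtrackeraddr:9001</value>
  </property>
</configuration>
```

Equivalently you can call conf.set("fs.default.name", ...) and conf.set("mapred.job.tracker", ...) on the JobConf before submitting.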
- Aaron

On Sat, Dec 13, 2008 at 7:07 PM, Songting Chen ken_cst1...@yahoo.com wrote:

 Is it possible to do that?

 I can access files at HDFS by specifying the URI below.
 FileSystem fileSys = FileSystem.get(new URI(hdfs://server:9000), conf);

 But I don't know how to do that for JobConf.

 Thanks,
 -Songting



issue map/reduce job to linux hadoop cluster from MS Windows, Eclipse

2008-12-13 Thread Songting Chen
Is it possible to do that?

I can access files on HDFS by specifying the URI below.
FileSystem fileSys = FileSystem.get(new URI("hdfs://server:9000"), conf);

But I don't know how to do that for JobConf.

Thanks,
-Songting