Re: RE: Making Mumak work with capacity scheduler

2011-09-23 Thread arun k
Sorry, 1Q: In the web GUI of the JobTracker I see both the queues but CAPACITIES ARE NOT REFLECTED. 2Q: All the jobs by default are submitted to the default queue. How can I submit jobs to various queues in Mumak? regards, Arun On Fri, Sep 23, 2011 at 11:57 AM, arun k arunk...@gmail.com wrote: Hi guys !
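
[Editor's note on 2Q: in stock Hadoop 0.20/0.21 a job is directed to a named capacity-scheduler queue through the mapred.job.queue.name property; whether Mumak's simulated JobTracker honours this is an assumption here, but the submission side would look roughly like the sketch below. "queueB" is a placeholder queue name.]

    // Minimal sketch: submitting a job to a named capacity-scheduler queue.
    // "queueB" is a placeholder; it must appear in mapred.queue.names and
    // have a capacity configured in capacity-scheduler.xml.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class QueueSubmitSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("mapred.job.queue.name", "queueB");   // instead of "default"
            Job job = new Job(conf, "queue-demo");
            // ... set jar, mapper, reducer, input/output paths as usual ...
            // Equivalent from the command line (with ToolRunner):
            //   hadoop jar myjob.jar MyJob -Dmapred.job.queue.name=queueB ...
        }
    }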

Running Hadoop in different modes

2011-09-23 Thread Praveen Sripati
Hi, What are the features available in the Fully-Distributed Mode and the Pseudo-Distributed Mode that are not available in the Local (Standalone) Mode? Local (Standalone) Mode is very fast and I am able to get it running in Eclipse also. Thanks, Praveen

Re: Running Hadoop in different modes

2011-09-23 Thread Harsh J
Hello Praveen, Is your question from a test-case perspective? Because otherwise, is it not clear what you gain in 'Distributed' vs. 'Standalone'? On Fri, Sep 23, 2011 at 12:15 PM, Praveen Sripati praveensrip...@gmail.com wrote: Hi, What are the features available in the Fully-Distributed Mode

Re: Re: How do I set the intermediate output path when I use 2 mapreduce jobs?

2011-09-23 Thread 谭军
Swathi.V., ControlledJob cannot be resolved in my Eclipse. My Hadoop version is 0.20.2. Can ControlledJob only be resolved in Hadoop 0.21.0 (+)? Or do I need certain plugins? Thanks -- Regards! Jun Tan At 2011-09-22 00:56:54, Swathi V swat...@zinniasystems.com wrote: Hi, This code might

Re: Re: How do I set the intermediate output path when I use 2 mapreduce jobs?

2011-09-23 Thread Swathi V
Hi Jun Tan, Yes, I use the 0.21.0 version, so I have used those classes. Also, the Hadoop Definitive Guide has job dependency examples for 0.20.x. Thank You, 2011/9/23 谭军 tanjun_2...@163.com Swathi.V., ControlledJob cannot be resolved in my Eclipse. My Hadoop version is 0.20.2. Can ControlledJob only be
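
[Editor's note: since ControlledJob only exists from 0.21.0 onwards, the 0.20.2 equivalent is the old-API JobControl mentioned above. A minimal sketch; the two JobConf objects are assumed to be configured elsewhere and are not Jun Tan's actual jobs.]

    // Sketch: job dependencies on Hadoop 0.20.x with the old-API JobControl,
    // since org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob is 0.21+.
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.jobcontrol.Job;
    import org.apache.hadoop.mapred.jobcontrol.JobControl;

    public class ChainSketch {
        public static void main(String[] args) throws Exception {
            JobConf conf1 = new JobConf();   // first job's configuration (set elsewhere)
            JobConf conf2 = new JobConf();   // second job's configuration (set elsewhere)

            Job step1 = new Job(conf1);
            Job step2 = new Job(conf2);
            step2.addDependingJob(step1);    // step2 starts only after step1 succeeds

            JobControl control = new JobControl("two-step-chain");
            control.addJob(step1);
            control.addJob(step2);

            new Thread(control).start();     // JobControl implements Runnable
            while (!control.allFinished()) {
                Thread.sleep(1000);
            }
            control.stop();
        }
    }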

Re: Re: Re: How do I set the intermediate output path when I use 2 mapreduce jobs?

2011-09-23 Thread 谭军
Hi Swathi.V., I think my code below would work: Configuration conf1 = new Configuration(); Job job1 = new Job(conf1, "Retrieval1"); job1.setJarByClass(Retrieval.class); job1.addCacheFile(new URI(args[0])); // problem here conf1.set(keyNodeFile,
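
[Editor's note on the thread's original question of the intermediate output path: a minimal sketch of the straightforward alternative is to run the two jobs sequentially and point the second job's input at the first job's output directory. "/tmp/step1-out" and the mapper/reducer setup are placeholders, not Jun Tan's actual classes.]

    // Sketch: chaining two new-API jobs through an intermediate directory.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class TwoStepSketch {
        public static void main(String[] args) throws Exception {
            Path input = new Path(args[0]);
            Path intermediate = new Path("/tmp/step1-out");   // placeholder intermediate dir
            Path output = new Path(args[1]);

            Job job1 = new Job(new Configuration(), "step1");
            // ... setJarByClass / setMapperClass / setReducerClass for step 1 ...
            FileInputFormat.addInputPath(job1, input);
            FileOutputFormat.setOutputPath(job1, intermediate);
            if (!job1.waitForCompletion(true)) {
                System.exit(1);                               // stop if step 1 fails
            }

            Job job2 = new Job(new Configuration(), "step2");
            // ... setJarByClass / setMapperClass / setReducerClass for step 2 ...
            FileInputFormat.addInputPath(job2, intermediate); // step 1's output feeds step 2
            FileOutputFormat.setOutputPath(job2, output);
            System.exit(job2.waitForCompletion(true) ? 0 : 1);
        }
    }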

Re: Running Hadoop in different modes

2011-09-23 Thread Praveen Sripati
Harsh, I am exploring the different features of Hadoop and have set up the Standalone mode in Eclipse, and would like to know what features it does/doesn't support. For some reason, running Hadoop in distributed mode from Eclipse gives an exception. Thanks, Praveen On Fri, Sep 23, 2011 at 1:10
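
[Editor's note: for reference, a minimal sketch of how the modes differ purely at the configuration level. The values below are the conventional 0.20.x defaults/examples, not anything specific to Praveen's setup; local mode runs everything in one JVM via LocalJobRunner on the local filesystem, which is why it works inside Eclipse, while the distributed modes expect separately started daemons.]

    // Sketch: the mode is chosen by configuration alone (0.20.x property names).
    import org.apache.hadoop.conf.Configuration;

    public class ModeSketch {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Local (standalone): single JVM, LocalJobRunner, local filesystem.
            conf.set("fs.default.name", "file:///");
            conf.set("mapred.job.tracker", "local");

            // Pseudo-distributed: daemons on localhost (conventional example ports).
            // conf.set("fs.default.name", "hdfs://localhost:9000");
            // conf.set("mapred.job.tracker", "localhost:9001");
        }
    }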

The method addCacheFIle(URI) is undefined for the type Job

2011-09-23 Thread 谭军
Hi, I encountered an error that I cannot understand. Configuration conf = new Configuration(); Job job = new Job(conf, "job1"); job.addCacheFile(new URI(args[0])); Why did it report "The method addCacheFile(URI) is undefined for the type Job"? Thanks! -- Regards! Jun Tan

Re: The method addCacheFIle(URI) is undefined for the type Job

2011-09-23 Thread Harsh J
Jun, A common cause is that your URI class is not the right import. It must be java.net.URI and not any other class. Fix this and your problem should go away. 2011/9/23 谭军 tanjun_2...@163.com: Hi, I encountered an error that I cannot understand. Configuration conf = new Configuration(); Job

Re: Re: Re: How do I set the intermediate output path when I use 2 mapreduce jobs?

2011-09-23 Thread Swathi V
Hi Jun Tan, 1. Distributed Cache usage in the new API: // Setting up the cache for the application 1. Copy the requisite files to the FileSystem: $ bin/hadoop fs -copyFromLocal lookup.dat /myapp/lookup.dat $ bin/hadoop fs -copyFromLocal map.zip /myapp/map.zip $ bin/hadoop fs
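
[Editor's note: the rest of that snippet is cut off in the archive. A minimal self-contained sketch of the same idea on 0.21.0 follows; the /myapp/lookup.dat path follows the quoted commands, and the mapper types are placeholders.]

    // Sketch (0.21.0 new API): register a cache file at submission time and
    // read its local copy in the mapper's setup().
    import java.io.IOException;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CacheDemo {

        public static class CacheMapper
                extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void setup(Context context)
                    throws IOException, InterruptedException {
                // Local copies of the cached files on this task's node.
                Path[] cached =
                    DistributedCache.getLocalCacheFiles(context.getConfiguration());
                // open cached[0] with java.io and load the lookup data ...
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "cache-demo");
            job.setJarByClass(CacheDemo.class);
            job.setMapperClass(CacheMapper.class);
            // "/myapp/lookup.dat" follows the HDFS path used in the quoted mail.
            job.addCacheFile(new URI("/myapp/lookup.dat"));
            // set input/output paths and the rest of the job as usual ...
        }
    }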

Hadoop java mapper -copyFromLocal heap size error

2011-09-23 Thread Joris Poort
As part of my Java mapper I have a command that executes some code on the local node and copies a local output file to the Hadoop fs. Unfortunately I'm getting the following output: "Error occurred during initialization of VM. Could not reserve enough space for object heap". I've tried adjusting
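
[Editor's note: the quoted mail is truncated. One hedged observation: if the copy is done by shelling out to "bin/hadoop fs -copyFromLocal", the spawned JVM needs its own heap on top of the task JVM's, which is a common trigger for that error; passing an explicit small -Xmx to the spawned command is one usual workaround. An alternative sketch that stays inside the task JVM by using the FileSystem API is shown below; paths and types are placeholders, not Joris's actual files.]

    // Sketch: copying the local result into HDFS with the FileSystem API from
    // inside the mapper, instead of spawning a second "hadoop fs" JVM.
    import java.io.IOException;

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class CopySketchMapper
            extends Mapper<LongWritable, Text, NullWritable, NullWritable> {
        @Override
        protected void cleanup(Context context)
                throws IOException, InterruptedException {
            // ... the external tool has already written /tmp/local-output.dat ...
            FileSystem fs = FileSystem.get(context.getConfiguration());
            fs.copyFromLocalFile(new Path("/tmp/local-output.dat"),
                                 new Path("/user/joris/results/local-output.dat"));
        }
    }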

Re: Re: The method addCacheFIle(URI) is undefined for the type Job

2011-09-23 Thread 谭军
Harsh, It is java.net.URI that is imported. -- Regards! Jun Tan At 2011-09-24 00:52:14,Harsh J ha...@cloudera.com wrote: Jun, Common cause is that your URI class is not the right import. It must be java.net.URI and not any other class. Fix this and your problem would go away. 2011/9/23

Re: Re: Re: The method addCacheFIle(URI) is undefined for the type Job

2011-09-23 Thread Joey Echeverria
I think the API call you're looking for is DistributedCache.addCacheFile(URI, Configuration) [1] -Joey [1] http://hadoop.apache.org/common/docs/r0.20.2/api/org/apache/hadoop/filecache/DistributedCache.html#addCacheFile(java.net.URI, org.apache.hadoop.conf.Configuration) 2011/9/23 谭军
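
[Editor's note: in code, Joey's suggestion for 0.20.2 looks roughly like the sketch below, registering the file on the job's Configuration rather than calling a method on Job; the surrounding setup mirrors the code quoted earlier in the thread.]

    // Sketch: the 0.20.2 equivalent of job.addCacheFile(...), using the
    // static DistributedCache helper that Joey linked to.
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapreduce.Job;

    public class Cache020Sketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "Retrieval1");
            // Register the cache file on the job's own Configuration, since
            // Job copies the Configuration it is constructed with.
            DistributedCache.addCacheFile(new URI(args[0]), job.getConfiguration());
            // ... the rest of the job setup as before ...
        }
    }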

Re: Re: Re: Re: The method addCacheFIle(URI) is undefined for the type Job

2011-09-23 Thread 谭军
Joey Echeverria, Yes, that works. I thought job.addCacheFile(new URI(args[0])); would work on hadoop-0.20.2, because hadoop-0.20.2 supports the context objects (new API). Thanks! -- Regards! Jun Tan At 2011-09-24 11:54:42, Joey Echeverria j...@cloudera.com wrote: I think the API call you're looking for