Hello Koji,

Thanks for your email. Is there an M45-specific mailing list? That would be really helpful.
I tried the hadoop queue -showacls command, and it gives me:

$ hadoop queue -showacls
Queue acls for user : sgrao
Queue  Operations
=====================
m45    submit-job

Do I need to specify this queue name in my job properties, or modify the mapred-queue-acls.xml file? I do not think I have the authorization to do the latter. I also ran an info query on the queue name I found, "m45":

$ hadoop queue -info m45
Queue Name : m45
Scheduling Info : Queue configuration
Capacity Percentage: 90.0%
User Limit: 20%
Priority Supported: NO
-------------
Map tasks
Capacity: 684 slots
Used capacity: 400 (58.5% of Capacity)
Running tasks: 200
Active users:
User 'scohen': 400 (100.0% of used capacity)
-------------
Reduce tasks
Capacity: 342 slots
Used capacity: 5 (1.5% of Capacity)
Running tasks: 5
Active users:
User 'ukang': 5 (100.0% of used capacity)
-------------
Job info
Number of Waiting Jobs: 2
Number of users who have submitted jobs: 2

How do I request a queue? I tried adding -Dmapred.queue.name to the end of the hadoop command:

$ hadoop jar /grid/0/gs/hadoop/current/hadoop-examples.jar pi 10 10000 -Dmapred.queue.name=m45

It gives me a usage error, as if I cannot specify the queue or I am using the wrong syntax. I have not been able to find the right syntax, so I am not sure how to specify the queue name or how to request a queue. I have put the syntax I plan to try next in a postscript at the bottom of this message, below the quoted output.

Regards,
Shivani

----- Original Message -----
From: "Koji Noguchi" <[email protected]>
To: "Shivani Rao" <[email protected]>, "Tim Korb" <[email protected]>
Cc: "Viraj Bhat" <[email protected]>, "Avinash C Kak" <[email protected]>, [email protected], [email protected]
Sent: Monday, February 14, 2011 1:12:49 PM
Subject: Re: Problem with running the job, no default queue

Hi Shivani,

You probably don't want to ask M45-specific questions on the hadoop.apache mailing list.

Try
% hadoop queue -showacls

It should show which queues you're allowed to submit to. If it doesn't give you any queues, you need to request one.

Koji

On 2/9/11 9:10 PM, "Shivani Rao" <[email protected]> wrote:

I tried a simple example job with Yahoo M45. The job fails because the default queue does not exist; the output is attached below. From the Apache Hadoop mailing list, I found this post (specific to M45) that attacked this problem by setting the property -Dmapred.job.queue.name=*myqueue* ( http://web.archiveorange.com/archive/v/3inw3ySGHmNRR9Bm14Uv ). There is also documentation for capacity schedulers, but I do not have write access to the files in the conf directory, so I do not know how I can configure the capacity scheduler there. I am also posting this question on the general list, just in case.
$ hadoop jar /grid/0/gs/hadoop/current/hadoop-examples.jar pi 10 10000
Number of Maps = 10
Samples per Map = 10000
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
11/02/10 04:19:22 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 83705 for sgrao
11/02/10 04:19:22 INFO security.TokenCache: Got dt for hdfs://grit-nn1.yahooresearchcluster.com/user/sgrao/.staging/job_201101150035_26053;uri=68.180.138.10:8020;t.service=68.180.138.10:8020
11/02/10 04:19:22 INFO mapred.FileInputFormat: Total input paths to process : 10
11/02/10 04:19:23 INFO mapred.JobClient: Cleaning up the staging area hdfs://grit-nn1.yahooresearchcluster.com/user/sgrao/.staging/job_201101150035_26053
org.apache.hadoop.ipc.RemoteException: java.io.IOException: Queue "default" does not exist
    at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3680)
    at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:523)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1301)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1297)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1062)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1295)
    at org.apache.hadoop.ipc.Client.call(Client.java:951)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:223)
    at org.apache.hadoop.mapred.$Proxy6.submitJob(Unknown Source)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:818)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:752)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1062)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:752)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:726)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1156)
    at org.apache.hadoop.examples.PiEstimator.estimate(PiEstimator.java:297)
    at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:342)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:351)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
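
P.S. Based on the archived post (which uses mapred.job.queue.name rather than mapred.queue.name) and on the ToolRunner frame in the stack trace above, my guess is that the generic -D option has to come before the program's own arguments, not after them. This is only a sketch of what I plan to try next, not something I have confirmed works on M45:

$ hadoop jar /grid/0/gs/hadoop/current/hadoop-examples.jar pi -Dmapred.job.queue.name=m45 10 10000

If the generic options parser picks up the property, the job should go to the m45 queue instead of the non-existent "default" one.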
