Re: Distributed Shell example in Capacity Scheduler

2012-04-01 Thread raghavendhra rahul
Any Ideas?

On Tue, Mar 27, 2012 at 2:52 PM, raghavendhra rahul <
raghavendhrara...@gmail.com> wrote:

> Hi,
> When I try to run the randomwriter example under the Capacity Scheduler
> it works fine, but when I run the distributed shell example under the
> Capacity Scheduler it throws the following exception:
>
> RemoteTrace:
> java.lang.NullPointerException
>   at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:796)
>   at org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler.getQueueInfo(CapacityScheduler.java:507)
>   at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getQueueInfo(ClientRMService.java:383)
>   at org.apache.hadoop.yarn.api.impl.pb.service.ClientRMProtocolPBServiceImpl.getQueueInfo(ClientRMProtocolPBServiceImpl.java:181)
>   at org.apache.hadoop.yarn.proto.ClientRMProtocol$ClientRMProtocolService$2.callBlockingMethod(ClientRMProtocol.java:188)
>   at org.apache.hadoop.yarn.ipc.ProtoOverHadoopRpcEngine$Server.call(ProtoOverHadoopRpcEngine.java:352)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1608)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1604)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:416)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1167)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1602)
> at LocalTrace:
> org.apache.hadoop.yarn.exceptions.impl.pb.YarnRemoteExceptionPBImpl:
>   at org.apache.hadoop.yarn.ipc.ProtoOverHadoopRpcEngine$Invoker.invoke(ProtoOverHadoopRpcEngine.java:160)
>   at $Proxy6.getQueueInfo(Unknown Source)
>   at org.apache.hadoop.yarn.api.impl.pb.client.ClientRMProtocolPBClientImpl.getQueueInfo(ClientRMProtocolPBClientImpl.java:223)
>   at org.apache.hadoop.yarn.applications.distributedshell.Client.run(Client.java:356)
>   at org.apache.hadoop.yarn.applications.distributedshell.Client.main(Client.java:188)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:616)
>   at org.apache.hadoop.util.RunJar.main(RunJar.java:200)
>
>
> Thank You
>
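The trace bottoms out in ConcurrentHashMap.get, which throws NullPointerException only for a null key; a merely missing queue name would just return null. One plausible reading is that CapacityScheduler.getQueueInfo was handed a null queue name by the distributed shell client. A minimal stdlib sketch of that difference (the queue map and names here are hypothetical stand-ins, not the scheduler's actual state):

```java
import java.util.concurrent.ConcurrentHashMap;

public class QueueLookup {
    public static void main(String[] args) {
        // Stand-in for the scheduler's internal queue map.
        ConcurrentHashMap<String, String> queues = new ConcurrentHashMap<>();
        queues.put("default", "RUNNING");

        // An unknown queue name just returns null -- no exception.
        System.out.println(queues.get("no-such-queue")); // prints "null"

        // A null queue name throws NPE, matching the trace above.
        try {
            queues.get(null);
        } catch (NullPointerException e) {
            System.out.println("NPE on null key");
        }
    }
}
```

If that reading is right, submitting to an explicitly named queue that exists in the Capacity Scheduler configuration (assuming the distributed shell client accepts a queue option in this version) should avoid the null lookup.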


Re: Read key and values from HDFS

2012-04-01 Thread Harsh J
Hi,

You can view an archive here: http://search-hadoop.com/m/5ltog03fnd2

On Sun, Apr 1, 2012 at 7:45 PM, Pedro Costa  wrote:
> Has anyone answered this question? I can't find a reply.
>
> -- Forwarded message --
> From: Pedro Costa 
> Date: 30 March 2012 18:19
> Subject: Read key and values from HDFS
> To: mapreduce-user 
>
>
>
> The ReduceTask can save its output using several output formats:
> InternalFileOutputFormat, SequenceFileOutputFormat, TeraOutputFormat, etc.
>
> How can I read the keys and the values from the output file? Can anyone give
> me an example? Is there a way to create just one method that can read all
> the different output formats?
>
>
> Thanks,
>
> --
> Best regards,
>
>
>
>
> --
> Best regards,
>



-- 
Harsh J


Fwd: Read key and values from HDFS

2012-04-01 Thread Pedro Costa
Has anyone answered this question? I can't find a reply.

-- Forwarded message --
From: Pedro Costa 
Date: 30 March 2012 18:19
Subject: Read key and values from HDFS
To: mapreduce-user 



The ReduceTask can save its output using several output formats:
InternalFileOutputFormat, SequenceFileOutputFormat, TeraOutputFormat, etc.

How can I read the keys and the values from the output file? Can anyone
give me an example? Is there a way to create just one method that can read
all the different output formats?
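There is no single built-in Hadoop reader that handles every output format; the usual approach is to open each part file with the reader matching the format it was written in (for example, SequenceFile.Reader with next(key, value) for SequenceFileOutputFormat). The unification the question asks about can still be done by hiding each format-specific reader behind one common interface. A self-contained sketch of that pattern in plain Java (the interface and reader classes are hypothetical names for illustration, not Hadoop classes):

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Common contract: every format-specific reader yields key/value pairs.
interface KeyValueReader {
    List<Map.Entry<String, String>> readAll(String data);
}

// Reader for a text format: one "key<TAB>value" record per line.
class TextKeyValueReader implements KeyValueReader {
    public List<Map.Entry<String, String>> readAll(String data) {
        List<Map.Entry<String, String>> out = new ArrayList<>();
        for (String line : data.split("\n")) {
            int tab = line.indexOf('\t');
            if (tab >= 0) {
                out.add(new AbstractMap.SimpleEntry<>(
                        line.substring(0, tab), line.substring(tab + 1)));
            }
        }
        return out;
    }
}

// Reader for a CSV-like format: one "key,value" record per line.
class CsvKeyValueReader implements KeyValueReader {
    public List<Map.Entry<String, String>> readAll(String data) {
        List<Map.Entry<String, String>> out = new ArrayList<>();
        for (String line : data.split("\n")) {
            int comma = line.indexOf(',');
            if (comma >= 0) {
                out.add(new AbstractMap.SimpleEntry<>(
                        line.substring(0, comma), line.substring(comma + 1)));
            }
        }
        return out;
    }
}

public class UnifiedRead {
    // One method that works for any format, given the matching reader.
    static List<Map.Entry<String, String>> read(KeyValueReader r, String data) {
        return r.readAll(data);
    }

    public static void main(String[] args) {
        System.out.println(read(new TextKeyValueReader(), "a\t1\nb\t2"));
        System.out.println(read(new CsvKeyValueReader(), "a,1\nb,2"));
    }
}
```

The same shape applies to the real formats: wrap SequenceFile.Reader, a line reader, and so on behind one interface, and pick the implementation from the job's configured output format.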


Thanks,

-- 
Best regards,




-- 
Best regards,