Any hint?
On 14 Nov 2013 11:45, "rab ra" wrote:
Thanks for the response.
I have removed the rpc.protection parameter from all of my configuration, and now I am getting the error below.
Any hint on what's going on here?
13/11/14 10:10:47 INFO mapreduce.Job: Task Id :
attempt_1384339616944_0002_m_26_0, Status : FAILED
Container launch failed
"No common protection layer between server and client" likely means the host
used for job submission does not have hadoop.rpc.protection=privacy. For QOP
to work, all client hosts (DNs and any others used to access the cluster) must
have an identical setting.
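As a minimal sketch (assuming the setting is applied through core-site.xml on every node and every client host), the entry would look like:

```xml
<!-- core-site.xml: must be identical on all nodes and client hosts -->
<property>
  <name>hadoop.rpc.protection</name>
  <!-- valid values: authentication, integrity, privacy -->
  <value>privacy</value>
</property>
```

After changing it, the affected daemons need a restart for the new QOP to take effect.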
A few quick questions: I'm assuming
Hello,
I am facing a problem using the Hadoop RPC encryption-in-transit feature
in Hadoop 2.2.0. I have a 3-node cluster.
Services running on node 1 (master):
ResourceManager
NameNode
DataNode
SecondaryNameNode
Services running on the slaves (nodes 2 & 3):
NodeManager
I am trying to make data trans