Re: How to solve a DisallowedDatanodeException?

2011-10-07 Thread Raimon Bosch
Definitely an Amazon problem. They were assigning a new internal IP, but some
of the nodes were still using the old one. I had to force redirects from the
old DNS names to the correct IPs in /etc/hosts on every node:

[NEW_IP]  ip-[OLD_IP].eu-west-1.compute.internal
[NEW_IP]  ip-[OLD_IP]
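
For example, with a hypothetical new address (the hostname matches the one in
the original error further down this thread; 10.48.33.21 is made up purely for
illustration), the entries look like this:

# /etc/hosts - illustrative values only
10.48.33.21  ip-10-235-57-112.eu-west-1.compute.internal
10.48.33.21  ip-10-235-57-112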



Re: How to solve a DisallowedDatanodeException?

2011-10-07 Thread Raimon Bosch
In the internal DNS entries, sorry...



Re: How to solve a DisallowedDatanodeException?

2011-10-07 Thread Raimon Bosch
My dfs.hosts list was correct on all the servers. In this case the problem was
with Amazon's internal DNS, and I had to restart all my nodes to get rid of it.

After some changes on my cluster (renaming nodes), some nodes had automatically
changed their IPs, and I had to restart them to force the internal IPs to
change as well.
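
If it helps anyone hitting the same thing, a quick sanity check to run on each
datanode (just a sketch, assuming standard Linux tools are available on the
instance) is to compare the hostname the node reports with what it currently
resolves to:

hostname -f                       # internal hostname the datanode reports
getent hosts "$(hostname -f)"     # IP that name resolves to right now
grep "$(hostname -f)" /etc/hosts  # any manual override already in place

If the resolved IP does not match the instance's current internal IP, that is
likely the node the namenode is rejecting.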



Re: How to solve a DisallowedDatanodeException?

2011-10-07 Thread Eric Fiala
Raimon - the error
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
Datanode denied communication with namenode

This usually indicates that the datanode trying to connect to the namenode is
either:

   - listed in the file defined by dfs.hosts.exclude (explicitly excluded), or
   - not listed in the file defined by dfs.hosts (explicitly included) when
     that file is in use

Make sure the datanode is not listed in the excludes file; if you are using
dfs.hosts, add the node to the includes file, then run hadoop dfsadmin -refreshNodes.
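
For reference, a minimal sketch of how that can look (the file paths here are
just examples; dfs.hosts and dfs.hosts.exclude point at whatever files you
have configured):

<!-- hdfs-site.xml on the namenode; example paths -->
<property>
  <name>dfs.hosts</name>
  <value>/etc/hadoop/conf/dfs.include</value>
</property>
<property>
  <name>dfs.hosts.exclude</name>
  <value>/etc/hadoop/conf/dfs.exclude</value>
</property>

# /etc/hadoop/conf/dfs.include - one datanode hostname per line, as the
# namenode resolves it
ip-10-235-57-112.eu-west-1.compute.internal

# tell the namenode to re-read the include/exclude files
hadoop dfsadmin -refreshNodes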

You should not have to remove any data on local disk to solve this problem.

HTH

EF




-- 
*Eric Fiala*
*Fiala Consulting*
T: 403.828.1117
E: e...@fiala.ca
http://www.fiala.ca


How to solve a DisallowedDatanodeException?

2011-10-07 Thread Raimon Bosch
Hi,

I'm running a cluster on Amazon and sometimes I get this exception:

2011-10-07 10:36:28,014 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
org.apache.hadoop.ipc.RemoteException:
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException:
Datanode denied communication with namenode:
ip-10-235-57-112.eu-west-1.compute.internal:50010
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:2042)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.register(NameNode.java:687)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy4.register(Unknown Source)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.register(DataNode.java:531)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.runDatanodeDaemon(DataNode.java:1208)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1247)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1368)

Since I started getting this exception I haven't been able to run any datanode.
I have checked all the connections between the nodes and they are OK, and I
have also tried formatting the namenode, but the problem remains.

Do I need to remove the datanode's stored information? rm -rf
${HOME}/dfs-xvdh/dn

I would prefer a solution that doesn't involve formatting or erasing
anything...


Regards,
Raimon Bosch.