[ https://issues.apache.org/jira/browse/HADOOP-7503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13216286#comment-13216286 ]

Sho Shimauchi commented on HADOOP-7503:
---------------------------------------

I've talked with Shingo Furuyama and he has taken over this JIRA.
Could someone assign it to him?
                
> Client#getRemotePrincipal NPEs when given invalid dfs.*.name
> ------------------------------------------------------------
>
>                 Key: HADOOP-7503
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7503
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: ipc, security
>    Affects Versions: 0.20.203.0, 0.23.0
>            Reporter: Eli Collins
>            Assignee: Sho Shimauchi
>              Labels: newbie
>
> The following code in Client#getRemotePrincipal NPEs if security is enabled 
> and dfs.https.address, dfs.secondary.http.address, 
> dfs.secondary.https.address, or fs.default.name has an invalid value (e.g. 
> hdfs://foo.bar.com.foo.bar.com:1000). We should check address.getAddress() 
> for null (or check this earlier) and give a more helpful error message.
> {noformat}
>   return SecurityUtil.getServerPrincipal(conf.get(serverKey), address
>     .getAddress().getCanonicalHostName());
> {noformat}
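
As a rough sketch of the null check the description asks for (assuming `address` is the InetSocketAddress already available in Client#getRemotePrincipal, and that the enclosing method can throw IOException; the error message wording is only an example):

{noformat}
// InetSocketAddress#getAddress() returns null when the host part of the
// configured value (e.g. hdfs://foo.bar.com.foo.bar.com:1000) cannot be
// resolved, which is what currently leads to the NPE below.
if (address.getAddress() == null) {
  throw new IOException("Unresolvable address configured for " + serverKey
      + ": " + conf.get(serverKey));
}
return SecurityUtil.getServerPrincipal(conf.get(serverKey),
    address.getAddress().getCanonicalHostName());
{noformat}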

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
