[jira] [Updated] (HADOOP-16457) Hadoop does not work without Kerberos for simple security

2019-07-24  Eric Yang (JIRA)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16457?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Yang updated HADOOP-16457:
---
Description: 
When the HTTP filter initializers are set up to use StaticUserWebFilter, AuthFilter 
is still set up.  This prevents the datanode from talking to the namenode.
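
For context, a rough sketch of the simple-security setup described above (illustrative only; the class name is made up, and in a real cluster these properties would live in core-site.xml rather than in code):

{code}
import org.apache.hadoop.conf.Configuration;

// Programmatic equivalent of the assumed core-site.xml settings:
// simple authentication plus a static web user filter, no Kerberos.
public class SimpleSecuritySetup {
  public static void main(String[] args) {
    Configuration conf = new Configuration(false);
    conf.set("hadoop.security.authentication", "simple");
    conf.set("hadoop.http.filter.initializers",
        "org.apache.hadoop.http.lib.StaticUserWebFilter");
    System.out.println("Filter initializers set : "
        + conf.get("hadoop.http.filter.initializers"));
  }
}
{code}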

Error message in namenode logs:
{code}
2019-07-24 15:47:38,038 INFO org.apache.hadoop.hdfs.DFSUtil: Filter 
initializers set : 
org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
2019-07-24 16:06:26,212 WARN 
SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager:
 Authorization failed for hdfs (auth:SIMPLE) for protocol=interface 
org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol: this service is only 
accessible by dn/eyang-5.openstacklo...@example.com
{code}

Errors in datanode log:
{code}
2019-07-24 16:07:01,253 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: 
Problem connecting to server: eyang-1.openstacklocal/172.26.111.17:9000
{code}

The logic in HADOOP-16354 always adds AuthFilter regardless of whether security is 
enabled.  This is incorrect.  When simple security is chosen and StaticUserWebFilter 
is used, the AuthFilter check should not be required for the datanode to 
communicate with the namenode.
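
A minimal sketch of the behaviour argued for above (not the actual DFSUtil/HADOOP-16354 code; the helper class and method names are made up): append AuthFilterInitializer only when Kerberos security is actually enabled, and otherwise keep exactly the filter initializers the user configured.

{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class AuthFilterSelection {
  static final String FILTER_INITIALIZERS_KEY = "hadoop.http.filter.initializers";
  static final String AUTH_FILTER_INITIALIZER =
      "org.apache.hadoop.hdfs.web.AuthFilterInitializer";

  // Hypothetical helper: decide which filter initializers the NameNode HTTP
  // server should load.  Not the real DFSUtil logic, just the proposed shape.
  static String chooseFilterInitializers(Configuration conf) {
    String configured = conf.get(FILTER_INITIALIZERS_KEY,
        "org.apache.hadoop.http.lib.StaticUserWebFilter");
    if (!UserGroupInformation.isSecurityEnabled()) {
      // auth:SIMPLE - there is no Kerberos handshake for AuthFilter to check,
      // so keep only what the user configured (e.g. StaticUserWebFilter).
      return configured;
    }
    // Kerberos is enabled: appending AuthFilterInitializer, as HADOOP-16354
    // does, is the intended behaviour.
    if (!configured.contains(AUTH_FILTER_INITIALIZER)) {
      configured = configured + "," + AUTH_FILTER_INITIALIZER;
    }
    return configured;
  }
}
{code}

With a guard along these lines, a simple-security cluster would load only the user-configured StaticUserWebFilter, which is the behaviour the description asks for.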

  was:
When the HTTP filter initializers are set up to use StaticUserWebFilter, AuthFilter 
is still set up.  This prevents the datanode from talking to the namenode.

Error message in namenode logs:
{code}
2019-07-24 15:47:38,038 INFO org.apache.hadoop.hdfs.DFSUtil: Filter 
initializers set : 
org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
2019-07-24 16:06:26,212 WARN 
SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager:
 Authorization failed for hdfs (auth:SIMPLE) for protocol=interface 
org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol: this service is only 
accessible by dn/eyang-5.openstacklo...@example.com
{code}

Errors in datanode log:
{code}
2019-07-24 16:07:01,253 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: 
Problem connecting to server: eyang-1.openstacklocal/172.26.111.17:9000
{code}

The logic in HADOOP-16354 always added AuthFilter regardless of which HTTP filter 
initializer is chosen.  This is wrong.


> Hadoop does not work without Kerberos for simple security
> ---------------------------------------------------------
>
> Key: HADOOP-16457
> URL: https://issues.apache.org/jira/browse/HADOOP-16457
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Eric Yang
>Assignee: Prabhu Joseph
>Priority: Major
>
> When the HTTP filter initializers are set up to use StaticUserWebFilter, AuthFilter 
> is still set up.  This prevents the datanode from talking to the namenode.
> Error message in namenode logs:
> {code}
> 2019-07-24 15:47:38,038 INFO org.apache.hadoop.hdfs.DFSUtil: Filter 
> initializers set : 
> org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
> 2019-07-24 16:06:26,212 WARN 
> SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager:
>  Authorization failed for hdfs (auth:SIMPLE) for protocol=interface 
> org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol: this service is only 
> accessible by dn/eyang-5.openstacklo...@example.com
> {code}
> Errors in datanode log:
> {code}
> 2019-07-24 16:07:01,253 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: 
> Problem connecting to server: eyang-1.openstacklocal/172.26.111.17:9000
> {code}
> The logic in HADOOP-16354 always adds AuthFilter regardless of whether security is 
> enabled.  This is incorrect.  When simple security is chosen and StaticUserWebFilter 
> is used, the AuthFilter check should not be required for the datanode 
> to communicate with the namenode.






[jira] [Updated] (HADOOP-16457) Hadoop does not work without Kerberos for simple security

2019-07-24  Eric Yang (JIRA)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16457?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Yang updated HADOOP-16457:
---
Priority: Minor  (was: Major)

> Hadoop does not work without Kerberos for simple security
> ---------------------------------------------------------
>
> Key: HADOOP-16457
> URL: https://issues.apache.org/jira/browse/HADOOP-16457
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Eric Yang
>Priority: Minor
>


