[ https://issues.apache.org/jira/browse/HDFS-4541?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13590926#comment-13590926 ]

Arpit Gupta commented on HDFS-4541:
-----------------------------------

No tests were added since this is a shell-script change. I manually verified that the 
secure datanode logs are written to the appropriate directory.

The test failure is unrelated.
                
> set hadoop.log.dir and hadoop.id.str when starting secure datanode so it 
> writes the logs to the correct dir by default
> ----------------------------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-4541
>                 URL: https://issues.apache.org/jira/browse/HDFS-4541
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: datanode, security
>    Affects Versions: 2.0.3-alpha
>            Reporter: Arpit Gupta
>            Assignee: Arpit Gupta
>         Attachments: HDFS-4541.patch, HDFS-4541.patch
>
>
> Currently, in hadoop-config.sh we set the following:
> {code}
> HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
> HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_IDENT_STRING"
> {code}
> However, when this file is sourced we don't know whether we are starting a 
> secure datanode.
> In the hdfs script, once we determine that we are starting a secure datanode, 
> we should also update HADOOP_OPTS.
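
Below is a minimal sketch of the kind of change the description suggests for the hdfs launcher script, not the confirmed contents of the attached patch; the HADOOP_SECURE_DN_USER and HADOOP_SECURE_DN_LOG_DIR variables are assumptions borrowed from the surrounding startup-script conventions.

{code}
# Sketch only: once the launcher knows it is starting a secure (jsvc-based)
# datanode, append the log dir and ident string to HADOOP_OPTS so the daemon
# logs land in the secure log directory instead of the default.
if [ "$COMMAND" = "datanode" ] && [ "$EUID" -eq 0 ] && [ -n "$HADOOP_SECURE_DN_USER" ]; then
  # HADOOP_SECURE_DN_LOG_DIR is assumed to be set by hadoop-env.sh; fall back
  # to the regular HADOOP_LOG_DIR when it is not.
  HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=${HADOOP_SECURE_DN_LOG_DIR:-$HADOOP_LOG_DIR}"
  HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_SECURE_DN_USER"
fi
{code}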

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
