[ https://issues.apache.org/jira/browse/HDFS-4541?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13590798#comment-13590798 ]

Chris Nauroth commented on HDFS-4541:
-------------------------------------

Hi, Arpit.  With this patch, does a secure data node end up with 2 occurrences 
of -Dhadoop.log.dir and -Dhadoop.id.str on the command line?  It appears that 
there would be one occurrence generated in the hadoop-config.sh script and then 
a second occurrence appended in the hdfs script (with different values).  This 
could be confusing for an operator looking at the process table.
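The duplication described above can be illustrated with a short sketch (the paths and ident strings here are made up for illustration; the real values come from hadoop-env.sh):

```shell
# First occurrence, as set in hadoop-config.sh (hypothetical values):
HADOOP_OPTS="-Dhadoop.log.dir=/var/log/hadoop -Dhadoop.id.str=hadoop"
# Second occurrence, appended later by the hdfs script with different values:
HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=/var/log/hadoop-secure -Dhadoop.id.str=hdfs"

# Count how many times -Dhadoop.log.dir now appears on the command line.
# The JVM honors the last occurrence of a repeated -D property, but an
# operator reading the process table sees both.
occurrences=$(printf '%s\n' $HADOOP_OPTS | grep -c '^-Dhadoop\.log\.dir')
echo "$occurrences"
```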

Thanks!

                
> set hadoop.log.dir and hadoop.id.str when starting secure datanode so it 
> writes the logs to the correct dir by default
> ----------------------------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-4541
>                 URL: https://issues.apache.org/jira/browse/HDFS-4541
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: datanode, security
>    Affects Versions: 2.0.3-alpha
>            Reporter: Arpit Gupta
>            Assignee: Arpit Gupta
>         Attachments: HDFS-4541.patch, HDFS-4541.patch
>
>
> currently in hadoop-config.sh we set the following
> {code}
> HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
> HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_IDENT_STRING"
> {code}
> However, when this file is sourced, we don't know whether we are starting a 
> secure datanode.
> In the hdfs script, once we determine that we are starting a secure datanode, 
> we should also update HADOOP_OPTS accordingly.
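A minimal sketch of the proposed change in the hdfs script, assuming the variable names conventionally set in hadoop-env.sh (HADOOP_SECURE_DN_USER, HADOOP_SECURE_DN_LOG_DIR); the concrete values below are hypothetical, and the actual secure-datanode detection in the script involves additional checks:

```shell
# Hypothetical environment, normally provided by hadoop-env.sh:
COMMAND="datanode"
HADOOP_SECURE_DN_USER="hdfs"
HADOOP_LOG_DIR="/var/log/hadoop"
HADOOP_SECURE_DN_LOG_DIR="/var/log/hadoop-secure"
HADOOP_OPTS=""

# When starting a secure datanode, set the log dir and ident string here
# rather than relying on the values baked in by hadoop-config.sh.
if [ "$COMMAND" = "datanode" ] && [ -n "$HADOOP_SECURE_DN_USER" ]; then
  # Prefer the secure log dir when set, falling back to the common one.
  HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=${HADOOP_SECURE_DN_LOG_DIR:-$HADOOP_LOG_DIR}"
  HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.id.str=$HADOOP_SECURE_DN_USER"
fi
echo "$HADOOP_OPTS"
```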

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
