Neilxzn opened a new pull request #2768:
URL: https://github.com/apache/hadoop/pull/2768


## JIRA
https://issues.apache.org/jira/browse/HDFS-15886
   
We use protected directories to ensure that important data directories cannot be deleted by mistake, but protected directories can currently only be configured in hdfs-site.xml.

For ease of management, this patch adds a way to load the list of protected directories from a separate configuration file.
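
For illustration only, below is a minimal sketch of how such a mixed `fs.protected.directories` value could be expanded: entries that look like local `file://` URIs are read line by line, and everything else is treated as an HDFS path directly. The class and method names are hypothetical and are not taken from this patch.

```
import java.io.IOException;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collection;
import java.util.LinkedHashSet;

/** Illustrative helper, not part of the patch. */
public class ProtectedDirsLoader {

  /** Expand a comma-separated fs.protected.directories value. */
  public static Collection<String> expand(String confValue) throws IOException {
    Collection<String> dirs = new LinkedHashSet<>();
    for (String entry : confValue.split(",")) {
      String trimmed = entry.trim();
      if (trimmed.isEmpty()) {
        continue;
      }
      if (trimmed.startsWith("file://")) {
        // The entry points at a local file: read one protected path per line.
        for (String line : Files.readAllLines(Paths.get(URI.create(trimmed)))) {
          String path = line.trim();
          if (!path.isEmpty()) {
            dirs.add(path);
          }
        }
      } else {
        // Plain HDFS path configured inline in hdfs-site.xml.
        dirs.add(trimmed);
      }
    }
    return dirs;
  }
}
```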
   
How to use:
   
1. Set the configuration in hdfs-site.xml:
   
```
<property>
  <name>fs.protected.directories</name>
  <value>/hdfs/path/1,/hdfs/path/2,file:///path/to/protected.dirs.config</value>
</property>
```
   
2. Add the protected directories, one per line, to the configuration file (file:///path/to/protected.dirs.config):
   
```
/hdfs/path/4
/hdfs/path/5
```
   
3. Run the following command to refresh fs.protected.directories, instead of calling FSDirectory.setProtectedDirectories(..):
   
```
hdfs dfsadmin -refreshProtectedDirectories
```
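
As a quick sanity check (not part of the patch), once the refresh has taken effect, deleting a non-empty protected directory should be rejected; the exact error text may differ between versions:

```
# put something under the newly protected directory, then try to delete it
hdfs dfs -rm -r /hdfs/path/4
# expected: the delete fails because /hdfs/path/4 is a non-empty protected directory
```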
   
    
   

