[ https://issues.apache.org/jira/browse/HADOOP-8973?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13598122#comment-13598122 ]

Ivan Mitic commented on HADOOP-8973:
------------------------------------

{quote}
Ivan, thanks for the help tracking this down. Very interesting find. I have 
confirmed that the existing tests fail when running as root on Linux, just like 
what you're experiencing when running as admin on Windows. It's unrelated to 
this patch. This might explain a few complaints I've seen on the mailing lists 
from new engineers trying to contribute to Hadoop, only to find some unit tests 
failing immediately. I'll try to track down those threads and help them out.

I agree with sticking to the File APIs, because they give us the "absolute 
truth" as you said. If the user is root/admin, then the disk is usable. 
{quote}

Thanks Chris for checking this out. Given that the behavior is symmetric with 
Unix, I am +1 on the current patch. Maybe just add a comment to the test saying 
that it is expected to fail if run as root on Unix or in an elevated context on 
Windows?

{quote}
We're just left with a test issue. We could change the tests to detect 
root/admin at runtime and then either skip the invalid tests with assumeTrue or 
run different assertions. On Linux, we could check the username against "root". 
On Windows, would it be sufficient to check that the user is "Administrator", or is 
it more complex than that?
{quote}
For Windows, it won't be as simple as a string compare. Even checking whether 
the user is a member of the Administrators group is not the right check. We have 
to check whether the process is actually running elevated. This could be done in 
native code or via a cmd script as described 
[here|http://stackoverflow.com/questions/7985755/how-to-detect-if-cmd-is-running-as-administrator-has-elevated-privileges].
 We could expose something like TestUtils#IsRunningElevated and disable/relax 
the test in that case, but this would be a separate Jira.
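
Just to make the idea concrete, here is a rough sketch of what such a helper 
could look like. The class and method names (TestUtils, isRunningAsRoot, 
isRunningElevated) are hypothetical, and the "net session" probe is only one 
common trick for detecting an elevated token (it exits non-zero in a 
non-elevated shell); the real helper might well live in native code instead:

{code:java}
// Hypothetical sketch only -- not part of the attached patches.
package org.apache.hadoop.test;

public final class TestUtils {

  private TestUtils() {}

  private static boolean isWindows() {
    return System.getProperty("os.name").toLowerCase().contains("windows");
  }

  /** On Unix, a simple username check against "root" is enough. */
  public static boolean isRunningAsRoot() {
    return !isWindows() && "root".equals(System.getProperty("user.name"));
  }

  /**
   * On Windows, group membership is not enough; instead, probe for an
   * elevated token by running a command that requires elevation.
   * "net session" exits non-zero in a non-elevated shell.
   */
  public static boolean isRunningElevated() {
    if (!isWindows()) {
      return false;
    }
    try {
      Process p = new ProcessBuilder("cmd", "/c", "net", "session")
          .redirectErrorStream(true)
          .start();
      return p.waitFor() == 0;
    } catch (Exception e) {
      return false;
    }
  }
}
{code}

A test could then guard its negative assertions with something like 
Assume.assumeTrue(!TestUtils.isRunningAsRoot() && 
!TestUtils.isRunningElevated()) so that it is skipped rather than failed in 
privileged contexts.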

bq. What is the issue?
Bikas, the issue is that the test fails when executed in an elevated context 
on Windows, or as root on Linux. 
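
To illustrate the failure mode, here is a standalone sketch (not code from the 
patch; the path and class name are made up): the File permission APIs report a 
fully locked-down directory as accessible when the JVM runs as root, so the 
negative assertions in the test never see the expected failure.

{code:java}
// Standalone illustration -- run once as a normal user, once as root.
import java.io.File;

public class RootBypassDemo {
  public static void main(String[] args) {
    File dir = new File("/tmp/no-access");  // hypothetical path
    dir.mkdirs();
    // Strip all permissions, as the negative DiskChecker tests do.
    dir.setReadable(false, false);
    dir.setWritable(false, false);
    dir.setExecutable(false, false);
    // As a normal user these all print false; as root they typically
    // print true, because root bypasses the permission checks, so a
    // test asserting that checkDir throws never gets its exception.
    System.out.println(dir.canRead());
    System.out.println(dir.canWrite());
    System.out.println(dir.canExecute());
  }
}
{code}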
                
> DiskChecker cannot reliably detect an inaccessible disk on Windows with NTFS 
> ACLs
> ---------------------------------------------------------------------------------
>
>                 Key: HADOOP-8973
>                 URL: https://issues.apache.org/jira/browse/HADOOP-8973
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: util
>    Affects Versions: 3.0.0, 1-win
>            Reporter: Chris Nauroth
>            Assignee: Chris Nauroth
>         Attachments: DiskChecker.proto.patch, HADOOP-8973.3.patch, 
> HADOOP-8973-branch-1-win.3.patch, HADOOP-8973-branch-trunk-win.2.patch, 
> HADOOP-8973-branch-trunk-win.patch
>
>
> DiskChecker.checkDir uses File.canRead, File.canWrite, and File.canExecute to 
> check if a directory is inaccessible.  These APIs are not reliable on Windows 
> with NTFS ACLs due to a known JVM bug.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
