[ 
https://issues.apache.org/jira/browse/HDFS-15128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17023434#comment-17023434
 ] 

Hudson commented on HDFS-15128:
-------------------------------

SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #17899 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/17899/])
HDFS-15128. Unit test failing to clean testing data and crashed future 
(ayushsaxena: rev 6d008c0d39185f18dbec4676f4d0e7ef77104eb7)
* (edit) 
hadoop-hdfs-project/hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/server/datanode/TestDataNodeVolumeFailureToleration.java


> Unit test failing to clean testing data and crashed future Maven test run due 
> to failure in TestDataNodeVolumeFailureToleration
> -------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-15128
>                 URL: https://issues.apache.org/jira/browse/HDFS-15128
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: hdfs, test
>    Affects Versions: 3.2.1
>            Reporter: Ctest
>            Assignee: Ctest
>            Priority: Critical
>              Labels: easyfix, patch, test
>             Fix For: 3.3.0
>
>         Attachments: HDFS-15128-000.patch, HDFS-15128-001.patch
>
>
> The actively-used test helper function `testVolumeConfig` in 
> `org.apache.hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureToleration` 
> chmods a directory to the invalid perm 000 for testing purposes, but fails to 
> chmod the directory back to a valid perm if the assertion inside the function 
> fails. Any subsequent `mvn test` run then fails, because Maven cannot clean 
> the temporarily-generated directory that still has perm 000. See the buggy 
> code snippet below.
> {code:java}
> try {
>   for (int i = 0; i < volumesFailed; i++) {
>     prepareDirToFail(dirs[i]); // this will chmod dirs[i] to perm 000
>   }
>   restartDatanodes(volumesTolerated, manageDfsDirs);
> } catch (DiskErrorException e) {
>   ...
> } finally {
>   ...
>   assertEquals(expectedBPServiceState, bpServiceState);
> 
>   for (File dir : dirs) {
>     FileUtil.chmod(dir.toString(), "755");
>   }
> }
> {code}
> A failure of the statement `assertEquals(expectedBPServiceState, 
> bpServiceState)` causes the function to terminate without executing 
> `FileUtil.chmod(dir.toString(), "755")` for the temporary directories the 
> test created with the invalid perm 000. 
>  
> *Consequence*
> Any subsequent `mvn test` command fails to run if this test has failed 
> before, because Maven cannot clean the temporarily-generated directory it no 
> longer has permission to delete. The failure details are shown below:
> {noformat}
> [INFO] --- maven-antrun-plugin:1.7:run (create-log-dir) @ hadoop-hdfs ---
> [INFO] Executing tasks
>  
> main:
> [delete] Deleting directory 
> /home/ctest/app/Ctest-Hadoop/hadoop-hdfs-project/hadoop-hdfs/target/test/data
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time:  8.349 s
> [INFO] Finished at: 2019-12-27T03:53:04-06:00
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute 
> goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-log-dir) on 
> project hadoop-hdfs: An Ant BuildException has occured: Unable to delete 
> directory 
> /home/ctest/app/Ctest-Hadoop/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data/data1/current
> [ERROR] around Ant part ...<delete 
> dir="/home/ctest/app/Ctest-Hadoop/hadoop-hdfs-project/hadoop-hdfs/target/test/data"/>...
>  @ 4:105 in 
> /home/ctest/app/Ctest-Hadoop/hadoop-hdfs-project/hadoop-hdfs/target/antrun/build-main.xml
> [ERROR] -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException{noformat}
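> 
> As a one-off recovery when this has already happened, restoring the 
> permissions under `target/test/data` lets Maven clean it again. A minimal 
> sketch using Hadoop's `FileUtil.chmod` (the `RestorePerms` class name and the 
> relative path are illustrative assumptions, not part of any patch):
> {code:java}
> import org.apache.hadoop.fs.FileUtil;
> 
> // Hypothetical one-off helper: restore perms so `mvn test` can clean up again.
> public class RestorePerms {
>   public static void main(String[] args) throws Exception {
>     // Relative path inside the source tree; adjust to your checkout.
>     String testData = "hadoop-hdfs-project/hadoop-hdfs/target/test/data";
>     FileUtil.chmod(testData, "755", true); // recursive chmod back to 755
>   }
> }
> {code}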
>  
> *Root Cause*
> The test helper function 
> `org.apache.hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureToleration#testVolumeConfig`
>  purposely sets the directory 
> `/home/ctest/app/Ctest-Hadoop/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data/data1/current`
>  to perm 000 and, at the end of the function, changes it back to 755. 
> However, an assertion runs before the perm can be changed back. Once this 
> assertion fails, the function terminates before the directory's perm is 
> restored to 755, so Maven later cannot remove this directory when executing 
> `mvn test`. 
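> 
> For illustration only (this is not the HDFS test code), a self-contained 
> sketch of the mechanism: an `AssertionError` thrown inside a finally block 
> propagates immediately, so any statement written after it in that block never 
> executes.
> {code:java}
> // Standalone illustration: a failed assertion inside a finally block aborts
> // the rest of that block, so cleanup placed after it is skipped.
> public class FinallyCleanupSkipped {
>   public static void main(String[] args) {
>     boolean expectedBPServiceState = true;  // hypothetical expected value
>     boolean bpServiceState = false;         // hypothetical actual value
>     try {
>       try {
>         // test body would run here
>       } finally {
>         if (expectedBPServiceState != bpServiceState) {
>           // stands in for assertEquals(expectedBPServiceState, bpServiceState)
>           throw new AssertionError("expected " + expectedBPServiceState
>               + " but was " + bpServiceState);
>         }
>         System.out.println("cleanup (chmod back to 755) executed");
>       }
>     } catch (AssertionError e) {
>       System.out.println("cleanup was skipped: " + e.getMessage());
>     }
>   }
> }
> {code}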
>  
> *Fix*
> In 
> `org.apache.hadoop.hdfs.server.datanode.TestDataNodeVolumeFailureToleration#testVolumeConfig`,
>  move the assertion `assertEquals(expectedBPServiceState, bpServiceState)` to 
> the last line of the function, after the chmod cleanup loop. This resolves 
> the bug without changing the test outcome. 
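> 
> One plausible shape of that reordering (the elided lines from the snippet 
> above are kept as `...`; the committed patch may differ in detail):
> {code:java}
> try {
>   for (int i = 0; i < volumesFailed; i++) {
>     prepareDirToFail(dirs[i]);
>   }
>   restartDatanodes(volumesTolerated, manageDfsDirs);
> } catch (DiskErrorException e) {
>   ...
> } finally {
>   ...
>   // restore permissions first so Maven can always clean the test data dirs
>   for (File dir : dirs) {
>     FileUtil.chmod(dir.toString(), "755");
>   }
>   // assertion moved to the last line of the function
>   assertEquals(expectedBPServiceState, bpServiceState);
> }
> {code}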



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
