[ 
https://issues.apache.org/jira/browse/HDFS-16103?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17372611#comment-17372611
 ] 

Steve Loughran commented on HDFS-16103:
---------------------------------------

Looks like something is very odd with your local filesystem. All the tests 
which lock down local FS permissions and then try to read/write/delete, 
expecting an error, are failing because the FS operation *worked*.

# what OS are you running on?
# what filesystem contains the hadoop source?
# how much space is there?
# are you running these tests as a superuser? that could explain the problem

Unless you can find similar error messages in other JIRAs, you are going to 
have to debug it yourself. Add the project to an IDE, set a breakpoint at the 
test case which fails, and step through to see where things diverge.
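As a quick sanity check before firing up an IDE: this little sketch (my own, not part of the Hadoop test suite) reproduces what those tests do — lock a directory down with mode 000 and see whether the OS still lets you list it. Run it as the same user that ran {{mvn test}}. If it reports the directory is still readable, the build user is root (or the filesystem ignores POSIX permission bits, as some NFS/overlay mounts do), and every "expected an IOException" failure above follows from that.

```shell
#!/bin/sh
# Create a directory, strip all permission bits, then try to list it.
# A normal user should get "Permission denied"; root bypasses the check.
tmpdir=$(mktemp -d)
mkdir "$tmpdir/locked"
chmod 000 "$tmpdir/locked"

if ls "$tmpdir/locked" >/dev/null 2>&1; then
  echo "mode-000 dir is still readable: permission-based tests WILL fail"
else
  echo "mode-000 dir is unreadable: permissions are enforced as expected"
fi

# Restore permissions so cleanup works, then tidy up.
chmod 755 "$tmpdir/locked"
rm -rf "$tmpdir"
```

If it turns out you are building as root, rerunning the tests as an unprivileged user (e.g. {{sudo -u someuser mvn test}}) is the usual fix.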

> mvn test failed about hadoop@3.2.1
> ----------------------------------
>
>                 Key: HDFS-16103
>                 URL: https://issues.apache.org/jira/browse/HDFS-16103
>             Project: Hadoop HDFS
>          Issue Type: Bug
>    Affects Versions: 3.2.1
>            Reporter: shixijun
>            Priority: Major
>
> {panel:title=mvn test failed about hadoop@3.2.1}
> mvn test failed
> {panel}
> [root@localhost spack-src]# mvn -version
> Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
> Maven home: 
> /home/all_spack_env/spack/opt/spack/linux-centos8-aarch64/gcc-8.4.1/maven-3.6.3-fpgpwvz7es5yiaz2tez2pnlilrcatuvg
> Java version: 1.8.0_191, vendor: AdoptOpenJdk, runtime: 
> /home/all_spack_env/spack/opt/spack/linux-centos8-aarch64/gcc-8.4.1/openjdk-1.8.0_191-b12-fidptihybskgklbjoo4lagkacm6n6lod/jre
> Default locale: en_US, platform encoding: ANSI_X3.4-1968
> OS name: "linux", version: "4.18.0-80.el8.aarch64", arch: "aarch64", family: 
> "unix"
> [root@localhost spack-src]# java -version
> openjdk version "1.8.0_191"
> OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_191-b12)
> OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.191-b12, mixed mode)
> [root@localhost spack-src]# mvn test
> ……
> [INFO] Running org.apache.hadoop.tools.TestCommandShell
> [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.111 
> s - in org.apache.hadoop.tools.TestCommandShell
> [INFO]
> [INFO] Results:
> [INFO]
> [ERROR] Failures:
> [ERROR]   
> TestFSMainOperationsLocalFileSystem>FSMainOperationsBaseTest.testGlobStatusThrowsExceptionForUnreadableDir:643
>  Should throw IOException
> [ERROR]   
> TestFSMainOperationsLocalFileSystem>FSMainOperationsBaseTest.testListStatusThrowsExceptionForUnreadableDir:288
>  Should throw IOException
> [ERROR]   
> TestFileUtil.testFailFullyDelete:446->validateAndSetWritablePermissions:422 
> The directory xSubDir *should* not have been deleted. expected:<true> but 
> was:<false>
> [ERROR]   
> TestFileUtil.testFailFullyDeleteContents:525->validateAndSetWritablePermissions:422
>  The directory xSubDir *should* not have been deleted. expected:<true> but 
> was:<false>
> [ERROR]   TestFileUtil.testGetDU:571
> [ERROR]   TestFsShellCopy.testPutSrcDirNoPerm:627->shellRun:80 expected:<1> 
> but was:<0>
> [ERROR]   TestFsShellCopy.testPutSrcFileNoPerm:652->shellRun:80 expected:<1> 
> but was:<0>
> [ERROR]   TestLocalDirAllocator.test0:140->validateTempDirCreation:109 
> Checking for build/test/temp/RELATIVE1 in 
> build/test/temp/RELATIVE0/block995011826146306285.tmp - FAILED!
> [ERROR]   TestLocalDirAllocator.test0:140->validateTempDirCreation:109 
> Checking for 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE1
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE0/block792666236482175348.tmp
>  - FAILED!
> [ERROR]   TestLocalDirAllocator.test0:141->validateTempDirCreation:109 
> Checking for 
> file:/home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED1
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED0/block5124616846677903649.tmp
>  - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testROBufferDirAndRWBufferDir:162->validateTempDirCreation:109
>  Checking for build/test/temp/RELATIVE2 in 
> build/test/temp/RELATIVE1/block1176062344115776027.tmp - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testROBufferDirAndRWBufferDir:163->validateTempDirCreation:109
>  Checking for 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE2
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE1/block3514694215643608527.tmp
>  - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testROBufferDirAndRWBufferDir:163->validateTempDirCreation:109
>  Checking for 
> file:/home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED2
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED1/block883026101475466701.tmp
>  - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testRWBufferDirBecomesRO:219->validateTempDirCreation:109
>  Checking for build/test/temp/RELATIVE3 in 
> build/test/temp/RELATIVE4/block2198073115547564040.tmp - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testRWBufferDirBecomesRO:219->validateTempDirCreation:109
>  Checking for 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE3
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/ABSOLUTE4/block4187087898130713397.tmp
>  - FAILED!
> [ERROR]   
> TestLocalDirAllocator.testRWBufferDirBecomesRO:219->validateTempDirCreation:109
>  Checking for 
> file:/home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED3
>  in 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/build/test/temp/QUALIFIED4/block7779116721351278125.tmp
>  - FAILED!
> [ERROR]   TestLocalFileSystem.testReportChecksumFailure:390
> [ERROR]   TestPathData.testGlobThrowsExceptionForUnreadableDir:230 Should 
> throw IOException
> [ERROR]   
> TestFSMainOperationsLocalFileSystem>FSMainOperationsBaseTest.testGlobStatusThrowsExceptionForUnreadableDir:643
>  Should throw IOException
> [ERROR]   
> TestFSMainOperationsLocalFileSystem>FSMainOperationsBaseTest.testListStatusThrowsExceptionForUnreadableDir:288
>  Should throw IOException
> [ERROR]   TestSharedFileDescriptorFactory.testDirectoryFallbacks:103
> [ERROR]   TestRollingFileSystemSinkWithLocal.testFailedWrite:117 No exception 
> was generated while writing metrics even though the target directory was not 
> writable
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notListable:131->TestDiskChecker._checkDirs:164
>  checkDir success, expected failure
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:40
>  call to checkDir() succeeded.
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notReadable:121->TestDiskChecker._checkDirs:164
>  checkDir success, expected failure
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:40
>  call to checkDir() succeeded.
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notWritable:126->TestDiskChecker._checkDirs:164
>  checkDir success, expected failure
> [ERROR]   
> TestBasicDiskValidator>TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:40
>  call to checkDir() succeeded.
> [ERROR]   TestDiskChecker.testCheckDir_notListable:131->_checkDirs:164 
> checkDir success, expected failure
> [ERROR]   TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:210 
> checkDir success, expected failure
> [ERROR]   TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:164 
> checkDir success, expected failure
> [ERROR]   TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:210 
> checkDir success, expected failure
> [ERROR]   TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:164 
> checkDir success, expected failure
> [ERROR]   TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:210 
> checkDir success, expected failure
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:127 Disk check should 
> fail.
> [ERROR] Errors:
> [ERROR]   TestNativeIO.testMultiThreadedStat:249 ? Execution 
> java.lang.IllegalArgumentEx...
> [ERROR]   TestNativeIO.testStat:183->doStatTest:205 ? IllegalArgument length 
> != 10(unixS...
> [ERROR]   TestRPCCallBenchmark.testBenchmarkWithProto:30->Object.wait:-2 ?  
> test timed o...
> [ERROR]   TestShell.testEnvVarsWithInheritance:159->testEnvHelper:180 ? 
> StringIndexOutOfBounds
> [INFO]
> [ERROR] Tests run: 4192, Failures: 35, Errors: 4, Skipped: 252
> [INFO]
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Reactor Summary for Apache Hadoop Common 3.2.1:
> [INFO]
> [INFO] Apache Hadoop Common ............................... FAILURE [24:57 
> min]
> [INFO] Apache Hadoop NFS .................................. SKIPPED
> [INFO] Apache Hadoop KMS .................................. SKIPPED
> [INFO] Apache Hadoop Common Project ....................... SKIPPED
> [INFO] Apache Hadoop HDFS Client .......................... SKIPPED
> [INFO] Apache Hadoop HDFS ................................. SKIPPED
> [INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
> [INFO] Apache Hadoop HttpFS ............................... SKIPPED
> [INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
> [INFO] Apache Hadoop HDFS-RBF ............................. SKIPPED
> [INFO] Apache Hadoop HDFS Project ......................... SKIPPED
> [INFO] Apache Hadoop YARN ................................. SKIPPED
> [INFO] Apache Hadoop YARN API ............................. SKIPPED
> [INFO] Apache Hadoop YARN Common .......................... SKIPPED
> [INFO] Apache Hadoop YARN Registry ........................ SKIPPED
> [INFO] Apache Hadoop YARN Server .......................... SKIPPED
> [INFO] Apache Hadoop YARN Server Common ................... SKIPPED
> [INFO] Apache Hadoop YARN NodeManager ..................... SKIPPED
> [INFO] Apache Hadoop YARN Web Proxy ....................... SKIPPED
> [INFO] Apache Hadoop YARN ApplicationHistoryService ....... SKIPPED
> [INFO] Apache Hadoop YARN Timeline Service ................ SKIPPED
> [INFO] Apache Hadoop YARN ResourceManager ................. SKIPPED
> [INFO] Apache Hadoop YARN Server Tests .................... SKIPPED
> [INFO] Apache Hadoop YARN Client .......................... SKIPPED
> [INFO] Apache Hadoop YARN SharedCacheManager .............. SKIPPED
> [INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase Backend ... SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase Common .... SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase Client .... SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase Servers ... SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase Server 1.2  SKIPPED
> [INFO] Apache Hadoop YARN TimelineService HBase tests ..... SKIPPED
> [INFO] Apache Hadoop YARN Router .......................... SKIPPED
> [INFO] Apache Hadoop YARN Applications .................... SKIPPED
> [INFO] Apache Hadoop YARN DistributedShell ................ SKIPPED
> [INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SKIPPED
> [INFO] Apache Hadoop MapReduce Client ..................... SKIPPED
> [INFO] Apache Hadoop MapReduce Core ....................... SKIPPED
> [INFO] Apache Hadoop MapReduce Common ..................... SKIPPED
> [INFO] Apache Hadoop MapReduce Shuffle .................... SKIPPED
> [INFO] Apache Hadoop MapReduce App ........................ SKIPPED
> [INFO] Apache Hadoop MapReduce HistoryServer .............. SKIPPED
> [INFO] Apache Hadoop MapReduce JobClient .................. SKIPPED
> [INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
> [INFO] Apache Hadoop YARN Services ........................ SKIPPED
> [INFO] Apache Hadoop YARN Services Core ................... SKIPPED
> [INFO] Apache Hadoop YARN Services API .................... SKIPPED
> [INFO] Apache Hadoop Image Generation Tool ................ SKIPPED
> [INFO] Yet Another Learning Platform ...................... SKIPPED
> [INFO] Apache Hadoop YARN Site ............................ SKIPPED
> [INFO] Apache Hadoop YARN UI .............................. SKIPPED
> [INFO] Apache Hadoop YARN Project ......................... SKIPPED
> [INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SKIPPED
> [INFO] Apache Hadoop MapReduce NativeTask ................. SKIPPED
> [INFO] Apache Hadoop MapReduce Uploader ................... SKIPPED
> [INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
> [INFO] Apache Hadoop MapReduce ............................ SKIPPED
> [INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
> [INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
> [INFO] Apache Hadoop Archives ............................. SKIPPED
> [INFO] Apache Hadoop Archive Logs ......................... SKIPPED
> [INFO] Apache Hadoop Rumen ................................ SKIPPED
> [INFO] Apache Hadoop Gridmix .............................. SKIPPED
> [INFO] Apache Hadoop Data Join ............................ SKIPPED
> [INFO] Apache Hadoop Extras ............................... SKIPPED
> [INFO] Apache Hadoop Pipes ................................ SKIPPED
> [INFO] Apache Hadoop OpenStack support .................... SKIPPED
> [INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
> [INFO] Apache Hadoop Kafka Library support ................ SKIPPED
> [INFO] Apache Hadoop Azure support ........................ SKIPPED
> [INFO] Apache Hadoop Aliyun OSS support ................... SKIPPED
> [INFO] Apache Hadoop Client Aggregator .................... SKIPPED
> [INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
> [INFO] Apache Hadoop Resource Estimator Service ........... SKIPPED
> [INFO] Apache Hadoop Azure Data Lake support .............. SKIPPED
> [INFO] Apache Hadoop Tools Dist ........................... SKIPPED
> [INFO] Apache Hadoop Tools ................................ SKIPPED
> [INFO] Apache Hadoop Client API ........................... SKIPPED
> [INFO] Apache Hadoop Client Runtime ....................... SKIPPED
> [INFO] Apache Hadoop Client Packaging Invariants .......... SKIPPED
> [INFO] Apache Hadoop Client Test Minicluster .............. SKIPPED
> [INFO] Apache Hadoop Client Packaging Invariants for Test . SKIPPED
> [INFO] Apache Hadoop Client Packaging Integration Tests ... SKIPPED
> [INFO] Apache Hadoop Distribution ......................... SKIPPED
> [INFO] Apache Hadoop Client Modules ....................... SKIPPED
> [INFO] Apache Hadoop Cloud Storage ........................ SKIPPED
> [INFO] Apache Hadoop Cloud Storage Project ................ SKIPPED
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time:  24:59 min
> [INFO] Finished at: 2021-06-30T17:24:03+08:00
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M1:test (default-test) 
> on project hadoop-common: There are test failures.
> [ERROR]
> [ERROR] Please refer to 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/target/surefire-reports
>  for the individual test results.
> [ERROR] Please refer to dump files (if any exist) [date].dump, 
> [date]-jvmRun[N].dump and [date].dumpstream.
> [ERROR] -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute 
> goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M1:test 
> (default-test) on project hadoop-common: There are test failures.
> Please refer to 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/target/surefire-reports
>  for the individual test results.
> Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump 
> and [date].dumpstream.
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:215)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:156)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:148)
>     at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject 
> (LifecycleModuleBuilder.java:117)
>     at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject 
> (LifecycleModuleBuilder.java:81)
>     at 
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>  (SingleThreadedBuilder.java:56)
>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute 
> (LifecycleStarter.java:128)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced 
> (Launcher.java:282)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch 
> (Launcher.java:225)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode 
> (Launcher.java:406)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.main 
> (Launcher.java:347)
> Caused by: org.apache.maven.plugin.MojoFailureException: There are test 
> failures.
> Please refer to 
> /home/all_spack_env/spack_stage/root/spack-stage-hadoop-3.2.1-xvpobktnlicqhfzwbkriy4cick5tpsab/spack-src/hadoop-common-project/hadoop-common/target/surefire-reports
>  for the individual test results.
> Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump 
> and [date].dumpstream.
>     at org.apache.maven.plugin.surefire.SurefireHelper.throwException 
> (SurefireHelper.java:271)
>     at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution 
> (SurefireHelper.java:159)
>     at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary 
> (SurefirePlugin.java:362)
>     at 
> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>  (AbstractSurefireMojo.java:1007)
>     at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute 
> (AbstractSurefireMojo.java:837)
>     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo 
> (DefaultBuildPluginManager.java:137)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:210)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:156)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute 
> (MojoExecutor.java:148)
>     at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject 
> (LifecycleModuleBuilder.java:117)
>     at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject 
> (LifecycleModuleBuilder.java:81)
>     at 
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>  (SingleThreadedBuilder.java:56)
>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute 
> (LifecycleStarter.java:128)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke 
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke 
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:498)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced 
> (Launcher.java:282)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch 
> (Launcher.java:225)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode 
> (Launcher.java:406)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.main 
> (Launcher.java:347)
> [ERROR]
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> ----
> Can you help me find out what caused it and how to solve it?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
