[ https://issues.apache.org/jira/browse/HADOOP-15716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16614619#comment-16614619 ]

Steve Loughran commented on HADOOP-15716:
-----------------------------------------

bq. As you see in my build command I am choosing the hdds and dist profiles 
exclusively.

What happens if you start the day with a full local (tests-skipped) build?

{code}
mvn -T 1C install -DskipTests
{code}

Then build the specific profiles.
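
For example, something along the lines of the reporter's own invocation (profile
and flag names taken from the report; adjust to whatever you actually need):

{code}
mvn package -Phdds -Pdist -Dtar
{code}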

FWIW, I do that and keep tabbed terminal windows open in the different
subprojects; it stops me accidentally kicking off a full build. I am using
macOS, and I don't build the native libs unless I add the -Pnative option to a
full build.
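
If you do want the native libs, a sketch of that full build (assuming the
-Pnative profile and the native toolchain, cmake etc., are set up):

{code}
mvn install -Pnative -DskipTests
{code}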


> native library dependency on the very first build
> -------------------------------------------------
>
>                 Key: HADOOP-15716
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15716
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build, common
>    Affects Versions: 3.2.0
>         Environment: [INFO] Detecting the operating system and CPU 
> architecture
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] os.detected.name: osx
> [INFO] os.detected.arch: x86_64
> [INFO] os.detected.version: 10.13
> [INFO] os.detected.version.major: 10
> [INFO] os.detected.version.minor: 13
> [INFO] os.detected.classifier: osx-x86_64
>            Reporter: Sree Vaddi
>            Priority: Major
>
> When building hadoop (hdds specifically, but hadoop too) for the very first
> time, tests fail due to the dependency on the native lib (missing
> libhadoop.so). As a workaround, one can get past this by skipping tests. But
> it is a chicken-and-egg situation to need 'libhadoop.so' installed before
> building hadoop for the very first time.
>  
> Suggestion: add a first-time flag, or some logic to detect the situation, and
> then skip the failing tests and/or compile/install libhadoop.so before running
> them.
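>  
> A minimal sketch of that workaround (assumption: skip the tests on the very
> first pass; the command is otherwise the one shown in the log below):
> {code}
> mvn clean install -DskipTests -Phdds -Pdist -Dtar
> {code}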
>  
>  
> HW14169:hadoop svaddi$ mvn clean package install -Phdds -Pdist -Dtar
> [INFO] Running org.apache.hadoop.util.TestTime
> [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 s 
> - in org.apache.hadoop.util.TestTime
> [INFO] Running org.apache.hadoop.util.TestNativeCodeLoader
> [ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.117 
> s <<< FAILURE! - in org.apache.hadoop.util.TestNativeCodeLoader
> [ERROR] testNativeCodeLoaded(org.apache.hadoop.util.TestNativeCodeLoader)  
> Time elapsed: 0.027 s  <<< FAILURE!
> java.lang.AssertionError: TestNativeCodeLoader: libhadoop.so testing was 
> required, but libhadoop.so was not loaded.
>     at org.junit.Assert.fail(Assert.java:88)
>     at 
> org.apache.hadoop.util.TestNativeCodeLoader.testNativeCodeLoaded(TestNativeCodeLoader.java:48)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>     at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>     at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:379)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:340)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:413)
> [INFO] Running org.apache.hadoop.util.TestLightWeightCache
> [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.516 
> s - in org.apache.hadoop.util.TestLightWeightCache
> [INFO] Running org.apache.hadoop.io.compress.lz4.TestLz4CompressorDecompressor
> [WARNING] Tests run: 13, Failures: 0, Errors: 0, Skipped: 13, Time elapsed: 
> 0.128 s - in org.apache.hadoop.io.compress.lz4.TestLz4CompressorDecompressor
> [INFO] Running org.apache.hadoop.io.compress.TestCodec
> [ERROR] Tests run: 26, Failures: 1, Errors: 0, Skipped: 5, Time elapsed: 
> 55.533 s <<< FAILURE! - in org.apache.hadoop.io.compress.TestCodec
> [ERROR] 
> testCodecPoolCompressorReinit(org.apache.hadoop.io.compress.TestCodec)  Time 
> elapsed: 0.031 s  <<< FAILURE!
> java.lang.AssertionError: Compressed bytes contrary to configuration
>     at org.junit.Assert.fail(Assert.java:88)
>     at org.junit.Assert.assertTrue(Assert.java:41)
>     at 
> org.apache.hadoop.io.compress.TestCodec.gzipReinitTest(TestCodec.java:431)
>     at 
> org.apache.hadoop.io.compress.TestCodec.testCodecPoolCompressorReinit(TestCodec.java:502)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>     at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>     at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>     at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:379)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:340)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125)
>     at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:413)
> [INFO] Running 
> org.apache.hadoop.io.compress.zlib.TestZlibCompressorDecompressor
> [WARNING] Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 
> 0.293 s - in org.apache.hadoop.io.compress.zlib.TestZlibCompressorDecompressor
> [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.371 
> s - in org.apache.hadoop.ipc.TestProtoBufRpc
> [INFO] Running org.apache.hadoop.ipc.TestIPC
> [ERROR] Tests run: 39, Failures: 0, Errors: 2, Skipped: 1, Time elapsed: 
> 88.727 s <<< FAILURE! - in org.apache.hadoop.ipc.TestIPC
> [ERROR] testHttpGetResponse(org.apache.hadoop.ipc.TestIPC)  Time elapsed: 
> 0.016 s  <<< ERROR!
> java.net.SocketException: Connection reset
>     at java.net.SocketInputStream.read(SocketInputStream.java:210)
>     at java.net.SocketInputStream.read(SocketInputStream.java:141)
>     at java.net.SocketInputStream.read(SocketInputStream.java:127)
>     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:100)
>     at org.apache.hadoop.ipc.TestIPC.doIpcVersionTest(TestIPC.java:1579)
>     at org.apache.hadoop.ipc.TestIPC.testHttpGetResponse(TestIPC.java:1074)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at 
> org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
> [ERROR] testIpcFromHadoop_0_18_13(org.apache.hadoop.ipc.TestIPC)  Time 
> elapsed: 0.009 s  <<< ERROR!
> java.net.SocketException: Connection reset
>     at java.net.SocketInputStream.read(SocketInputStream.java:210)
>     at java.net.SocketInputStream.read(SocketInputStream.java:141)
>     at java.net.SocketInputStream.read(SocketInputStream.java:127)
>     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:100)
>     at org.apache.hadoop.ipc.TestIPC.doIpcVersionTest(TestIPC.java:1579)
>     at 
> org.apache.hadoop.ipc.TestIPC.testIpcFromHadoop_0_18_13(TestIPC.java:1056)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at 
> org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
> [INFO] Running org.apache.hadoop.ipc.TestRPCWaitForProxy
> [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.31 
> s - in org.apache.hadoop.ipc.TestRPCWaitForProxy
> [INFO] Running org.apache.hadoop.fs.TestRawLocalFileSystemContract
> [ERROR] Tests run: 44, Failures: 0, Errors: 1, Skipped: 18, Time elapsed: 
> 0.981 s <<< FAILURE! - in org.apache.hadoop.fs.TestRawLocalFileSystemContract
> [ERROR] testPermission(org.apache.hadoop.fs.TestRawLocalFileSystemContract)  
> Time elapsed: 0.296 s  <<< ERROR!
> java.lang.UnsatisfiedLinkError: 
> org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Ljava/lang/String;)Lorg/apache/hadoop/io/nativeio/NativeIO$POSIX$Stat;
>     at org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Native Method)
>     at org.apache.hadoop.io.nativeio.NativeIO$POSIX.getStat(NativeIO.java:451)
>     at 
> org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfoByNativeIO(RawLocalFileSystem.java:821)
>     at 
> org.apache.hadoop.fs.TestRawLocalFileSystemContract.testPermission(TestRawLocalFileSystemContract.java:112)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>     at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>     at 
> org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
> [INFO] Running org.apache.hadoop.fs.TestFsShellTouch
> [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.492 
> s - in org.apache.hadoop.fs.TestFsShellTouch
> [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.35 s 
> - in org.apache.hadoop.conf.TestCommonConfigurationFields
> [INFO] Running org.apache.hadoop.conf.TestConfigRedactor
> [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.182 
> s - in org.apache.hadoop.conf.TestConfigRedactor
> [INFO]
> [INFO] Results:
> [INFO]
> [ERROR] Failures:
> [ERROR]   TestCodec.testCodecPoolCompressorReinit:502->gzipReinitTest:431 
> Compressed bytes contrary to configuration
> [ERROR]   TestNativeCodeLoader.testNativeCodeLoaded:48 TestNativeCodeLoader: 
> libhadoop.so testing was required, but libhadoop.so was not loaded.
> [ERROR] Errors:
> [ERROR]   TestRawLocalFileSystemContract.testPermission:112 » UnsatisfiedLink 
> org.apache...
> [ERROR]   TestIPC.testHttpGetResponse:1074->doIpcVersionTest:1579 » Socket 
> Connection re...
> [ERROR]   TestIPC.testIpcFromHadoop_0_18_13:1056->doIpcVersionTest:1579 » 
> Socket Connect...
> [INFO]
> [ERROR] Tests run: 4130, Failures: 2, Errors: 3, Skipped: 358
> [INFO]
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Apache Hadoop Main 3.2.0-SNAPSHOT .................. SUCCESS [  1.428 
> s]
> [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  2.570 
> s]
> [INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.791 
> s]
> [INFO] Apache Hadoop Annotations .......................... SUCCESS [  4.855 
> s]
> [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  1.197 
> s]
> [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.706 
> s]
> [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  7.225 
> s]
> [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 12.220 
> s]
> [INFO] Apache Hadoop Auth ................................. SUCCESS [02:22 
> min]
> [INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  4.535 
> s]
> [INFO] Apache Hadoop Common ............................... FAILURE [26:23 
> min]
> [INFO] Apache Hadoop NFS .................................. SKIPPED
> [INFO] Apache Hadoop KMS .................................. SKIPPED
> ...
> ...
> ...
> [INFO] Apache Hadoop Cloud Storage Project ................ SKIPPED
> [INFO] Apache Hadoop Ozone Acceptance Tests 3.2.0-SNAPSHOT  SKIPPED
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time: 29:27 min
> [INFO] Finished at: 2018-09-02T06:00:21-07:00
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-surefire-plugin:2.21.0:test (default-test) on 
> project hadoop-common: There are test failures.
> [ERROR]
> [ERROR] Please refer to 
> /Users/svaddi/SreeVaddi/sources/github/sreev/hadoop/hadoop-common-project/hadoop-common/target/surefire-reports
>  for the individual test results.
> [ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, 
> [date].dumpstream and [date]-jvmRun[N].dumpstream.
> [ERROR] -> [Help 1]
> [ERROR]


