Yeah (12x the processors, 8x the RAM, 12x the disks...). Let's see how the
patch does on jenkins... (HBASE-18606).
S

On Tue, Oct 3, 2017 at 11:48 AM, Sean Busbey <bus...@apache.org> wrote:

> Is the linux box roughly as capable as your laptop?
>
> On Tue, Oct 3, 2017 at 12:51 PM, Stack <st...@duboce.net> wrote:
> > Tests passed eventually for me:
> >
> > real 47m6.498s
> > user 5m29.671s
> > sys 0m41.885s
> >
> > ... which is a big difference from the macosx run.
> >
> > Need to look into this.
> >
> > St.Ack
> >
> >
> > On Tue, Oct 3, 2017 at 10:03 AM, Stack <st...@duboce.net> wrote:
> >
> >> The below gets us further but now I see that the spark tests take a
> >> really long time to run on linux but complete promptly on macosx
> >> (2m 55s). Looking....
> >>
> >> St.Ack
> >>
> >> On Tue, Oct 3, 2017 at 9:13 AM, Stack <st...@duboce.net> wrote:
> >>
> >>> This seems to work for me. Does it work for you?
> >>>
> >>>
> >>> diff --git a/hbase-spark/pom.xml b/hbase-spark/pom.xml
> >>> index 594aa2a..6d191e3 100644
> >>> --- a/hbase-spark/pom.xml
> >>> +++ b/hbase-spark/pom.xml
> >>> @@ -568,6 +568,9 @@
> >>>            <junitxml>.</junitxml>
> >>>            <filereports>WDF TestSuite.txt</filereports>
> >>>            <parallel>false</parallel>
> >>> +          <systemProperties>
> >>> +            <org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>org.apache.hadoop.hbase.shaded.</org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>
> >>> +          </systemProperties>
> >>>          </configuration>
> >>>          <executions>
> >>>            <execution>
> >>>
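> >>> To try it locally, something like the following should work (the patch
> >>> file name is hypothetical; -pl and -am are standard maven flags to
> >>> build just hbase-spark and the modules it depends on):
> >>>
> >>>   git apply hbase-spark-netty-prefix.patch
> >>>   mvn clean test -pl hbase-spark -am
> >>>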
> >>> St.Ack
> >>>
> >>> On Tue, Oct 3, 2017 at 8:45 AM, Amit Kabra <amitkabrai...@gmail.com>
> >>> wrote:
> >>>
> >>>> Thanks Stack / Sean Busbey for replying.
> >>>>
> >>>> OS: Ubuntu 16.04.2, 64-bit.
> >>>> Eclipse: Neon.3 Release (4.6.3)
> >>>> HBase branch: branch-2
> >>>> Command line test to reproduce: mvn clean package -Dtest=TestIncrementalBackup
> >>>> To reproduce from eclipse, right-click on TestIncBackupRestore in the
> >>>> test class TestIncrementalBackup and run it as junit.
> >>>> No, I am not embedding hbase in my application. I have just checked out
> >>>> hbase, switched to branch-2, and run the unit test from the command line
> >>>> or from eclipse. It fails with the same error in both cases.
> >>>> Yes, the trailing period is also present.
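> >>>>
> >>>> In case it helps as a stopgap, I believe the same property can also be
> >>>> passed explicitly when invoking maven (untested sketch):
> >>>>
> >>>>   mvn clean package -Dtest=TestIncrementalBackup -Dorg.apache.hadoop.hbase.shaded.io.netty.packagePrefix=org.apache.hadoop.hbase.shaded.
> >>>>
> >>>> and, for eclipse, set as a VM argument in the JUnit run configuration.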
> >>>>
> >>>> Thanks,
> >>>> Amit Kabra.
> >>>>
> >>>> On Tue, Oct 3, 2017 at 8:53 PM, Stack <st...@duboce.net> wrote:
> >>>>
> >>>> > Thank you for the detail.
> >>>> >
> >>>> > Pardon the questions below asking for yet more detail. I am unable to
> >>>> > reproduce locally or on another OS (though we see this issue on our
> >>>> > build box).
> >>>> >
> >>>> > What is your OS when you see the below?
> >>>> >
> >>>> > On Tue, Oct 3, 2017 at 2:06 AM, Amit Kabra <amitkabrai...@gmail.com> wrote:
> >>>> >
> >>>> > > Hello,
> >>>> > >
> >>>> > > I am using the "branch-2" branch of hbase. When I run a unit test I
> >>>> > > get the following error for netty: "java.lang.UnsatisfiedLinkError:
> >>>> > > failed to load the required native library"
> >>>> > >
> >>>> > >
> >>>> > > This is running a unit test in your eclipse environment?
> >>>> >
> >>>> > You are trying to run an hbase-spark unit test when you see the above?
> >>>> >
> >>>> >
> >>>> >
> >>>> >
> >>>> > > *I already have the following set in "maven-surefire-plugin" in
> >>>> > > pom.xml as per http://hbase.apache.org/book.html#thirdparty*
> >>>> > >
> >>>> > >
> >>>> > >
> >>>> >
> >>>> > Are you embedding hbase into your application?
> >>>> >
> >>>> >
> >>>> >
> >>>> > >             <systemPropertyVariables>
> >>>> > >                 <!--
> >>>> > >               <test.build.classes>${test.build.classes}</test.build.classes>
> >>>> > >                 -->
> >>>> > >               <!--For shaded netty, to find the relocated .so.
> >>>> > >                    Trick from
> >>>> > >                 https://stackoverflow.com/questions/33825743/rename-files-inside-a-jar-using-some-maven-plugin
> >>>> > >
> >>>> > >                 The netty jar has a .so in it. Shading requires a rename of the .so
> >>>> > >                 and then passing a system property so netty finds the renamed .so
> >>>> > >                 and associates it w/ the relocated netty files.
> >>>> > >
> >>>> > >                 The relocated netty is in the hbase-thirdparty dependency. Just
> >>>> > >                 set this property globally rather than per module.
> >>>> > >                -->
> >>>> > >               <org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>org.apache.hadoop.hbase.shaded.</org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>
> >>>> > >             </systemPropertyVariables>
> >>>> > >
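> >>>> > > As a rough illustration of what that property does (my own sketch of
> >>>> > > the mechanism, not netty's actual loader code): netty prepends the
> >>>> > > prefix, with dots turned to underscores, to the native library name,
> >>>> > > which is why the .so inside the jar has to be renamed to match:
> >>>> > >
> >>>> > >   // Sketch only: the property name is real, the class and the lib
> >>>> > >   // name derivation are illustrative.
> >>>> > >   public class ShadedNettyLibName {
> >>>> > >     public static void main(String[] args) {
> >>>> > >       String prefix = System.getProperty(
> >>>> > >           "org.apache.hadoop.hbase.shaded.io.netty.packagePrefix", "");
> >>>> > >       // "org.apache.hadoop.hbase.shaded." -> "org_apache_hadoop_hbase_shaded_",
> >>>> > >       // so on linux the loader would look for
> >>>> > >       // liborg_apache_hadoop_hbase_shaded_netty_transport_native_epoll.so
> >>>> > >       String libName = prefix.replace('.', '_') + "netty_transport_native_epoll";
> >>>> > >       System.out.println(libName);
> >>>> > >     }
> >>>> > >   }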
> >>>> > >
> >>>> > >
> >>>> > > *And I see in the code that, as per HBASE-18271, all io.netty
> >>>> > > references are already replaced with
> >>>> > > org.apache.hadoop.hbase.shaded.io.netty*
> >>>> > >
> >>>> > >
> >>>> > The trailing period is also present?
> >>>> >
> >>>> >
> >>>> >
> >>>> > >
> >>>> > > If I run a test from eclipse, I see the error immediately and my
> >>>> > > test doesn't run, but when I run from the command line, the test
> >>>> > > runs but I get the error at the end when the mvn command finishes.
> >>>> > >
> >>>> > >
> >>>> > > Does this happen with any test run from eclipse?
> >>>> >
> >>>> > Thank you. Let me try and fix this this morning.
> >>>> >
> >>>> > S
> >>>> >
> >>>> > > *Here is the complete error output.*
> >>>> > >
> >>>> > >
> >>>> > >
> >>>> > > [INFO]
> >>>> > > [INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ hbase-spark ---
> >>>> > > [INFO]
> >>>> > > [INFO] --- scalatest-maven-plugin:1.0:test (test) @ hbase-spark ---
> >>>> > > Discovery starting.
> >>>> > > Discovery completed in 1 second, 558 milliseconds.
> >>>> > > Run starting. Expected test count is: 79
> >>>> > > HBaseDStreamFunctionsSuite:
> >>>> > > Formatting using clusterid: testClusterID
> >>>> > > *** RUN ABORTED ***
> >>>> > >   java.io.IOException: Shutting down
> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:232)
> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1124)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1078)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:949)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:943)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> >>>> > >   at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:41)
> >>>> > >   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> >>>> > >   at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:30)
> >>>> > >   ...
> >>>> > >   Cause: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterorg.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin()I
> >>>> > >   at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:145)
> >>>> > >   at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:217)
> >>>> > >   at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:152)
> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:214)
> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1124)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1078)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:949)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:943)
> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> >>>> > >   ...
> >>>> > >   Cause: java.lang.UnsatisfiedLinkError: failed to load the required native library
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.ensureAvailability(Epoll.java:78)
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.EpollEventLoopGroup.<clinit>(EpollEventLoopGroup.java:38)
> >>>> > >   at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
> >>>> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
> >>>> > >   at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:475)
> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >>>> > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >>>> > >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >>>> > >   at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:140)
> >>>> > >   ...
> >>>> > >   Cause: java.lang.UnsatisfiedLinkError: org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin()I
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin(Native Method)
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Native.<clinit>(Native.java:66)
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:33)
> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.EpollEventLoopGroup.<clinit>(EpollEventLoopGroup.java:38)
> >>>> > >   at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
> >>>> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
> >>>> > >   at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:475)
> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >>>> > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >>>> > >   ...
> >>>> > > [INFO] ------------------------------------------------------------------------
> >>>> > > [INFO] Reactor Summary:
> >>>> > > [INFO]
> >>>> > > [INFO] Apache HBase ....................................... SUCCESS [ 1.575 s]
> >>>> > > [INFO] Apache HBase - Checkstyle .......................... SUCCESS [ 0.317 s]
> >>>> > > [INFO] Apache HBase - Annotations ......................... SUCCESS [ 0.537 s]
> >>>> > > [INFO] Apache HBase - Build Configuration ................. SUCCESS [ 0.053 s]
> >>>> > > [INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [ 15.410 s]
> >>>> > > [INFO] Apache HBase - Common .............................. SUCCESS [ 4.603 s]
> >>>> > > [INFO] Apache HBase - Metrics API ......................... SUCCESS [ 1.213 s]
> >>>> > > [INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [ 0.985 s]
> >>>> > > [INFO] Apache HBase - Metrics Implementation .............. SUCCESS [ 0.863 s]
> >>>> > > [INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [ 1.750 s]
> >>>> > > [INFO] Apache HBase - Protocol ............................ SUCCESS [ 4.880 s]
> >>>> > > [INFO] Apache HBase - Client .............................. SUCCESS [ 5.233 s]
> >>>> > > [INFO] Apache HBase - Replication ......................... SUCCESS [ 1.040 s]
> >>>> > > [INFO] Apache HBase - Prefix Tree ......................... SUCCESS [ 1.121 s]
> >>>> > > [INFO] Apache HBase - Procedure ........................... SUCCESS [ 1.084 s]
> >>>> > > [INFO] Apache HBase - Resource Bundle ..................... SUCCESS [ 0.092 s]
> >>>> > > [INFO] Apache HBase - Server .............................. SUCCESS [ 19.849 s]
> >>>> > > [INFO] Apache HBase - MapReduce ........................... SUCCESS [ 4.221 s]
> >>>> > > [INFO] Apache HBase - Testing Util ........................ SUCCESS [ 3.273 s]
> >>>> > > [INFO] Apache HBase - Thrift .............................. SUCCESS [ 5.519 s]
> >>>> > > [INFO] Apache HBase - RSGroup ............................. SUCCESS [ 3.408 s]
> >>>> > > [INFO] Apache HBase - Shell ............................... SUCCESS [ 3.859 s]
> >>>> > > [INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [ 4.038 s]
> >>>> > > [INFO] Apache HBase - Backup .............................. SUCCESS [01:13 min]
> >>>> > > [INFO] Apache HBase - Integration Tests ................... SUCCESS [ 4.229 s]
> >>>> > > [INFO] Apache HBase - Examples ............................ SUCCESS [ 3.471 s]
> >>>> > > [INFO] Apache HBase - Rest ................................ SUCCESS [ 4.448 s]
> >>>> > > [INFO] Apache HBase - External Block Cache ................ SUCCESS [ 2.040 s]
> >>>> > > [INFO] Apache HBase - Spark ............................... FAILURE [ 32.833 s]
> >>>> > > [INFO] Apache HBase - Spark Integration Tests ............. SKIPPED
> >>>> > > [INFO] Apache HBase - Assembly ............................ SKIPPED
> >>>> > > [INFO] Apache HBase - Shaded .............................. SKIPPED
> >>>> > > [INFO] Apache HBase - Shaded - Client ..................... SKIPPED
> >>>> > > [INFO] Apache HBase - Shaded - MapReduce .................. SKIPPED
> >>>> > > [INFO] Apache HBase Shaded Packaging Invariants ........... SKIPPED
> >>>> > > [INFO] Apache HBase - Archetypes .......................... SKIPPED
> >>>> > > [INFO] Apache HBase - Exemplar for hbase-client archetype . SKIPPED
> >>>> > > [INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SKIPPED
> >>>> > > [INFO] Apache HBase - Archetype builder ................... SKIPPED
> >>>> > > [INFO] ------------------------------------------------------------------------
> >>>> > > [INFO] BUILD FAILURE
> >>>> > > [INFO] ------------------------------------------------------------------------
> >>>> > > [INFO] Total time: 03:26 min
> >>>> > > [INFO] Finished at: 2017-09-27T19:34:35+05:30
> >>>> > > [INFO] Final Memory: 345M/6055M
> >>>> > > [INFO] ------------------------------------------------------------------------
> >>>> > > [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project hbase-spark: There are test failures -> [Help 1]
> >>>> > > [ERROR]
> >>>> > > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >>>> > > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>> > > [ERROR]
> >>>> > > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> >>>> > > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> >>>> > > [ERROR]
> >>>> > > [ERROR] After correcting the problems, you can resume the build with the command
> >>>> > > [ERROR]   mvn <goals> -rf :hbase-spark
> >>>> > >
> >>>> >
> >>>>
> >>>
> >>>
> >>
>
