Thanks Ted
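
For anyone who finds this thread later: the HBase tests start a mini-cluster, which can exhaust the per-process open file limit (the default soft limit on macOS is typically only 256). Below is a minimal sketch of the adjustment Ted suggests, run in the same shell before retrying the build; the value 4096 is my assumption, not something from this thread:

  # check the current soft limit on open file descriptors
  ulimit -n

  # raise it for this shell session only (4096 is an assumed value;
  # anything comfortably above the default should be enough)
  ulimit -n 4096

  # re-run the build, optionally resuming from the failed module
  mvn clean package
  # or: mvn package -rf :beam-sdks-java-io-hbase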

On Wed, Jul 5, 2017 at 1:42 PM Ted Yu <yuzhih...@gmail.com> wrote:

> bq. Caused by: java.net.SocketException: Too many open files
>
> Please raise the open file descriptor limit for your shell (ulimit -n).
>
> FYI
>
> On Wed, Jul 5, 2017 at 1:33 PM, Jyotirmoy Sundi <sundi...@gmail.com>
> wrote:
>
> > Hi folks,
> >
> > Any idea why the build is failing on release-2.0.0? I ran "mvn clean package".
> >
> >
> > *Trace*
> >
> > [INFO] Running org.apache.beam.sdk.io.hbase.HBaseResultCoderTest
> >
> > [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.461 s - in org.apache.beam.sdk.io.hbase.HBaseResultCoderTest
> >
> > [INFO] Running org.apache.beam.sdk.io.hbase.HBaseIOTest
> >
> > [ERROR] Tests run: 17, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 4.504 s <<< FAILURE! - in org.apache.beam.sdk.io.hbase.HBaseIOTest
> >
> > [ERROR] testReadingWithKeyRange(org.apache.beam.sdk.io.hbase.HBaseIOTest)  Time elapsed: 4.504 s  <<< ERROR!
> >
> > java.lang.RuntimeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
> > Wed Jul 05 13:31:23 PDT 2017, RpcRetryingCaller{globalStartTime=1499286683193, pause=100, retries=1}, java.net.SocketException: Too many open files
> >
> > at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:330)
> > at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:292)
> > at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:200)
> > at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:63)
> > at org.apache.beam.sdk.Pipeline.run(Pipeline.java:295)
> > at org.apache.beam.sdk.Pipeline.run(Pipeline.java:281)
> > at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:340)
> > at org.apache.beam.sdk.io.hbase.HBaseIOTest.runReadTestLength(HBaseIOTest.java:418)
> > at org.apache.beam.sdk.io.hbase.HBaseIOTest.testReadingWithKeyRange(HBaseIOTest.java:253)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:321)
> > at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> > at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:393)
> > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> > at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> > at java.lang.Thread.run(Thread.java:745)
> > Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
> > Wed Jul 05 13:31:23 PDT 2017, RpcRetryingCaller{globalStartTime=1499286683193, pause=100, retries=1}, java.net.SocketException: Too many open files
> >
> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:157)
> > at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
> > ... 3 more
> > Caused by: java.net.SocketException: Too many open files
> > at sun.nio.ch.Net.socket0(Native Method)
> > at sun.nio.ch.Net.socket(Net.java:411)
> > at sun.nio.ch.Net.socket(Net.java:404)
> > at sun.nio.ch.SocketChannelImpl.<init>(SocketChannelImpl.java:105)
> > at sun.nio.ch.SelectorProviderImpl.openSocketChannel(SelectorProviderImpl.java:60)
> > at java.nio.channels.SocketChannel.open(SocketChannel.java:145)
> > at org.apache.hadoop.net.StandardSocketFactory.createSocket(StandardSocketFactory.java:62)
> > at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:410)
> > at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
> > at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
> > at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
> > at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
> > at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
> > at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
> > at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
> > at org.apache.hadoop.hbase.client.ClientSmallReversedScanner$SmallReversedScannerCallable.call(ClientSmallReversedScanner.java:298)
> > at org.apache.hadoop.hbase.client.ClientSmallReversedScanner$SmallReversedScannerCallable.call(ClientSmallReversedScanner.java:276)
> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
> > at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364)
> > at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338)
> > at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
> > ... 4 more
> >
> >
> > [INFO] Running org.apache.beam.sdk.io.hbase.SerializableScanTest
> >
> > [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.368 s - in org.apache.beam.sdk.io.hbase.SerializableScanTest
> >
> > [INFO] Running org.apache.beam.sdk.io.hbase.HBaseMutationCoderTest
> >
> > [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.466 s - in org.apache.beam.sdk.io.hbase.HBaseMutationCoderTest
> >
> > [INFO]
> >
> > [INFO] Results:
> >
> > [INFO]
> >
> > [ERROR] Errors:
> >
> > [ERROR]   HBaseIOTest.testReadingWithKeyRange:253->runReadTestLength:418 » Runtime org.a...
> >
> > [INFO]
> >
> > [ERROR] Tests run: 21, Failures: 0, Errors: 1, Skipped: 0
> >
> > [INFO]
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] Reactor Summary:
> >
> > [INFO]
> >
> > [INFO] Apache Beam :: Parent .............................. SUCCESS [ 2.100 s]
> > [INFO] Apache Beam :: SDKs :: Java :: Build Tools ......... SUCCESS [ 2.459 s]
> > [INFO] Apache Beam :: SDKs ................................ SUCCESS [ 0.407 s]
> > [INFO] Apache Beam :: SDKs :: Common ...................... SUCCESS [ 0.224 s]
> > [INFO] Apache Beam :: SDKs :: Common :: Fn API ............ SUCCESS [ 5.013 s]
> > [INFO] Apache Beam :: SDKs :: Common :: Runner API ........ SUCCESS [ 1.707 s]
> > [INFO] Apache Beam :: SDKs :: Java ........................ SUCCESS [ 0.342 s]
> > [INFO] Apache Beam :: SDKs :: Java :: Core ................ SUCCESS [ 54.456 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO .................. SUCCESS [ 0.344 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Common ........ SUCCESS [ 1.860 s]
> > [INFO] Apache Beam :: Runners ............................. SUCCESS [ 0.329 s]
> > [INFO] Apache Beam :: Runners :: Core Construction Java ... SUCCESS [ 6.296 s]
> > [INFO] Apache Beam :: Runners :: Core Java ................ SUCCESS [ 18.418 s]
> > [INFO] Apache Beam :: Runners :: Direct Java .............. SUCCESS [ 25.190 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Elasticsearch . SUCCESS [ 13.994 s]
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions .......... SUCCESS [ 0.285 s]
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions :: Google Cloud Platform Core SUCCESS [ 14.985 s]
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions :: Protobuf SUCCESS [ 5.334 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Google Cloud Platform SUCCESS [ 29.326 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop Common . SUCCESS [ 5.403 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop File System SUCCESS [ 14.641 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop ........ SUCCESS [ 1.484 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop :: input-format SUCCESS [ 6.105 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Hadoop :: jdk1.8-tests SUCCESS [ 39.188 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: HBase ......... FAILURE [ 26.018 s]
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: JDBC .......... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: JMS ........... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Kafka ......... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: Kinesis ....... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: MongoDB ....... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: MQTT .......... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: IO :: XML ........... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes .... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes :: Starter SKIPPED
> > [INFO] Apache Beam :: Examples ............................ SKIPPED
> > [INFO] Apache Beam :: Examples :: Java .................... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes :: Examples SKIPPED
> > [INFO] Apache Beam :: Examples :: Java 8 .................. SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Maven Archetypes :: Examples - Java 8 SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions :: Jackson SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions :: Join library SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Extensions :: Sorter  SKIPPED
> > [INFO] Apache Beam :: Runners :: Google Cloud Dataflow .... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Harness ............. SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Java 8 Tests ........ SKIPPED
> > [INFO] Apache Beam :: SDKs :: Python ...................... SKIPPED
> > [INFO] Apache Beam :: Runners :: Flink .................... SKIPPED
> > [INFO] Apache Beam :: Runners :: Spark .................... SKIPPED
> > [INFO] Apache Beam :: Runners :: Apex ..................... SKIPPED
> > [INFO] Apache Beam :: SDKs :: Java :: Aggregated Javadoc .. SKIPPED
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] BUILD FAILURE
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] Total time: 04:36 min
> >
> > [INFO] Finished at: 2017-07-05T13:31:29-07:00
> >
> > [INFO] Final Memory: 195M/1978M
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.20:test (default-test) on project beam-sdks-java-io-hbase: There are test failures.
> >
> > [ERROR]
> >
> > [ERROR] Please refer to /Users/jsundi/git/beam/sdks/java/io/hbase/target/surefire-reports for the individual test results.
> >
> > [ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump,
> > [date].dumpstream and [date]-jvmRun[N].dumpstream.
> >
> > [ERROR] -> [Help 1]
> >
> > [ERROR]
> >
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >
> > [ERROR]
> >
> > [ERROR] For more information about the errors and possible solutions,
> > please read the following articles:
> >
> > [ERROR] [Help 1]
> > http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> >
> > [ERROR]
> >
> > [ERROR] After correcting the problems, you can resume the build with the
> > command
> >
> > [ERROR]   mvn <goals> -rf :beam-sdks-java-io-hbase
> >
> > MTVL14996a7cf:beam jsundi$ git branch
> >
> >   master
> >
> > * release-2.0.0
> >
> > MTVL14996a7cf:beam jsundi$
> >
> > --
> > Best Regards,
> > Jyotirmoy Sundi
> >
>
-- 
Best Regards,
Jyotirmoy Sundi
