>> Should we make the unit test suite resilient to timeouts in individual
>> tests? It could skip the timed-out test and continue rather than terminate.

+1 on the above initiative.
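For reference, a minimal sketch of how a per-test timeout could be enforced at the JUnit level, so that a hung test fails on its own instead of blocking until Surefire's whole-fork timeout kills the run. This assumes the JUnit 4.7+ rules API is available; the class and method names below are made up for illustration and are not part of the HBase test suite:

import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.Timeout;

public class TestWithPerTestTimeout {

  // Fail any test method in this class that runs longer than 3 minutes,
  // rather than letting it hang indefinitely.
  @Rule
  public Timeout perTestTimeout = new Timeout(180 * 1000);

  // A timeout can also be set on an individual test without the rule.
  @Test(timeout = 60 * 1000)
  public void testSomethingThatMightHang() throws Exception {
    // test body
  }
}

This only bounds a hang inside a single test method; getting Surefire to skip a killed fork and keep running the remaining test classes (the behaviour asked about above) would still need build-side changes.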
On Sun, Jun 26, 2011 at 5:57 PM, Mikhail Bautin <[email protected]> wrote:
> Hello,
>
> I am working on porting some features to HBase trunk, and it looks like
> there are problems with unit tests in the trunk right now. I am getting
> timeouts just like the one shown below, but the timeout does not always
> happen in TestDistributedLogSplitting (e.g. I've seen it happen in
> TestReplication and other classes). What seems strange to me is that a
> single test timeout terminates the whole test suite. Also, apparently the
> continuous integration server does not handle this situation correctly,
> because the page at https://builds.apache.org/job/HBase-TRUNK/1989/ claims
> that there are "no failures", which is far from the truth. It looks like
> this started happening at build 1987, because when I compare
> https://builds.apache.org/job/HBase-TRUNK/1987/testReport/ and
> https://builds.apache.org/job/HBase-TRUNK/1986/testReport/, I see that
> there are significantly fewer packages in the former (newer) than in the
> latter (older) test report.
>
> To summarize, I think this raises the following questions:
>
> * Could the changes in https://builds.apache.org/job/HBase-TRUNK/1987/
> have introduced this problem with unit test timeouts, and how can it be
> fixed?
> * Should we make the unit test suite resilient to timeouts in individual
> tests? It could skip the timed-out test and continue rather than terminate.
> * How can we fix the continuous integration server so that it does not
> report "zero failures" in a case where most of the suite did not even run?
>
> Thanks,
> --Mikhail
>
> On Sat, Jun 25, 2011 at 3:15 PM, Apache Jenkins Server
> <[email protected]> wrote:
>
> See <https://builds.apache.org/job/HBase-TRUNK/1989/changes>
>
> Changes:
>
> [tedyu] HBASE-4025 Server startup fails during startup due to failure in
> loading all table descriptors. (Subbu Iyer via Ted Yu)
>
> [tedyu] HBASE-4028 Hmaster crashes caused by splitting log.
> (gaojinchao via Ted Yu)
>
> [tedyu] HBASE-4029 Inappropriate checking of Logging Mode in HRegionServer
> (Akash Ashok via Ted Yu)
>
> [tedyu] HBASE-451 Remove HTableDescriptor from HRegionInfo
> addendum that fixes TestTableMapReduce
>
> [tedyu] HBASE-4020 "testWritesWhileGetting" unit test needs to be fixed.
> Vandana Ayyalasomayajula via Ted Yu
>
> [tedyu] HBASE-4013 Make ZooKeeperListener Abstract (Akash Ashok via Ted Yu)
>
> ------------------------------------------
> [...truncated 3171 lines...]
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/rest/transform//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/security//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/thrift//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/thrift/generated//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/util//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hadoop/hbase/zookeeper//package-use.html...>
> Generating <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hbase/tmpl/common//package-use.html..
> .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hbase/tmpl/master//package-use.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/org/apache/hbase/tmpl/regionserver//package-use.html.. > .> > Building index for all the packages and classes... > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/overview-tree.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/index-all.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/deprecated-list.html.. > .> > Building index for all classes... > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/allclasses-frame.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/allclasses-noframe.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/index.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/overview-summary.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/help-doc.html.. > .> > Generating < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/site/apidocs/stylesheet.css.. > .> > 80 warnings > [WARNING] Javadoc Warnings > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Configuration > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Configuration > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Configuration > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Connection > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Connection > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HServerAddress.java>:89: > warning - Tag @link: can't find getAddress()#getHostAddress() in > java.net.InetSocketAddress > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HServerInfo.java>:111: > warning - Tag @see: can't find getServerName() in > org.apache.hadoop.hbase.HServerInfo > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HServerInfo.java>:111: > warning - Tag @see: can't find getLoad() in > org.apache.hadoop.hbase.HServerInfo > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/ServerName.java>:42: > warning - Tag @link:illegal character: "34" in "#SERVERNAME_SEPARATOR"" > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/ServerName.java>:42: > warning - Tag @link: can't find SERVERNAME_SEPARATOR" in > org.apache.hadoop.hbase.ServerName > [WARNING] < > 
https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/ServerName.java>:42: > warning - @ink is an unknown tag. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/ServerName.java>:42: > warning - @ink is an unknown tag. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/ServerName.java>:224: > warning - @param argument "rigth" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/TableDescriptors.java>:57: > warning - @param argument "fs" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/TableDescriptors.java>:57: > warning - @param argument "rootdir" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/catalog/CatalogTracker.java>:106: > warning - @param argument "connection" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/catalog/CatalogTracker.java>:122: > warning - @param argument "connection" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/catalog/MetaReader.java>:559: > warning - @param argument "hsi" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/mapreduce/hadoopbackport/InputSampler.java>:165: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:371: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:396: > warning - Tag @link: reference not found: Batch.Call#call(Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/zookeeper/RegionServerTracker.java>:46: > warning - Tag @link: can't find > expireServer(org.apache.hadoop.hbase.HServerInfo) in > org.apache.hadoop.hbase.master.ServerManager > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/filter/BitComparator.java>:52: > warning - @param argument "BitwiseOp" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/executor/RegionTransitionData.java>:115: > warning - @param argument "origin" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/io/hfile/BlockCache.java>:65: > warning - @return tag has no arguments. 
> [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.java>:110: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.java>:90: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/AggregateProtocol.java>:126: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:66: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:99: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:117: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:73: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:79: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:92: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:86: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/ColumnInterpreter.java>:56: > warning - @param argument "value" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/util/Bytes.java>:740: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/util/FSUtils.java>:150: > warning - @return tag cannot be used in method with void return type. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/util/FSUtils.java>:1033: > warning - @param argument "conf" is not a parameter name. 
> [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/HRegion.java>:311: > warning - Tag @see: can't find newHRegion(Path, HLog, FileSystem, > Configuration, org.apache.hadoop.hbase.HRegionInfo, FlushRequester) in > org.apache.hadoop.hbase.regionserver.HRegion > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/HRegion.java>:311: > warning - Tag @link: can't find newHRegion(Path, HLog, FileSystem, > Configuration, org.apache.hadoop.hbase.HRegionInfo, FlushRequester) in > org.apache.hadoop.hbase.regionserver.HRegion > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/HRegion.java>:1066: > warning - Tag @see: can't find internalFlushcache() in > org.apache.hadoop.hbase.regionserver.HRegion > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/HRegion.java>:2438: > warning - @param argument "lockid" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/HRegion.java>:2856: > warning - @param argument "flusher" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/RegionCoprocessorHost.java>:409: > warning - @return tag cannot be used in method with void return type. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitLogWorker.java>:552: > warning - Tag @link: reference not found: SplitLogManager.TaskFinisher > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:84: > warning - Tag @link: can't find execute(OnlineRegions) in > org.apache.hadoop.hbase.regionserver.SplitTransaction > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:84: > warning - Tag @link: can't find rollback(OnlineRegions) in > org.apache.hadoop.hbase.regionserver.SplitTransaction > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:207: > warning - Tag @see: can't find rollback(OnlineRegions) in > org.apache.hadoop.hbase.regionserver.SplitTransaction > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:146: > warning - @param argument "services" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:146: > warning - @param argument "c" is not a parameter name. 
> [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:207: > warning - Tag @link: can't find rollback(OnlineRegions) in > org.apache.hadoop.hbase.regionserver.SplitTransaction > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:207: > warning - Tag @link: can't find rollback(OnlineRegions) in > org.apache.hadoop.hbase.regionserver.SplitTransaction > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/regionserver/SplitTransaction.java>:606: > warning - @return tag cannot be used in method with void return type. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HConnection.java>:195: > warning - Tag @link: can't find getHRegionConnection(InetSocketAddress) in > org.apache.hadoop.hbase.client.HConnection > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HConnectionManager.java>:297: > warning - Tag @link: reference not found: Connection > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTable.java>:962: > warning - End Delimiter } missing for possible See Tag in comment string: > "Turns 'auto-flush' on or off. > [WARNING] <p> > [WARNING] When enabled (default), {@link Put} operations don't get > buffered/delayed > [WARNING] and are immediately executed. Failed operations are not retried. > This is > [WARNING] slower but safer. > [WARNING] <p> > [WARNING] Turning off {@link #autoFlush} means that multiple {@link Put}s > will be > [WARNING] accepted before any RPC is actually sent to do the write > operations. If the > [WARNING] application dies before pending writes get flushed to HBase, data > will be > [WARNING] lost. > [WARNING] <p> > [WARNING] When you turn {@link #autoFlush} off, you should also consider > the > [WARNING] {@link #clearBufferOnFail} option. By default, asynchronous > {@link Put) > [WARNING] requests will be retried on failure until successful. However, > this can > [WARNING] pollute the writeBuffer and slow down batching performance. > Additionally, > [WARNING] you may want to issue a number of Put requests and call > [WARNING] {@link #flushCommits()} as a barrier. In both use cases, consider > setting > [WARNING] clearBufferOnFail to true to erase the buffer after {@link > #flushCommits()} > [WARNING] has been called, regardless of success." 
> [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:396: > warning - Tag @link: reference not found: Batch.Call#call(Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:396: > warning - Tag @link: reference not found: Batch.Call#call(Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTableInterface.java>:432: > warning - Tag @link: reference not found: Batch.Callback#update(byte[], > byte[], Object) > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/HTablePool.java>:98: > warning - @param argument "tableFactory" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/RetriesExhaustedWithDetailsException.java>:83: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/coprocessor/AggregationClient.java>:288: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/coprocessor/AggregationClient.java>:176: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/coprocessor/AggregationClient.java>:354: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/client/coprocessor/LongColumnInterpreter.java>:38: > warning - Tag @link: reference not found: TestAggregateProtocol > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/HMaster.java>:281: > warning - Tag @link: can't find finishInitialization() in > org.apache.hadoop.hbase.master.HMaster > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/MasterFileSystem.java>:177: > warning - @return tag has no arguments. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/ServerManager.java>:250: > warning - @param argument "serverName" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/SplitLogManager.java>:120: > warning - @param argument "services" is not a parameter name. 
> [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/SplitLogManager.java>:120: > warning - @param argument "service" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/master/SplitLogManager.java>:218: > warning - @param argument "logDir" is not a parameter name. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/package-info.java>:367: > warning - @Override is an unknown tag. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/coprocessor/package-info.java>:367: > warning - @Override is an unknown tag. > [WARNING] < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/src/main/java/org/apache/hadoop/hbase/HConstants.java>:445: > warning - Tag @link: reference not found: Configuration > [INFO] Generating "Project License" report. > [INFO] Generating "Source Xref" report. > [INFO] Generating "RAT Report" report. > [INFO] No excludes > [INFO] > ------------------------------------------------------------------------ > [INFO] Building HBase > [INFO] task-segment: [assembly:assembly] (aggregator-style) > [INFO] > ------------------------------------------------------------------------ > [INFO] Preparing assembly:assembly > [INFO] > ------------------------------------------------------------------------ > [INFO] Building HBase > [INFO] > ------------------------------------------------------------------------ > [WARNING] DEPRECATED [tasks]: Use target instead > [INFO] [antrun:run {execution: generate}] > [WARNING] Parameter tasks is deprecated, use target instead > [INFO] Executing tasks > > main: > SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder". > SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further > details. > 2011-06-25 21:47:22.243:INFO::Logging to STDERR via > org.mortbay.log.StdErrLog > [INFO] Executed tasks > [INFO] [build-helper:add-source {execution: add-jspc-source}] > [INFO] Source directory: < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/jspc> added. > [INFO] [build-helper:add-source {execution: add-package-info}] > [INFO] Source directory: < > https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/generated-sources> > added. > [INFO] [jamon:translate {execution: default}] > [INFO] Setting property: classpath.resource.loader.class => > 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'. > [INFO] Setting property: velocimacro.messages.on => 'false'. > [INFO] Setting property: resource.loader => 'classpath'. > [INFO] Setting property: resource.manager.logwhenfound => 'false'. > [INFO] [remote-resources:process {execution: default}] > [INFO] [resources:resources {execution: default-resources}] > [INFO] Using 'UTF-8' encoding to copy filtered resources. 
> [INFO] Copying 1 resource
> [INFO] Copying 6 resources
> [INFO] Copying 3 resources
> [WARNING] DEPRECATED [tasks]: Use target instead
> [INFO] [antrun:run {execution: default}]
> [WARNING] Parameter tasks is deprecated, use target instead
> [INFO] Executing tasks
>
> main:
> [mkdir] Created dir: <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/nativelib>
> [exec] tar: hadoop-snappy-nativelibs.tar: Cannot open: No such file or directory
> [exec] tar: Error is not recoverable: exiting now
> [exec] Result: 2
> [INFO] Executed tasks
> [INFO] [compiler:compile {execution: default-compile}]
> [INFO] Compiling 527 source files to <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/classes>
> [INFO] [resources:testResources {execution: default-testResources}]
> [INFO] Using 'UTF-8' encoding to copy filtered resources.
> [INFO] Copying 4 resources
> [INFO] Copying 3 resources
> [INFO] [compiler:testCompile {execution: default-testCompile}]
> [INFO] Compiling 220 source files to <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/test-classes>
> [INFO] [surefire:test {execution: default-test}]
> [INFO] Surefire report directory: <https://builds.apache.org/job/HBase-TRUNK/ws/trunk/target/surefire-reports>
>
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.hadoop.hbase.master.TestHMasterRPCException
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.37 sec
> Running org.apache.hadoop.hbase.regionserver.TestColumnSeeking
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.58 sec
> Running org.apache.hadoop.hbase.client.TestMultipleTimestamps
> Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.202 sec
> Running org.apache.hadoop.hbase.regionserver.TestMemStoreLAB
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec
> Running org.apache.hadoop.hbase.coprocessor.TestRegionObserverInterface
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.569 sec
> Running org.apache.hadoop.hbase.coprocessor.TestCoprocessorInterface
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.375 sec
> Running org.apache.hadoop.hbase.TestZooKeeper
> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.28 sec
> Running org.apache.hadoop.hbase.regionserver.wal.TestLogRolling
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 123.817 sec
> Running org.apache.hadoop.hbase.io.hfile.TestCachedBlockQueue
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 sec
> Running org.apache.hadoop.hbase.filter.TestPrefixFilter
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.018 sec
> Running org.apache.hadoop.hbase.io.TestImmutableBytesWritable
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 sec
> Running org.apache.hadoop.hbase.io.TestHeapSize
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.034 sec
> Running org.apache.hadoop.hbase.rest.model.TestRowModel
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.036 sec
> Running org.apache.hadoop.hbase.io.hfile.TestHFileSeek
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.125 sec
> Running org.apache.hadoop.hbase.regionserver.TestStore
> Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.72 sec
> Running org.apache.hadoop.hbase.regionserver.TestSplitTransactionOnCluster
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.298 sec
> Running org.apache.hadoop.hbase.zookeeper.TestHQuorumPeer
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.418 sec
> Running org.apache.hadoop.hbase.rest.TestScannersWithFilters
> Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.081 sec
> Running org.apache.hadoop.hbase.io.hfile.TestHFilePerformance
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.199 sec
> Running org.apache.hadoop.hbase.regionserver.TestResettingCounters
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.272 sec
> Running org.apache.hadoop.hbase.regionserver.handler.TestOpenRegionHandler
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.461 sec
> Running org.apache.hadoop.hbase.io.TestHbaseObjectWritable
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.075 sec
> Running org.apache.hadoop.hbase.regionserver.wal.TestHLogSplit
> Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 216.681 sec
> Running org.apache.hadoop.hbase.thrift.TestThriftServer
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.406 sec
> Running org.apache.hadoop.hbase.master.TestDistributedLogSplitting
> killed.
> [INFO] ------------------------------------------------------------------------
> [ERROR] BUILD ERROR
> [INFO] ------------------------------------------------------------------------
> [INFO] Error while executing forked tests.; nested exception is
> org.apache.maven.surefire.booter.shade.org.codehaus.plexus.util.cli.CommandLineException:
> Error while executing external command, process killed.
>
> Process timeout out after 900 seconds
> [INFO] ------------------------------------------------------------------------
> [INFO] For more information, run Maven with the -e switch
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 29 minutes 47 seconds
> [INFO] Finished at: Sat Jun 25 22:15:26 UTC 2011
> [INFO] Final Memory: 113M/1061M
> [INFO] ------------------------------------------------------------------------
> [locks-and-latches] Releasing all the locks
> [locks-and-latches] All the locks released
> Archiving artifacts
> Recording test results
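On the third question above (Jenkins showing "no failures" for build 1989 even though most of the suite never ran): one rough safety net, sketched below, is a post-build check that compares the number of test classes in the source tree with the number of Surefire report files actually produced, and fails the build when some are missing. The paths and the Test*.java / TEST-*.xml naming are assumptions about the HBase trunk layout; treat this as an illustration rather than a drop-in check:

import java.io.File;

// Hypothetical post-build sanity check, not part of the HBase build:
// flag the run when fewer Surefire reports exist than test classes,
// i.e. when the suite was cut short by a killed fork.
public class CheckSuiteCompleteness {
  public static void main(String[] args) {
    File testSrc = new File("src/test/java");           // assumed layout
    File reports = new File("target/surefire-reports");  // assumed layout
    int classes = countFiles(testSrc, "Test", ".java");
    int reported = countFiles(reports, "TEST-", ".xml");
    System.out.println("test classes: " + classes + ", reports: " + reported);
    if (reported < classes) {
      System.err.println("Suite did not run to completion");
      System.exit(1); // non-zero exit makes the CI job visibly fail
    }
  }

  // Recursively count files under dir matching the given prefix/suffix.
  private static int countFiles(File dir, String prefix, String suffix) {
    int n = 0;
    File[] children = dir.listFiles();
    if (children == null) return 0;
    for (File f : children) {
      if (f.isDirectory()) {
        n += countFiles(f, prefix, suffix);
      } else if (f.getName().startsWith(prefix) && f.getName().endsWith(suffix)) {
        n++;
      }
    }
    return n;
  }
}

Jenkins' "Recording test results" step only sees the report XML files that were actually written, so a run that is killed part-way through can still show zero test failures; an explicit completeness check like the one above at least makes the gap visible.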
