-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/34172/#review83637
-----------------------------------------------------------
Ship it!

Ship It!

- Dmitro Lisnichenko


On May 13, 2015, 5:34 p.m., Andrew Onischuk wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/34172/
> -----------------------------------------------------------
> 
> (Updated May 13, 2015, 5:34 p.m.)
> 
> 
> Review request for Ambari and Dmitro Lisnichenko.
> 
> 
> Bugs: AMBARI-11112
>     https://issues.apache.org/jira/browse/AMBARI-11112
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> **STR**
> Install a 3-node cluster with HDFS and ZooKeeper.
> Add the HBase service.
> 
> **Result**
> The HBase service check failed the first time after the service was added,
> which failed the whole add-services request.
> 
> 
> 2015-05-08 12:40:20,348 - Error while executing command 'service_check':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py", line 92, in service_check
>     logoutput = True
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 269, in action_run
>     raise ex
> Fail: Execution of ' /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh /etc/hbase/conf ida8c06740_date380815 /usr/hdp/current/hbase-client/bin/hbase' returned 1.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.4130 seconds
> 
> Looking for ida8c06740_date380815
> stdout: /var/lib/ambari-agent/data/output-41.txt
> 
> 2015-05-08 12:38:59,342 - File['/var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh'] {'content': StaticFile('hbaseSmokeVerify.sh'), 'mode': 0755}
> 2015-05-08 12:38:59,530 - Writing File['/var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh'] because it doesn't exist
> 2015-05-08 12:38:59,664 - Changing permission for /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh from 644 to 755
> 2015-05-08 12:38:59,718 - File['/var/lib/ambari-agent/data/tmp/hbase-smoke.sh'] {'content': Template('hbase-smoke.sh.j2'), 'mode': 0755}
> 2015-05-08 12:38:59,849 - Writing File['/var/lib/ambari-agent/data/tmp/hbase-smoke.sh'] because it doesn't exist
> 2015-05-08 12:38:59,984 - Changing permission for /var/lib/ambari-agent/data/tmp/hbase-smoke.sh from 644 to 755
> 2015-05-08 12:39:00,028 - Execute[' /usr/hdp/current/hbase-client/bin/hbase --config /etc/hbase/conf shell /var/lib/ambari-agent/data/tmp/hbase-smoke.sh'] {'logoutput': True, 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 
> ERROR: Table ambarismoketest does not exist.
> 
> Here is some help for this command:
> Start disable of named table:
>   hbase> disable 't1'
>   hbase> disable 'ns1:t1'
> 
> 
> ERROR: Table ambarismoketest does not exist.
> 
> Here is some help for this command:
> Drop the named table. Table must first be disabled:
>   hbase> drop 't1'
>   hbase> drop 'ns1:t1'
> 
> 
> 0 row(s) in 2.3070 seconds
> 
> 2015-05-08 12:39:24,781 ERROR [main] client.AsyncProcess: Failed to get region location
> org.apache.hadoop.hbase.client.NoServerForRegionException: No server address listed in hbase:meta for region ambarismoketest,,1431088757938.1ef95b6ae9a68168cbe2aa482b6026fd.
> containing row row01
>     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1292)
>     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1155)
>     at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:370)
>     at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:321)
>     at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206)
>     at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
>     at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1421)
>     at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1014)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:497)
>     at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:450)
>     at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:311)
>     at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:59)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
>     at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.CallManyArgsNode.interpret(CallManyArgsNode.java:59)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:111)
>     at org.jruby.runtime.InterpretedBlock.evalBlockBody(InterpretedBlock.java:374)
>     at org.jruby.runtime.InterpretedBlock.yield(InterpretedBlock.java:295)
>     at org.jruby.runtime.InterpretedBlock.yieldSpecific(InterpretedBlock.java:229)
>     at org.jruby.runtime.Block.yieldSpecific(Block.java:99)
>     at org.jruby.ast.ZYieldNode.interpret(ZYieldNode.java:25)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:169)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:191)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:302)
>     at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:144)
>     at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:153)
>     at org.jruby.ast.FCallNoArgBlockNode.interpret(FCallNoArgBlockNode.java:32)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.FCallManyArgsNode.interpret(FCallManyArgsNode.java:60)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:165)
>     at org.jruby.RubyClass.finvoke(RubyClass.java:573)
>     at org.jruby.RubyBasicObject.send(RubyBasicObject.java:2801)
>     at org.jruby.RubyKernel.send(RubyKernel.java:2117)
>     at org.jruby.RubyKernel$s$send.call(RubyKernel$s$send.gen:65535)
>     at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:181)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.FCallSpecialArgNode.interpret(FCallSpecialArgNode.java:45)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:111)
>     at org.jruby.runtime.InterpretedBlock.evalBlockBody(InterpretedBlock.java:374)
>     at org.jruby.runtime.InterpretedBlock.yield(InterpretedBlock.java:295)
>     at org.jruby.runtime.InterpretedBlock.yieldSpecific(InterpretedBlock.java:229)
>     at org.jruby.runtime.Block.yieldSpecific(Block.java:99)
>     at org.jruby.ast.ZYieldNode.interpret(ZYieldNode.java:25)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.RescueNode.executeBody(RescueNode.java:216)
>     at org.jruby.ast.RescueNode.interpretWithJavaExceptions(RescueNode.java:120)
>     at org.jruby.ast.RescueNode.interpret(RescueNode.java:110)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:165)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:272)
>     at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:80)
>     at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:89)
>     at org.jruby.ast.FCallSpecialArgBlockNode.interpret(FCallSpecialArgBlockNode.java:42)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.RescueNode.executeBody(RescueNode.java:216)
>     at org.jruby.ast.RescueNode.interpretWithJavaExceptions(RescueNode.java:120)
>     at org.jruby.ast.RescueNode.interpret(RescueNode.java:110)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.CallSpecialArgNode.interpret(CallSpecialArgNode.java:73)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:69)
>     at org.jruby.ast.FCallSpecialArgNode.interpret(FCallSpecialArgNode.java:45)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.CallSpecialArgNode.interpret(CallSpecialArgNode.java:73)
>     at org.jruby.ast.LocalAsgnNode.interpret(LocalAsgnNode.java:123)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:120)
>     at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:134)
>     at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:174)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:282)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:71)
>     at org.jruby.ast.FCallManyArgsNode.interpret(FCallManyArgsNode.java:60)
>     at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
>     at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
>     at org.jruby.ast.RootNode.interpret(RootNode.java:129)
>     at org.jruby.evaluator.ASTInterpreter.INTERPRET_ROOT(ASTInterpreter.java:119)
>     at org.jruby.Ruby.runInterpreter(Ruby.java:724)
>     at org.jruby.Ruby.loadFile(Ruby.java:2489)
>     at org.jruby.runtime.load.ExternalScript.load(ExternalScript.java:66)
>     at org.jruby.runtime.load.LoadService.load(LoadService.java:270)
>     at org.jruby.RubyKernel.loadCommon(RubyKernel.java:1105)
>     at org.jruby.RubyKernel.load(RubyKernel.java:1087)
>     at org.jruby.RubyKernel$s$0$1$load.call(RubyKernel$s$0$1$load.gen:65535)
>     at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:211)
>     at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:207)
>     at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
>     at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
>     at usr.hdp.$2_dot_3_dot_0_dot_0_minus_1865.hbase.bin.hirb.__file__(/usr/hdp/2.3.0.0-1865/hbase/bin/hirb.rb:163)
>     at usr.hdp.$2_dot_3_dot_0_dot_0_minus_1865.hbase.bin.hirb.load(/usr/hdp/2.3.0.0-1865/hbase/bin/hirb.rb)
>     at org.jruby.Ruby.runScript(Ruby.java:697)
>     at org.jruby.Ruby.runScript(Ruby.java:690)
>     at org.jruby.Ruby.runNormally(Ruby.java:597)
>     at org.jruby.Ruby.runFromMain(Ruby.java:446)
>     at org.jruby.Main.doRunFromMain(Main.java:369)
>     at org.jruby.Main.internalRun(Main.java:258)
>     at org.jruby.Main.run(Main.java:224)
>     at org.jruby.Main.run(Main.java:208)
>     at org.jruby.Main.main(Main.java:188)
> 
> ERROR: Failed 1 action: No server address listed in hbase:meta for region ambarismoketest,,1431088757938.1ef95b6ae9a68168cbe2aa482b6026fd. containing row row01: 1 time,
> 
> Here is some help for this command:
> Put a cell 'value' at specified table/row/column and optionally
> timestamp coordinates. To put a cell value into table 'ns1:t1' or 't1'
> at row 'r1' under column 'c1' marked with the time 'ts1', do:
> 
>   hbase> put 'ns1:t1', 'r1', 'c1', 'value'
>   hbase> put 't1', 'r1', 'c1', 'value'
>   hbase> put 't1', 'r1', 'c1', 'value', ts1
>   hbase> put 't1', 'r1', 'c1', 'value', {ATTRIBUTES=>{'mykey'=>'myvalue'}}
>   hbase> put 't1', 'r1', 'c1', 'value', ts1, {ATTRIBUTES=>{'mykey'=>'myvalue'}}
>   hbase> put 't1', 'r1', 'c1', 'value', ts1, {VISIBILITY=>'PRIVATE|SECRET'}
> 
> The same commands also can be run on a table reference.
> Suppose you had a reference t to table 't1', the corresponding command would be:
> 
>   hbase> t.put 'r1', 'c1', 'value', ts1, {ATTRIBUTES=>{'mykey'=>'myvalue'}}
> 
> 
> ROW COLUMN+CELL
> 
> 
> ERROR: No server address listed in hbase:meta for region ambarismoketest,,1431088757938.1ef95b6ae9a68168cbe2aa482b6026fd. containing row
> 
> Here is some help for this command:
> Scan a table; pass table name and optionally a dictionary of scanner
> specifications. Scanner specifications may include one or more of:
> TIMERANGE, FILTER, LIMIT, STARTROW, STOPROW, ROWPREFIXFILTER, TIMESTAMP,
> MAXLENGTH or COLUMNS, CACHE or RAW, VERSIONS
> 
> If no columns are specified, all columns will be scanned.
> To scan all members of a column family, leave the qualifier empty as in 'col_family:'.
> 
> The filter can be specified in two ways:
> 1. Using a filterString - more information on this is available in the
>    Filter Language document attached to the HBASE-4176 JIRA
> 2. Using the entire package name of the filter.
> 
> Some examples:
> 
>   hbase> scan 'hbase:meta'
>   hbase> scan 'hbase:meta', {COLUMNS => 'info:regioninfo'}
>   hbase> scan 'ns1:t1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
>   hbase> scan 't1', {COLUMNS => ['c1', 'c2'], LIMIT => 10, STARTROW => 'xyz'}
>   hbase> scan 't1', {COLUMNS => 'c1', TIMERANGE => [1303668804, 1303668904]}
>   hbase> scan 't1', {REVERSED => true}
>   hbase> scan 't1', {ROWPREFIXFILTER => 'row2', FILTER => "(QualifierFilter (>=, 'binary:xyz')) AND (TimestampsFilter ( 123, 456))"}
>   hbase> scan 't1', {FILTER => org.apache.hadoop.hbase.filter.ColumnPaginationFilter.new(1, 0)}
>   hbase> scan 't1', {CONSISTENCY => 'TIMELINE'}
> For setting the Operation Attributes
>   hbase> scan 't1', { COLUMNS => ['c1', 'c2'], ATTRIBUTES => {'mykey' => 'myvalue'}}
>   hbase> scan 't1', { COLUMNS => ['c1', 'c2'], AUTHORIZATIONS => ['PRIVATE','SECRET']}
> For experts, there is an additional option -- CACHE_BLOCKS -- which
> switches block caching for the scanner on (true) or off (false). By
> default it is enabled. Examples:
> 
>   hbase> scan 't1', {COLUMNS => ['c1', 'c2'], CACHE_BLOCKS => false}
> 
> Also for experts, there is an advanced option -- RAW -- which instructs the
> scanner to return all cells (including delete markers and uncollected deleted
> cells). This option cannot be combined with requesting specific COLUMNS.
> Disabled by default. Example:
> 
>   hbase> scan 't1', {RAW => true, VERSIONS => 10}
> 
> Besides the default 'toStringBinary' format, 'scan' supports custom formatting
> by column. A user can define a FORMATTER by adding it to the column name in
> the scan specification. The FORMATTER can be stipulated:
> 
>  1. either as a org.apache.hadoop.hbase.util.Bytes method name (e.g, toInt, toString)
>  2. or as a custom class followed by method name: e.g. 'c(MyFormatterClass).format'.
> 
> Example formatting cf:qualifier1 and cf:qualifier2 both as Integers:
>   hbase> scan 't1', {COLUMNS => ['cf:qualifier1:toInt', 'cf:qualifier2:c(org.apache.hadoop.hbase.util.Bytes).toInt'] }
> 
> Note that you can specify a FORMATTER by column only (cf:qualifier). You cannot
> specify a FORMATTER for all columns of a column family.
> 
> Scan can also be used directly from a table, by first getting a reference to a
> table, like such:
> 
>   hbase> t = get_table 't'
>   hbase> t.scan
> 
> Note in the above situation, you can still provide all the filtering, columns,
> options, etc as described above.
> 
> 
> 
> 2015-05-08 12:39:29,616 - Execute[' /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh /etc/hbase/conf ida8c06740_date380815 /usr/hdp/current/hbase-client/bin/hbase'] {'logoutput': True, 'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.4750 seconds
> 
> Looking for ida8c06740_date380815
> 2015-05-08 12:39:44,718 - Retrying after 5 seconds. Reason: Execution of ' /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh /etc/hbase/conf ida8c06740_date380815 /usr/hdp/current/hbase-client/bin/hbase' returned 1.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.4750 seconds
> 
> Looking for ida8c06740_date380815
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.6070 seconds
> 
> Looking for ida8c06740_date380815
> 2015-05-08 12:40:02,953 - Retrying after 5 seconds. Reason: Execution of ' /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh /etc/hbase/conf ida8c06740_date380815 /usr/hdp/current/hbase-client/bin/hbase' returned 1.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.6070 seconds
> 
> Looking for ida8c06740_date380815
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.4130 seconds
> 
> Looking for ida8c06740_date380815
> 2015-05-08 12:40:20,348 - Error while executing command 'service_check':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py", line 92, in service_check
>     logoutput = True
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 269, in action_run
>     raise ex
> Fail: Execution of ' /var/lib/ambari-agent/data/tmp/hbaseSmokeVerify.sh /etc/hbase/conf ida8c06740_date380815 /usr/hdp/current/hbase-client/bin/hbase' returned 1.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/hdp/2.3.0.0-1865/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 1.1.0.2.3.0.0-1865, r723b969516d89b62f68a0a19e227757078e268b2, Mon May 4 20:05:10 EDT 2015
> 
> scan 'ambarismoketest'
> ROW COLUMN+CELL
> 0 row(s) in 0.4130 seconds
> 
> Looking for ida8c06740_date380815
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py b774f19 
>   ambari-server/src/test/python/stacks/2.0.6/HBASE/test_hbase_service_check.py 368aa58 
> 
> Diff: https://reviews.apache.org/r/34172/diff/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Andrew Onischuk
> 
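For context on the log above: the `Execute[...]` lines run the smoke-verify script with `'tries': 3, 'try_sleep': 5`, i.e. retry-with-delay semantics, and the "Retrying after 5 seconds" messages are those retries firing before the final failure. A minimal standalone sketch of that behavior (the helper name `run_with_retries` and its signature are illustrative, not the actual resource_management API):

```python
import subprocess
import time


def run_with_retries(cmd, tries=3, try_sleep=5, run=None):
    """Run `cmd` (an argv list), retrying up to `tries` times with
    `try_sleep` seconds between attempts; mirrors the tries/try_sleep
    arguments visible on the Execute[...] log lines. Returns the attempt
    number that succeeded, or raises RuntimeError if every attempt
    returns a non-zero exit code. `run` is injectable for testing and
    defaults to subprocess.call."""
    if run is None:
        run = subprocess.call
    for attempt in range(1, tries + 1):
        if run(cmd) == 0:
            return attempt
        if attempt < tries:
            # Matches the agent's "Retrying after 5 seconds" behavior.
            time.sleep(try_sleep)
    raise RuntimeError(
        "Execution of %r returned non-zero after %d tries" % (cmd, tries))
```

In the failing run, all three attempts of hbaseSmokeVerify.sh returned 1 (the smoke row never appeared because the put failed with NoServerForRegionException), so the final attempt surfaced as the `Fail:` traceback.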
