[ https://issues.apache.org/jira/browse/AMBARI-18393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15492273#comment-15492273 ]
Hudson commented on AMBARI-18393:
---------------------------------

SUCCESS: Integrated in Jenkins build Ambari-branch-2.5 #37 (See [https://builds.apache.org/job/Ambari-branch-2.5/37/])
AMBARI-18393. Hive Server Interactive (HSI) fails to start with (sshridhar: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=7594857e40da29baa3ff80832b04b57e4ffcf90d])
* (edit) ambari-server/src/test/python/stacks/2.5/HIVE/test_hive_server_int.py
* (edit) ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive_interactive.py

> Hive Server Interactive (HSI) fails to start with 'Permission denied' for User Hive, if HSI starts before HS2.
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-18393
>                 URL: https://issues.apache.org/jira/browse/AMBARI-18393
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.4.0
>            Reporter: Swapan Shridhar
>            Assignee: Swapan Shridhar
>             Fix For: 2.5.0
>
>
> - Install a cluster using a Blueprint that includes HiveServerInteractive.
> - Service startup fails at the Hive Server Interactive start step with the error below:
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 535, in <module>
>     HiveServerInteractive().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 115, in start
>     self.setup_security()
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 335, in setup_security
>     Execute(slider_keytab_install_cmd, user=params.hive_user)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of 'slider install-keytab --keytab /etc/security/keytabs/hive.llap.zk.sm.keytab --folder hive --overwrite' returned 56.
> 2016-08-15 23:47:57,518 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm HWQE.HORTONWORKS.COM
> 2016-08-15 23:47:59,108 [main] INFO  impl.TimelineClientImpl - Timeline service address: http://nat-s11-4-lkws-stackdeploy-3.openstacklocal:8188/ws/v1/timeline/
> 2016-08-15 23:48:01,584 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
> 2016-08-15 23:48:01,633 [main] INFO  client.RMProxy - Connecting to ResourceManager at nat-s11-4-lkws-stackdeploy-5.openstacklocal/172.22.71.181:8050
> 2016-08-15 23:48:01,983 [main] INFO  client.AHSProxy - Connecting to Application History server at nat-s11-4-lkws-stackdeploy-3.openstacklocal/172.22.71.168:10200
> 2016-08-15 23:48:03,297 [main] WARN  client.SliderClient - The 'install-keytab' option has been deprecated. Please use 'keytab --install'.
> 2016-08-15 23:48:03,440 [main] WARN  retry.RetryInvocationHandler - Exception while invoking ClientNamenodeProtocolTranslatorPB.mkdirs over null. Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hive, access=WRITE, inode="/user/hive/.slider/keytabs/hive":hdfs:hdfs:drwxr-xr-x
> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1811)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1794)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4011)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1102)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:630)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
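The AccessControlException in the log is the NameNode's standard POSIX-style permission check: `/user/hive` is owned by `hdfs:hdfs` with mode `drwxr-xr-x`, so the `hive` user (neither the owner nor, here, in group `hdfs`) falls into the "other" class, which lacks the write bit needed for `mkdirs` of `.slider/keytabs/hive`. A minimal Python sketch of that check follows; it is illustrative only, not the actual `FSPermissionChecker` code, and the user/group values are assumptions for the example:

```python
# Illustrative sketch of the owner/group/other permission check HDFS
# applies in FSPermissionChecker -- not the actual Hadoop code.
READ, WRITE, EXECUTE = 4, 2, 1

def check_access(user, groups, owner, group, mode, requested):
    """mode is the octal permission bits, e.g. 0o755 for drwxr-xr-x."""
    if user == owner:
        bits = (mode >> 6) & 7   # owner class
    elif group in groups:
        bits = (mode >> 3) & 7   # group class
    else:
        bits = mode & 7          # other class
    return bits & requested == requested

# /user/hive is owned by hdfs:hdfs with drwxr-xr-x (0o755); assuming the
# hive user is not in group hdfs, WRITE is denied -- the same outcome as
# the 'Permission denied' mkdirs failure in the log above.
print(check_access("hive", {"hadoop"}, "hdfs", "hdfs", 0o755, WRITE))  # False
print(check_access("hdfs", {"hdfs"}, "hdfs", "hdfs", 0o755, WRITE))    # True
```

This is also why the ordering matters: when HS2 starts first it sets up the hive user's HDFS home directory, so by the time `slider install-keytab` runs as `hive` the write succeeds; starting HSI first leaves `/user/hive` in a state the hive user cannot write to.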