Yes, changing the public DNS name to the local hostname/IP works. I would like to know whether this can be configured to the local hostname (FQDN) during installation itself, so that it need not be changed manually on every restart of the AWS server or whenever the public IP changes.
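For reference, one way to check which name would be picked up: on a stock EC2 instance, `hostname -f` returns the stable private name (ip-172-31-55-219.ec2.internal style), which survives a stop/start, unlike the public DNS name. A minimal sketch; the cluster name "mycluster" and the `configs.sh` helper path are assumptions based on a default Ambari server install:

```shell
# Print the name this host resolves to locally; on EC2 with default DHCP
# settings this is the stable private FQDN, not the changing public one.
PRIVATE_FQDN=$(hostname -f)
echo "$PRIVATE_FQDN"

# Hypothetical follow-up (assumes a default Ambari install and a cluster
# named "mycluster"): push the stable name into core-site via Ambari,
# so it persists across restarts without manual edits.
# /var/lib/ambari-server/resources/scripts/configs.sh set localhost mycluster \
#     core-site "hadoop.proxyuser.hive.hosts" "$PRIVATE_FQDN"
```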
Thanks,
Satya.

From: Benoit Perroud [mailto:ben...@noisette.ch]
Sent: 09 February 2018 15:32
To: user@ambari.apache.org
Subject: Re: Hive Error on restart when public IP is changed

The IP listed in the exception is the instance's private IP. I would change

> hadoop.proxyuser.hive.hosts: ec2-54-197-36-23.compute-1.amazonaws.com

to

> hadoop.proxyuser.hive.hosts: 172.31.55.219

If this still doesn't work, remove the IP and put * instead. A small warning here: I would not open Hive to the whole world and rely only on host filtering, thinking it's secure.

On 09 Feb 2018, at 09:59, Satyanarayana Jampa <sja...@innominds.com> wrote:

Hi,

The below error is observed after restarting the single-node AWS machine:

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 172.31.55.219
        at org.apache.hadoop.ipc.Client.call(Client.java:1427)
        at org.apache.hadoop.ipc.Client.call(Client.java:1358)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)

Scenario:
1. Install HDP on a single-node AWS box.
2. After installation, the following configuration can be seen under Services -> HDFS -> Configs -> Advanced -> Custom core-site:
   a. hadoop.proxyuser.hive.hosts: ec2-54-197-36-23.compute-1.amazonaws.com
   b. hadoop.proxyuser.hive.groups: *
3. Once the AWS machine is restarted, its public IP changes, so the public DNS name that was picked up automatically during installation for the hadoop.proxyuser.hive.hosts property becomes invalid, hence the error.

Can someone please let me know how to overcome this situation?

Thanks,
Satya.
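For reference, Benoit's suggested change corresponds to the following core-site fragment. The value shown is the private IP taken from the stack trace in this thread; the private FQDN would be equally stable, while * would also work but, as he warns, allows the hive superuser to be impersonated from any host:

```xml
<!-- core-site.xml: allow the hive superuser to proxy only from this host.
     172.31.55.219 is the instance's private IP from the thread; a stable
     private FQDN (ip-172-31-55-219.ec2.internal style) works as well. -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>172.31.55.219</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```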