Hi Madhan,

Please find the requested Ranger details below:

Q) Under hadoop lib directory, list of files/links whose names start with
“ranger” or “xa”

cd /usr/hdp/current/hadoop-hdfs-namenode/lib/

lrwxrwxrwx 1 root root      58 Mar 11 02:53 eclipselink-2.5.2-M1.jar -> /usr/local/ranger-hdfs-plugin/lib/eclipselink-2.5.2-M1.jar
lrwxrwxrwx 1 root root      48 Mar 11 02:53 gson-2.2.4.jar -> /usr/local/ranger-hdfs-plugin/lib/gson-2.2.4.jar
lrwxrwxrwx 1 root root      61 Mar 11 02:53 javax.persistence-2.1.0.jar -> /usr/local/ranger-hdfs-plugin/lib/javax.persistence-2.1.0.jar
lrwxrwxrwx 1 root root      62 Mar 11 02:53 ranger-hdfs-plugin-0.4.0.jar -> /usr/local/ranger-hdfs-plugin/lib/ranger-hdfs-plugin-0.4.0.jar
lrwxrwxrwx 1 root root      64 Mar 11 02:53 ranger-plugins-audit-0.4.0.jar -> /usr/local/ranger-hdfs-plugin/lib/ranger-plugins-audit-0.4.0.jar
lrwxrwxrwx 1 root root      65 Mar 11 02:53 ranger-plugins-common-0.4.0.jar -> /usr/local/ranger-hdfs-plugin/lib/ranger-plugins-common-0.4.0.jar
lrwxrwxrwx 1 root root      63 Mar 11 02:53 ranger-plugins-cred-0.4.0.jar -> /usr/local/ranger-hdfs-plugin/lib/ranger-plugins-cred-0.4.0.jar
lrwxrwxrwx 1 root root      40 Mar 11 02:53 mysql-connector-java.jar -> /usr/share/java/mysql-connector-java.jar
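
For completeness, the entries whose names actually start with "ranger" or
"xa" can be listed directly (a quick sketch, assuming the same NameNode lib
directory as above):

# list only entries whose names start with "ranger" or "xa"
# (2>/dev/null hides ls's error if no xa* files exist)
cd /usr/hdp/current/hadoop-hdfs-namenode/lib/
ls -l ranger* xa* 2>/dev/null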

Q) Under hadoop conf directory, list of files whose names start with
“ranger” or “xa”

ls -lrt /etc/hadoop/conf/

-rwxr-xr-x 1 root   root    1612 Mar 11 02:53 set-hdfs-plugin-env.sh
-rw-r--r-- 1 hdfs   hdfs      69 Mar 11 02:53 ranger-security.xml
-rwxr--r-- 1 hdfs   hdfs    4950 Mar 11 02:53 ranger-hdfs-audit.xml
-rwxr--r-- 1 hdfs   hdfs    3295 Mar 11 02:53 ranger-hdfs-security.xml
-rwxr--r-- 1 hdfs   hdfs    2267 Mar 11 02:53 ranger-policymgr-ssl.xml
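
A quick way to confirm just those config files (a sketch, assuming the same
/etc/hadoop/conf path as above):

# show only conf files whose names start with "ranger" or "xa"
ls -l /etc/hadoop/conf/ranger* /etc/hadoop/conf/xa* 2>/dev/null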

Q) Namenode log file(s)

NameNode log file attached (nn.log).


Regards,
Shaik

On 11 March 2015 at 07:02, Madhan Neethiraj <[email protected]>
wrote:

>  Shaik,
>
>  To help further, can you please provide the following details?
>
>    1. Under hadoop lib directory, list of files/links whose name start
>    with “ranger” or “xa"
>    2. Under hadoop conf directory, list of files whose name start with
>    “ranger” or “xa"
>    3. Namenode log file(s)
>
>
>  Madhan
>
>   From: Shaik M <[email protected]>
> Reply-To: "[email protected]" <
> [email protected]>
> Date: Tuesday, March 10, 2015 at 3:53 PM
> To: "[email protected]" <[email protected]>
> Subject: Re: Ranger HDFS Plugin issue
>
>   Hi,
>
>  Please let me know if you have any other solution for the HDFS plugin issue.
>
>  Regards,
> Shaik
>
> On 10 March 2015 at 10:51, Shaik M <[email protected]> wrote:
>
>> Hi Madhan,
>>
>>  I have reinstalled the HDFS plugin, but I am still facing the same
>> issue. Please let me know if you have any other solution.
>>
>>  Thanks,
>> Shaik
>>
>> On 10 March 2015 at 00:01, Madhan Neethiraj <[email protected]>
>> wrote:
>>
>>>  Shaik,
>>>
>>>  The HDFS plugin built from the ‘master’ branch uses different
>>> configuration files than the one built from the 0.4 version. From the error
>>> details, it looks like the HDFS plugin from the ‘master’ branch was
>>> installed in an environment that had an earlier version installed.
>>>
>>>  To fix this issue, please try the following:
>>>
>>>    - from your Hadoop installation lib directory (for example:
>>>    /usr/local/hadoop/lib), remove the Ranger libraries or symbolic links –
>>>    filenames matching the pattern ranger-*.jar (see the sketch after this
>>>    list)
>>>    - reinstall the plugin
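>>>
>>>  A rough sketch of those two steps (paths are examples only; on HDP the
>>> lib directory may be /usr/hdp/current/hadoop-hdfs-namenode/lib, and the
>>> plugin install directory may differ):
>>>
>>> # remove stale Ranger jars/symlinks from the Hadoop lib directory
>>> cd /usr/local/hadoop/lib
>>> rm -f ranger-*.jar
>>>
>>> # then re-run the plugin setup from the plugin install directory
>>> # (review install.properties first, then re-enable the plugin)
>>> cd /usr/local/ranger-hdfs-plugin
>>> ./enable-hdfs-plugin.sh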
>>>
>>> Hope this helps.
>>>
>>>  Madhan
>>>
>>>    From: Shaik M <[email protected]>
>>> Reply-To: "[email protected]" <
>>> [email protected]>
>>> Date: Monday, March 9, 2015 at 3:45 AM
>>>
>>> To: "[email protected]" <[email protected]
>>> >
>>> Subject: Re: Ranger Hive Plugin issue
>>>
>>>   *HDFS Plugin Issue*
>>>
>>>  Hi Bosco,
>>>
>>>  I am doing the HDFS Plugin 0.4 installation as per the Apache Ranger
>>> documentation. That installation procedure points to the
>>> *xasecure-audit.xml* file. But after enabling the plugin, the following
>>> conf files were generated in /etc/hadoop/conf: *ranger-hdfs-audit.xml,
>>> ranger-hdfs-security.xml & ranger-policymgr-ssl.xml*.
>>>
>>>  After restarting the NameNode, the agents are not registered with
>>> Ranger Admin. I went through the NameNode logs and found the following
>>> error:
>>>
>>>  2015-03-09 10:32:19,746 FATAL conf.Configuration (Configuration.java:loadResource(2512)) - error parsing conf file:/etc/hadoop/conf.empty/xasecure-audit.xml
>>> java.io.FileNotFoundException: /etc/hadoop/conf.empty/xasecure-audit.xml (No such file or directory)
>>>         at java.io.FileInputStream.open(Native Method)
>>>         at java.io.FileInputStream.<init>(FileInputStream.java:146)
>>>         at java.io.FileInputStream.<init>(FileInputStream.java:101)
>>>         at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
>>>         at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
>>>         at java.net.URL.openStream(URL.java:1037)
>>>         at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2342)
>>>         at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2410)
>>>         at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2376)
>>>         at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2283)
>>>         at org.apache.hadoop.conf.Configuration.get(Configuration.java:1110)
>>>         at org.apache.hadoop.hdfs.server.namenode.XaSecureFSPermissionChecker.<clinit>(XaSecureFSPermissionChecker.java:57)
>>>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java)
>>>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6515)
>>>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4143)
>>>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:838)
>>>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:821)
>>>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>> 2015-03-09 10:32:19,748 WARN  ipc.Server (Server.java:run(2058)) - IPC Server handler 3 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo from 10.193.153.223:56168 Call#2579 Retry#10
>>> java.lang.ExceptionInInitializerError
>>>
>>>
>>>  Please let me know the solution for this.
>>>
>>>  Regards,
>>> Shaik
>>>
>>> On 9 March 2015 at 18:01, Shaik M <[email protected]> wrote:
>>>
>>>> Sorry, Bosco...
>>>>
>>>>  There was a DB configuration issue; after fixing that, the issue has
>>>> been resolved. Now all access audits are showing in the Audit > Access tab.
>>>>
>>>>  Thank you for your help!
>>>>
>>>>  -Shaik
>>>>
>>>> On 9 March 2015 at 16:51, Shaik M <[email protected]> wrote:
>>>>
>>>>> Hi Bosco,
>>>>>
>>>>>  Thank you again..:)
>>>>>
>>>>>  I have successfully deployed the Hive plugin as per the Ranger
>>>>> documentation. All policies are working fine!
>>>>>
>>>>>  But schema access audits are not showing in the Audit > Access tab.
>>>>>
>>>>>  Please let me know if any configuration changes are required for
>>>>> this.
>>>>>
>>>>>  Regards,
>>>>> Shaik
>>>>>
>>>>> On 9 March 2015 at 14:18, Don Bosco Durai <[email protected]> wrote:
>>>>>
>>>>>>  Hi Shaik
>>>>>>
>>>>>>  For the HDFS and Hive plugins, you shouldn’t need to make any changes
>>>>>> on the Ambari 1.7 side. For HDFS, you might want to check the
>>>>>> hdf.perm.enable property in Ambari; if it is turned off, then you have
>>>>>> to enable it. For HBase you have to make a couple of config changes on
>>>>>> the Apache Ambari side.
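>>>>>>
>>>>>>  (If checking from a NameNode host rather than the Ambari UI, a quick
>>>>>> grep of hdfs-site.xml should show whether permission checking is on;
>>>>>> this is a sketch only, and in stock Hadoop 2.x the property is
>>>>>> dfs.permissions.enabled:)
>>>>>>
>>>>>> # show the permission-check setting in the active HDFS config
>>>>>> grep -A1 -E 'dfs\.permissions\.enabled|hdf\.perm\.enable' /etc/hadoop/conf/hdfs-site.xml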
>>>>>>
>>>>>>  Thanks
>>>>>>
>>>>>>  Bosco
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>   From: Shaik M <[email protected]>
>>>>>> Reply-To: "[email protected]" <
>>>>>> [email protected]>
>>>>>> Date: Sunday, March 8, 2015 at 8:42 PM
>>>>>> To: "[email protected]" <
>>>>>> [email protected]>
>>>>>> Subject: Re: Ranger Hive Plugin issue
>>>>>>
>>>>>>   Hi Bosco,
>>>>>>
>>>>>>  Thank you so much for your suggestions.
>>>>>>
>>>>>>  I am using HDP 2.2 with "Hive 0.14.0.2.2.0.0". I have reverted all
>>>>>> the changes that were based on the XASecure 3.5 documentation.
>>>>>>
>>>>>>  One question: are any changes required for Apache Ranger 0.4 on
>>>>>> HDP 2.2 + Ambari other than those mentioned in the Ranger 0.4
>>>>>> installation procedure?
>>>>>>
>>>>>>  If any changes are required, kindly let me know.
>>>>>>
>>>>>>  Regards,
>>>>>> Shaik M
>>>>>>
>>>>>>
>>>>>> On 9 March 2015 at 13:20, Don Bosco Durai <[email protected]> wrote:
>>>>>>
>>>>>>>  Hi Shaik
>>>>>>>
>>>>>>>  There seem to be multiple issues here. The officially released
>>>>>>> version of Apache Ranger is 0.4, and the instructions to build and
>>>>>>> install it are available at:
>>>>>>>
>>>>>>>
>>>>>>> https://cwiki.apache.org/confluence/display/RANGER/Ranger+Installation+Guide
>>>>>>>
>>>>>>>  Please note, the Apache Ranger 0.4 release is compatible with Hive
>>>>>>> 0.14, so make sure you are using the corresponding versions. The
>>>>>>> install and setup script assumes the folder layout of a standard
>>>>>>> Apache Hive install.
>>>>>>>
>>>>>>>  Also note, you should follow the instructions as per the
>>>>>>> documentation. Updating install.properties and running
>>>>>>> enable-hive-plugin.sh updates the appropriate configuration files, as
>>>>>>> sketched below. It seems you have been trying to update them manually,
>>>>>>> which is not advisable unless your installation folders for Hive or
>>>>>>> other Hadoop components are customized.
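>>>>>>>
>>>>>>>  (A rough sketch of that documented flow; the plugin directory and the
>>>>>>> property names shown are examples, so check install.properties in your
>>>>>>> plugin tarball for the exact keys:)
>>>>>>>
>>>>>>> cd /usr/local/ranger-hive-plugin
>>>>>>> # edit install.properties: point POLICY_MGR_URL at Ranger Admin, set
>>>>>>> # REPOSITORY_NAME to the Hive repository defined there, and fill in
>>>>>>> # the audit settings
>>>>>>> vi install.properties
>>>>>>> # generate the ranger-* conf files and link the plugin jars into Hive
>>>>>>> ./enable-hive-plugin.sh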
>>>>>>>
>>>>>>>  Another note: the error you are getting occurs when Hive is
>>>>>>> configured using the XASecure 3.5 documentation. XASecure 3.5 is the
>>>>>>> erstwhile version of Apache Ranger 0.4, and it is compatible only up
>>>>>>> to Hive 0.13.
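>>>>>>>
>>>>>>>  (A quick way to spot leftover 3.5-style settings; a sketch only,
>>>>>>> assuming the usual HDP config locations:)
>>>>>>>
>>>>>>> # look for stale XASecure hook/authorizer references in the Hive configs
>>>>>>> grep -i xasecure /etc/hive/conf/*.xml 2>/dev/null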
>>>>>>>
>>>>>>>  Thanks
>>>>>>>
>>>>>>>  Bosco
>>>>>>>
>>>>>>>
>>>>>>>   From: Hadoop Solutions <[email protected]>
>>>>>>> Reply-To: "[email protected]" <
>>>>>>> [email protected]>
>>>>>>> Date: Sunday, March 8, 2015 at 7:03 PM
>>>>>>> To: "[email protected]" <
>>>>>>> [email protected]>
>>>>>>> Subject: Ranger Hive Plugin issue
>>>>>>>
>>>>>>>   Hi,
>>>>>>>
>>>>>>>
>>>>>>>  I have configured the Hive Ranger plugin with the help of the HDP
>>>>>>> doc. After configuring the Ranger Hive plugin, I tried to run a query
>>>>>>> from HUE. But I am getting the following exception and am unable to
>>>>>>> run any query.
>>>>>>> getting following exception, unable to run any query.
>>>>>>>
>>>>>>>  *Error occurred executing hive query: Error while compiling
>>>>>>> statement: FAILED: ClassNotFoundException
>>>>>>> com.xasecure.authorization.hive.hooks.XaSecureSemanticAnalyzerHook*
>>>>>>>
>>>>>>>  Please help me to resolve the issue.
>>>>>>>
>>>>>>>
>>>>>>>  Thanks,
>>>>>>> Shaik
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Attachment: nn.log