Re: Authenticating to Kerberos enabled Hadoop cluster using Java

2015-11-02 Thread Subroto Sanyal
Hi Chhaya,

A few configurations you need to set:

hadoop.security.authentication=kerberos

hadoop.security.authorization=true

dfs.namenode.kerberos.principal=hdfs/had...@hadoop.com

fs.defaultFS=hdfs://host:port

Further, you need to
use org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(String,
String), as suggested in one of the trailing mails.
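
To put the pieces together, here is a minimal sketch (the host, port, realm,
principal, and keytab path below are placeholders, not values from this thread):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class SecureHdfsClient {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The four settings listed above; host, port and realm are placeholders.
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hadoop.security.authorization", "true");
        conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@EXAMPLE.COM");

        // UGI must see the secure configuration before the keytab login.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
            "hdfs/namenode-host@EXAMPLE.COM",
            "/etc/security/keytabs/hdfs.keytab");

        // Any HDFS call made after the login runs as the keytab principal.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
          System.out.println(status.getPath());
        }
        fs.close();
      }
    }

Setting dfs.namenode.kerberos.principal on the client side is usually what avoids
the "Failed to specify server's kerberos principal name" error quoted below.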


Cheers,

Subroto Sanyal

On Mon, Nov 2, 2015 at 3:13 PM, Vishwakarma, Chhaya <
chhaya.vishwaka...@thinkbiganalytics.com> wrote:

> The code is successfully authenticating to Kerberos, but when I try to run any
> HDFS command I get the error "Failed to specify server's Kerberos principal
> name".
>
> Can somebody please assist me on this?
>
> Sent from my Android device.
>
>
> -Original Message-
> From: andreina j 
> To: "user@hadoop.apache.org" 
> Sent: Mon, 02 Nov 2015 4:57 pm
> Subject: RE: Authenticating to Kerberos enabled Hadoop cluster using Java
>
> Hi Chhaya,
>
>
>
> Please find sample code below.
>
>
>
>   System.setProperty("java.security.krb5.conf",
>       "D:\\data\\Desktop\\cluster-test\\krb5.conf");
>
>   // Login using the keytab if you have access to one:
>   UserGroupInformation.loginUserFromKeytab("hdfs/had...@hadoop.com",
>       "D:\\data\\Desktop\\cluster-test\\conf\\hdfs.keytab");
>
>
>
> Note: The above two lines should be at the beginning of your application.
>
>
>
> Regards
>
> Andreina J
>
>
>
> *From:* Vishwakarma, Chhaya [mailto:
> chhaya.vishwaka...@thinkbiganalytics.com]
> *Sent:* 02 November 2015 PM 04:20
> *To:* user@hadoop.apache.org
> *Subject:* RE: Authenticating to Kerberos enabled Hadoop cluster using
> Java
>
>
>
> Thanks, Niranjan. It would be great if you could share sample code, if any.
>
>
>
> *From:* Niranjan Subramanian [mailto:niran...@webaction.com
> ]
> *Sent:* 02 November 2015 16:18
> *To:* user@hadoop.apache.org
> *Subject:* Re: Authenticating to Kerberos enabled Hadoop cluster using
> Java
>
>
>
> Hi Chhaya,
>
>
>
> You can use the UserGroupInformation class from the org.apache.hadoop.security
> package.
>
>
>
> Specifically, the following two methods of that class:
>
>
>
> UserGroupInformation.setConfiguration(hdfsConfiguration);
>
> UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
>
>
>
> Regards,
>
> Niranjan
>
>
>
> On 02-Nov-2015, at 4:15 pm, Vishwakarma, Chhaya <
> chhaya.vishwaka...@thinkbiganalytics.com> wrote:
>
>
>
> I have a Kerberos-enabled Hadoop cluster and need to perform HDFS operations
> using Java code.
>
> I have a keytab file and a username. Can someone please suggest how I can
> authenticate to Kerberos using Java code?
>
> Regards,
>
> Chhaya
>
>
>


Re: Secured Hadoop Cluster with Java 1.7.0_80 - Failure re-login with keytabs

2015-10-08 Thread Subroto Sanyal
Hi Matthew

You can check if you are hitting into:
https://issues.apache.org/jira/browse/HADOOP-10786
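
If it helps to narrow things down, a small diagnostic along these lines (assuming
the process has already logged in from a keytab, as the daemons do; the class
name is just for illustration) shows whether the relogin path is being exercised
at all:

    import org.apache.hadoop.security.UserGroupInformation;

    public class ReloginCheck {
      public static void main(String[] args) throws Exception {
        UserGroupInformation ugi = UserGroupInformation.getLoginUser();
        System.out.println("login user: " + ugi.getUserName()
            + ", fromKeytab=" + ugi.isFromKeytab()
            + ", hasKerberosCredentials=" + ugi.hasKerberosCredentials());

        // Relogin is only attempted when the TGT is close to expiry; on the
        // JDK versions affected by HADOOP-10786 this path is reported to fail.
        ugi.checkTGTAndReloginFromKeytab();
      }
    }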

Cheers,
Subroto Sanyal

On Thu, Oct 8, 2015 at 5:11 PM, Matthew Bruce  wrote:

> Hello Hadoop Users,
>
>
>
> We have been doing Java upgrade testing in one of our Hadoop lab
> environments and have run into issues using Oracle Java 1.7.0_80 with a
> secured Hadoop cluster (since the initial issue, we’ve verified this in a
> second environment too). Basically, once the initial Kerberos ticket has
> expired, all the Hadoop components fail to re-login using their keytab
> files (note that I’ve found nothing in the logs indicating why this
> happens; it seems like the components don’t even attempt to re-login).
> Moreover, I’ve verified that this behavior does not occur with Java
> 1.7.0_79.
>
>
>
> The only thing I’ve been able to find that might be related to / cause this is
> this blurb in the u80 release notes:
>
> *Issues with Third party's JCE Providers*
>
> The fix for JDK-8023069 updated both the SunJSSE and SunJCE providers,
> including some internal interfaces.
>
> Some third party JCE providers (such as RSA JSAFE) are using some sun.*
> internal interfaces, and therefore will not work with the updated SunJSSE
> provider. Such providers will need to be updated in order for them to work
> with the updated SunJSSE provider.
>
>
>
> If you have been impacted by this issue, contact your JCE vendor for an
> update.
>
>
>
> See 8133502 <http://bugs.java.com/view_bug.do?bug_id=8133502>.
>
>
>
> I’m wondering, has anyone else run into issues running a secured Hadoop
> cluster with java 1.7.0_80?
>
>
>
> Thanks,
>
> Matthew Bruce
>
> mbr...@blackberry.com
>
>
>


VCores used is greater than Total VCores

2015-07-03 Thread Subroto Sanyal
Hello,

While running a lot of jobs on a YARN cluster, I noticed the following, which
looked a little unusual to me:
[image: Inline image 1]
VCores Used > VCores Total

The Hadoop version used here is 2.6.0.2.2.0.0-2041.

Is it a bug in YARN (scheduler/UI)?

Cheers,
Subroto Sanyal


Re: Unsubscribe

2014-08-19 Thread Subroto Sanyal
Send a mail to user-unsubscr...@hadoop.apache.org

Cheers,
Subroto Sanyal
On 19 Aug 2014, at 13:40, Vasantha Kumar Kannaki Kaliappan 
 wrote:

> unsubscribe





Re: MR AppMaster unable to load native libs

2014-08-13 Thread Subroto Sanyal
Hi Susheel,

Thanks for your input. I did build the libs for 64-bit, but the problem was
still there. It is resolved now: I had to configure the property
yarn.app.mapreduce.am.env.
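
For reference, a sketch of what that can look like at job-submission time (the
/opt/hadoop path is a placeholder for wherever the 64-bit native libraries live;
the same value can equally go into mapred-site.xml):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class NativeLibJobSetup {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the MR AppMaster's environment at the native lib directory.
        conf.set("yarn.app.mapreduce.am.env",
            "LD_LIBRARY_PATH=/opt/hadoop/lib/native");
        // The map/reduce child JVMs take the analogous properties if they
        // log the same warning.
        conf.set("mapreduce.map.env", "LD_LIBRARY_PATH=/opt/hadoop/lib/native");
        conf.set("mapreduce.reduce.env", "LD_LIBRARY_PATH=/opt/hadoop/lib/native");

        Job job = Job.getInstance(conf, "native-lib-check");
        // ... set mapper/reducer/input/output as usual before job.submit().
      }
    }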

Cheers,
Subroto Sanyal
On 13 Aug 2014, at 10:39, Susheel Kumar Gadalay  wrote:

> I have also got this message when running 2.4.1.
> 
> I have found that the native libraries in $HADOOP_HOME/lib/native are
> 32-bit, not 64-bit.
> 
> Recompile once again and build 64-bit shared objects, but it is a
> lengthy exercise.
> 
> On 8/13/14, Subroto Sanyal  wrote:
>> Hi,
>> 
>> I am running a single-node Hadoop 2.4.1 cluster.
>> When I submit an MR job it logs a warning:
>> 2014-08-12 21:38:22,173 WARN [main] org.apache.hadoop.util.NativeCodeLoader:
>> Unable to load native-hadoop library for your platform... using builtin-java
>> classes where applicable
>> 
>> The problem doesn’t come up when starting any Hadoop daemons.
>> Do I need to pass any specific configuration so that the child JVM is able
>> to pick up the native lib folder?
>> 
>> Cheers,
>> Subroto Sanyal
>> 





MR AppMaster unable to load native libs

2014-08-12 Thread Subroto Sanyal
Hi,

I am running a single-node Hadoop 2.4.1 cluster.
When I submit an MR job it logs a warning:
2014-08-12 21:38:22,173 WARN [main] org.apache.hadoop.util.NativeCodeLoader: 
Unable to load native-hadoop library for your platform... using builtin-java 
classes where applicable

The problem doesn’t come up when starting any Hadoop daemons.
Do I need to pass any specific configuration so that the child JVM is able to
pick up the native lib folder?

Cheers,
Subroto Sanyal

