RE: Mapreduce Job fails if one Node is offline?

2016-10-24 Thread Mike Wenzel
Hey Alex,

First of all, thanks for your reply.

>> DFS replication has nothing to do with YARN or MapReduce; it's HDFS.
>> Replication defines how many replicas of each block exist in the cluster.
Okay, got it. I mixed things up there.

>> When you kill the NM and you don’t have yarn.nodemanager.recovery.enabled 
>> (https://hadoop.apache.org/docs/r2.7.2/hadoop-yarn/hadoop-yarn-site/NodeManagerRestart.html)
>> set, the containers running on that node get lost or killed, but 
>> your job will likely keep running and wait until that NM comes back.
As far as I understand it, NodeManager restart keeps containers alive until 
the NodeManager is online again, but I don’t see what this has to do with 
my problem.

My problem is:
If one of my nodes is offline, my MapReduce jobs no longer finish successfully. 
I don’t want to wait until the node is online again; I want my jobs to run 
and finish on all my other healthy nodes instead.

I see the same behavior after enabling and configuring NM restart, as shown below.
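For reference, this is roughly what I put into yarn-site.xml to enable it (the 
recovery directory is just a path from my setup, and I pinned the NM port 
because recovery reportedly does not work with an ephemeral port):

  <property>
    <name>yarn.nodemanager.recovery.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.nodemanager.recovery.dir</name>
    <value>/var/hadoop/yarn-nm-recovery</value>
  </property>
  <property>
    <name>yarn.nodemanager.address</name>
    <value>0.0.0.0:45454</value>
  </property>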
The error messages of the failing attempts are:
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed 
with code 111 at
…
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed 
with code 2 at
…
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed 
with code 111 at
…
Task KILL is received. Killing attempt!

If my problem is not clear, please let me know. If I should post some logs, 
just tell me which ones you’re looking for.

Thanks in advance, and best regards.
-- Mike

From: wget.n...@gmail.com [mailto:wget.n...@gmail.com]
Sent: Friday, 21 October 2016 11:53
To: Mike Wenzel ; user@hadoop.apache.org
Subject: RE: Mapreduce Job fails if one Node is offline?

Hey Mike,

DFS replication has nothing to do with YARN or MapReduce; it’s HDFS. Replication 
defines how many replicas of each block exist in the cluster.
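(If you want to double-check what a file is actually replicated at, something 
like the following prints the replication per block; /path is a placeholder:

  hdfs fsck /path -files -blocks
)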
When you kill the NM and you don’t have yarn.nodemanager.recovery.enabled 
(https://hadoop.apache.org/docs/r2.7.2/hadoop-yarn/hadoop-yarn-site/NodeManagerRestart.html)
set, the containers running on that node get lost or killed, but your 
job will likely keep running and wait until that NM comes back.

http://hortonworks.com/blog/resilience-of-yarn-applications-across-nodemanager-restarts/
http://www.cloudera.com/documentation/enterprise/5-4-x/topics/admin_ha_yarn_work_preserving_recovery.html

--alex

--
B: mapredit.blogspot.com

From: Mike Wenzel
Sent: Friday, October 21, 2016 11:29 AM
To: user@hadoop.apache.org
Subject: Mapreduce Job fails if one Node is offline?

I have a small cluster for testing and learning Hadoop:

Node1 - Namenode + ResourceManager + JobhistoryServer
Node2 - SecondaryNamenode
Node3 - Datanode + NodeManager
Node4 - Datanode + NodeManager
Node5 - Datanode + NodeManager

My dfs.replication is set to 2.
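For completeness, that is just the usual entry in hdfs-site.xml:

  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>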

When I kill the DataNode and NodeManager processes on Node5, I expect Hadoop 
to keep running and finish my MapReduce jobs successfully.
In reality the job fails because Hadoop tries to transfer blocks to Node5, which 
is offline. Replication is set to 2, so I expect Hadoop to notice that Node5 is 
offline and simply use the other two nodes instead.

Can someone please explain to me how Hadoop should work in this case?
If my expectation of Hadoop is correct and someone is willing to help me out, I 
can provide logs and configuration.
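For example, I could start with the output of the following command, which (if 
I read the docs right) lists live and dead datanodes and should show whether 
the NameNode has actually noticed that Node5 is gone:

  hdfs dfsadmin -report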

Best Regards,
Mike.



unsubscribe

2016-10-24 Thread Bourre, Marc
unsubscribe


Re: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

2016-10-24 Thread kumar r
Hi,

If I install the policy files, it shows:

GSSException: Failure unspecified at GSS-API level (Mechanism level:
Specified version of key is not available (44))

But without installing the policy files, it works fine with the local
Windows Active Directory.

Thanks,



On Mon, Oct 24, 2016 at 12:28 PM,  wrote:

> Looks like the strong encryption policy file for Java (Oracle) isn’t
> installed. Or you don’t have a valid Kerberos ticket in your cache (klist).
>
>
>
> --
> B: mapredit.blogspot.com
>
>
>
> From: kumar r
> Sent: Monday, October 24, 2016 8:49 AM
> To: user@hadoop.apache.org
> Subject: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled
>
>
> [quoted original message snipped]


RE: unsubscribe

2016-10-24 Thread Brahma Reddy Battula

Drop a mail to user-unsubscr...@hadoop.apache.org





--Brahma Reddy Battula

From: Chen Qiming [mailto:qimin...@usc.edu]
Sent: 24 October 2016 14:53
To: user@hadoop.apache.org
Subject: unsubscribe

unsubscribe


RE: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

2016-10-24 Thread wget.null
Looks like the strong encryption policy file for Java (Oracle) isn’t installed. 
Or you don’t have a valid Kerberos ticket in your cache (klist).
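A quick sketch of both checks on Windows (assuming an Oracle JDK; adjust 
JAVA_HOME to your installation):

  :: 1) Is there a valid Kerberos ticket in the cache?
  klist

  :: 2) Are the JCE Unlimited Strength policy files in place?
  :: The downloaded local_policy.jar and US_export_policy.jar must replace
  :: the default ones under the JRE's security directory:
  dir "%JAVA_HOME%\jre\lib\security\local_policy.jar"
  dir "%JAVA_HOME%\jre\lib\security\US_export_policy.jar"

Without the unlimited-strength versions of those two JARs, Java can’t do 
AES256, which is exactly the error the NameNode web UI shows.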

--
B: mapredit.blogspot.com

From: kumar r
Sent: Monday, October 24, 2016 8:49 AM
To: user@hadoop.apache.org
Subject: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

Hi,
I am trying to configure a secure Hadoop pseudo-node cluster (to verify it 
works properly) in Azure using Azure Domain Services.
OS - Windows Server 2012 R2 Datacenter
Hadoop Version - 2.7.2
I am able to run
hadoop fs -ls /
The example MapReduce job works fine:
yarn jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar 
pi 16 1

But when I run
hdfs fsck /
it gives:

Connecting to namenode via https://node1:50470/fsck?ugi=Kumar=%2F
Exception in thread "main" java.io.IOException: 
org.apache.hadoop.security.authentication.client.AuthenticationException: 
Authentication failed, status: 403, message: GSSException: No valid credentials 
provided (Mechanism level: Failed to find any Kerberos credentails)
    at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:335)
    at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:73)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:152)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:149)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:377)
Caused by: 
org.apache.hadoop.security.authentication.client.AuthenticationException: 
Authentication failed, status: 403, message: GSSException: No valid credentials 
provided (Mechanism level: Failed to find any Kerberos credentails)
    at 
org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
    at 
org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
    at 
org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
    at 
org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
    at 
org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:161)
    at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:333)
    ... 10 more

When I access the NameNode web UI, it shows:


GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption 
type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)


Can someone help me resolve this error and get it working?



unsubscribe

2016-10-24 Thread Chen Qiming
unsubscribe

Re: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

2016-10-24 Thread Chen Qiming
unsubscribe
> On Oct 23, 2016, at 11:49 PM, kumar r  wrote:
> 
> [quoted original message snipped]



Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled

2016-10-24 Thread kumar r
Hi,

I am trying to configure a secure Hadoop pseudo-node cluster (to verify it
works properly) in Azure using Azure Domain Services.

OS - Windows Server 2012 R2 Datacenter
Hadoop Version - 2.7.2

I am able to run
hadoop fs -ls /

The example MapReduce job works fine:
yarn jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 1

But when I run
hdfs fsck /
it gives:

Connecting to namenode via https://node1:50470/fsck?ugi=Kumar=%2F
Exception in thread "main" java.io.IOException: 
org.apache.hadoop.security.authentication.client.AuthenticationException: 
Authentication failed, status: 403, message: GSSException: No valid credentials 
provided (Mechanism level: Failed to find any Kerberos credentails)
    at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:335)
    at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:73)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:152)
    at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:149)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:377)
Caused by: 
org.apache.hadoop.security.authentication.client.AuthenticationException: 
Authentication failed, status: 403, message: GSSException: No valid credentials 
provided (Mechanism level: Failed to find any Kerberos credentails)
    at 
org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
    at 
org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
    at 
org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
    at 
org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
    at 
org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:161)
    at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:333)
    ... 10 more


When I access the NameNode web UI, it shows:

GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption 
type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)

[image: Inline image 1]

Can someone help me resolve this error and get it working?