Re: QueueMetrics.AppsKilled/Failed metrics and failure reasons

2015-02-03 Thread Suma Shivaprasad
Using hadoop 2.4.0. The number of applications running on average is small, ~40-60. The metrics in Ganglia show around 10-30 apps killed every 5 mins, which is very high relative to the apps running at any given time (40-60). The RM audit logs, though, show 0 failed apps during that hour. The RM UI

Can I configure multiple M/Rs and normal processes to one workflow?

2015-02-03 Thread 임정택
Hello all. We periodically scan HBase tables to aggregate statistics and store them in MySQL. We have 3 kinds of CP (a kind of data source), each with one Channel and one Article table. (Channel : Article is a 1:N relation.) All the CPs' table schemas differ a bit, so in order to

RE: QueueMetrics.AppsKilled/Failed metrics and failure reasons

2015-02-03 Thread Rohith Sharma K S
Hi, could you give more information: which version of hadoop are you using? The QueueMetrics.AppsKilled/Failed metrics show much higher numbers, i.e. ~100, but RMAuditLogger shows only 1 or 2 apps as Killed/Failed in the logs. I suspect the logs might have been rolled over. Are more applications

RE: QueueMetrics.AppsKilled/Failed metrics and failure reasons

2015-02-03 Thread Rohith Sharma K S
There are several ways to confirm from YARN the total number of Killed/Failed applications in a cluster: 1. Get the lists from the RM web UI, OR 2. As admin, try this to get the numbers of failed and killed applications: ./yarn application -list -appStates FAILED,KILLED 3. Use the client APIs. Since
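The options above can be sketched as concrete commands. The ResourceManager address `rm-host:8088` is a placeholder, and the commands are printed rather than executed, since they need a live cluster:

```shell
#!/bin/sh
# Option 2: CLI listing of all failed and killed applications.
LIST_CMD="yarn application -list -appStates FAILED,KILLED"
# RM web-service alternative: the same list as JSON
# ("rm-host:8088" is a hypothetical ResourceManager address).
REST_CMD="curl 'http://rm-host:8088/ws/v1/cluster/apps?states=FAILED,KILLED'"
echo "$LIST_CMD"
echo "$REST_CMD"
```

Counting the entries of either listing gives a number to compare against the QueueMetrics values.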

Pass lib jars when invoking an hadoop program

2015-02-03 Thread xeonmailinglist
Hi, I am trying to run |distcp| using a java class, but I get a class-not-found error for |DistCpOptions|. I have used the argument |-libjars ./share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar| to pass the jar file, but it seems that is not right. How do I pass the lib properly? Output:

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread xeonmailinglist
I have found the problem. I started to use `webhdfs` and everything is ok. On 03-02-2015 10:40, xeonmailinglist wrote: What do you mean by no path is given? Even if I launch this command, I get the same error…. What path should I put here? |$ hadoop distcp hdfs://hadoop-coc-1:50070/input1
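The likely root cause, for the record: the `hdfs://` scheme talks to the NameNode's RPC port (the one in `fs.defaultFS`, commonly 8020), while 50070 is the NameNode's HTTP port, which only the `webhdfs://` scheme understands. A sketch with the thread's hostnames (commands printed, not run):

```shell
#!/bin/sh
# Fails: hdfs:// pointed at the HTTP port 50070.
BAD="hadoop distcp hdfs://hadoop-coc-1:50070/input1 hdfs://hadoop-coc-2:50070/input1"
# Works: hdfs:// against the RPC port (8020 is a common default; check fs.defaultFS).
OK_RPC="hadoop distcp hdfs://hadoop-coc-1:8020/input1 hdfs://hadoop-coc-2:8020/input1"
# Also works: webhdfs:// against the HTTP port, as the poster found.
OK_HTTP="hadoop distcp webhdfs://hadoop-coc-1:50070/input1 webhdfs://hadoop-coc-2:50070/input1"
echo "$OK_RPC"
echo "$OK_HTTP"
```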

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread Alexander Alten-Lorenz
Hi, Can you please try webhdfs instead of hdfs? - Alexander On 03 Feb 2015, at 12:05, xeonmailinglist xeonmailingl...@gmail.com wrote: Maybe this has to do with this error… I can’t do ls to my own machine using the command below. Can this be related to the other problem? Shouldn't I list

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread Alexander Alten-Lorenz
Ah, good. Cross-posting :) BR, Alex On 03 Feb 2015, at 12:41, xeonmailinglist xeonmailingl...@gmail.com wrote: I have found the problem. I started to use `webhdfs` and everything is ok. On 03-02-2015 10:40, xeonmailinglist wrote: What do you mean by no path is given? Even if I launch

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread Artem Ervits
Another good option is hftp. Artem Ervits On Feb 3, 2015 6:42 AM, xeonmailinglist xeonmailingl...@gmail.com wrote: I have found the problem. I started to use `webhdfs` and everything is ok. On 03-02-2015 10:40, xeonmailinglist wrote: What do you mean by no path is given? Even if I launch

Can not start HA namenode with security enabled

2015-02-03 Thread 郝东
I am converting a secure non-HA cluster into a secure HA cluster. After doing the configuration and starting all the journalnodes, I executed the following commands on the original NameNode: 1. hdfs namenode -initializeSharedEdits #this step succeeded 2. hadoop-daemon.sh start namenode # this step failed.

QueueMetrics.AppsKilled/Failed metrics and failure reasons

2015-02-03 Thread Suma Shivaprasad
Hello, I was trying to debug the reasons for Killed/Failed apps and was checking the RM logs (from RMAuditLogger) for the applications that were killed/failed. The QueueMetrics.AppsKilled/Failed metrics show much higher numbers, i.e. ~100, but RMAuditLogger shows only 1 or 2 apps as Killed/Failed in the logs. Is

IBM JAVA and KerberosTestUtils

2015-02-03 Thread Sangamesh Mallayya
Hi All, This is with respect to JIRA defect HADOOP-10774, related to Kerberos authentication using IBM JAVA. It looks like a lot of changes have been done to properly handle Kerberos authentication using IBM JAVA under JIRA defect HADOOP-9446. But there are still some

Re: Can not start HA namenode with security enabled

2015-02-03 Thread Manoj Samel
Have you added all host-specific principals to the Kerberos database? Thanks, On Tue, Feb 3, 2015 at 7:59 AM, 郝东 donhof...@163.com wrote: I am converting a secure non-HA cluster into a secure HA cluster. After the configuration and started all the journalnodes, I executed the following commands

Re: Pass lib jars when invoking an hadoop program

2015-02-03 Thread xeonmailinglist
Got it. Here's the solution: ``` vagrant@hadoop-coc-1:~/Programs/hadoop$ export HADOOP_CLASSPATH=share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar; hadoop jar wordcount.jar -libjars $HADOOP_HOME/share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar /input1 /outputmp /output1 ``` On 03-02-2015 14:58,
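Why both pieces are needed, as a hedged sketch: HADOOP_CLASSPATH puts the distcp jar on the client JVM's classpath so the driver can load DistCpOptions, while -libjars ships it to the tasks, and -libjars only takes effect if the driver parses its arguments through ToolRunner/GenericOptionsParser. Paths below assume the layout from the thread:

```shell
#!/bin/sh
# Fallback default for HADOOP_HOME is an assumption for illustration.
HADOOP_HOME="${HADOOP_HOME:-/usr/local/hadoop}"
# Client-side classpath: lets the driver class resolve DistCpOptions.
export HADOOP_CLASSPATH="$HADOOP_HOME/share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar"
# -libjars ships the jar to the map/reduce tasks, but only if the jar's
# main class uses ToolRunner/GenericOptionsParser to parse its args.
echo "hadoop jar wordcount.jar -libjars $HADOOP_CLASSPATH /input1 /outputmp /output1"
```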

create Job with java code.

2015-02-03 Thread xeonmailinglist
Hi, I want this because I want to create a dependency between 2 jobs. The first job executes the wordcount example, and the second job copies the output of the wordcount to another HDFS. Therefore, I want to create a job (job 2) that includes the code to copy data to another HDFS. The code is below.
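Before wiring the dependency up in Java (the MapReduce `JobControl`/`ControlledJob` classes support declared dependencies via `addDependingJob`), the same sequencing can be sketched from the shell, since each command exits non-zero on failure; hosts and paths are taken from the thread and are assumptions:

```shell
#!/bin/sh
# Job 1: the wordcount example; Job 2: copy its output to the other
# HDFS, run only if job 1 succeeded. Printed here, not executed,
# since the commands need a live cluster.
STEP1="hadoop jar wordcount.jar /input1 /output1"
STEP2="hadoop distcp webhdfs://hadoop-coc-1:50070/output1 webhdfs://hadoop-coc-2:50070/output1"
echo "$STEP1 && $STEP2"
```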

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread xeonmailinglist
Maybe this has to do with this error… I can’t run |ls| against my own machine using the command below. Can this be related to the other problem? Shouldn't I be able to list the files with this command? |vagrant@hadoop-coc-1:~$ hdfs dfs -ls hdfs://192.168.56.100/ ls: Call From hadoop-coc-1/192.168.56.100 to

Re: tools.DistCp: Invalid arguments

2015-02-03 Thread xeonmailinglist
What do you mean by no path is given? Even if I launch this command, I get the same error…. What path should I put here? |$ hadoop distcp hdfs://hadoop-coc-1:50070/input1 hdfs://hadoop-coc-2:50070/input1| Thanks, On 02-02-2015 19:59, Alexander Alten-Lorenz wrote: Have a closer look:

Re: unsubscribe

2015-02-03 Thread Ram Kumar
Check http://hadoop.apache.org/mailing_lists.html#User Regards, Ramkumar Bashyam On Wed, Jan 7, 2015 at 7:01 PM, Kiran Prasad Gorigay kiranprasa...@imimobile.com wrote: unsubscribe

How to rolling upgrade??

2015-02-03 Thread Mr.J
My cluster A and cluster B. To upgrade to version 2.6, in what order should I upgrade? JournalNode 1 >> JournalNode 2 >> JournalNode 3 >> Namenode Std >> Namenode Act >> Datanode ?? Do I also need to upgrade the zookeeper? hadoop-2.4.1 : JournalNode, Namenode, Datanode
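The documented HDFS rolling-upgrade procedure (supported between 2.4.x and 2.6.x) matches the order guessed above: JournalNodes one at a time, then the standby NameNode, a failover, the former active NameNode, and finally the DataNodes in small batches. A sketch of the control commands, printed rather than executed since they need a live cluster:

```shell
#!/bin/sh
# HDFS rolling-upgrade control commands (available since 2.4.0).
PREPARE="hdfs dfsadmin -rollingUpgrade prepare"    # create a rollback fsimage first
QUERY="hdfs dfsadmin -rollingUpgrade query"        # repeat until the rollback image is ready
FINALIZE="hdfs dfsadmin -rollingUpgrade finalize"  # only after every daemon runs 2.6
echo "$PREPARE"
echo "$QUERY"
echo "$FINALIZE"
```

Between prepare and finalize, restart each daemon with the new version in the order above. ZooKeeper is a separate service with its own release cycle and typically does not need upgrading for a Hadoop 2.4 to 2.6 move.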

unsubscribe

2015-02-03 Thread Kiran Prasad Gorigay
unsubscribe

Re:Re: Can not start HA namenode with security enabled

2015-02-03 Thread 郝东
Hi, I have checked my Kerberos database. All the principals are there. By the way, if I do not enable HA and just enable secure mode, the NameNode starts correctly. At 2015-02-04 01:24:21, Manoj Samel manojsamelt...@gmail.com wrote: Have you added all host specific principals
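One frequent cause of exactly this symptom (secure non-HA works, secure HA fails) is that the JournalNodes need their own Kerberos settings in hdfs-site.xml in addition to the NameNode ones. A sketch of the relevant properties; the keytab path and EXAMPLE.COM realm are placeholders:

```xml
<!-- JournalNode security settings in hdfs-site.xml; the keytab path
     and realm below are placeholders. -->
<property>
  <name>dfs.journalnode.keytab.file</name>
  <value>/etc/security/keytabs/jn.service.keytab</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.principal</name>
  <value>jn/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.journalnode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
```

If these are already in place, the NameNode log around the startup failure usually names the principal it could not authenticate as, which narrows the search further.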