RE: get job data in command line in MRv2

2014-01-02 Thread German Florez-Larrahondo
Perhaps some of the data you require can be obtained with the mapred tool
(here I’m using 2.2.0)

 

e.g.

 

 

htf@ivygerman:~/hadoop/bin$ ./mapred job

Usage: CLI <command> <args>
	[-submit <job-file>]
	[-status <job-id>]
	[-counter <job-id> <group-name> <counter-name>]
	[-kill <job-id>]
	[-set-priority <job-id> <priority>]. Valid values for priorities
	are: VERY_HIGH HIGH NORMAL LOW VERY_LOW
	[-events <job-id> <from-event-#> <#-of-events>]
	[-history <jobHistoryFile>]
	[-list [all]]
	[-list-active-trackers]
	[-list-blacklisted-trackers]
	[-list-attempt-ids <job-id> <task-type> <task-state>]. Valid values
	for <task-type> are REDUCE MAP. Valid values for <task-state> are running,
	completed
	[-kill-task <task-attempt-id>]
	[-fail-task <task-attempt-id>]
	[-logs <job-id> <task-attempt-id>]

 

Generic options supported are
	-conf <configuration file>                      specify an application configuration file
	-D <property=value>                             use value for given property
	-fs <local|namenode:port>                       specify a namenode
	-jt <local|jobtracker:port>                     specify a job tracker
	-files <comma separated list of files>          specify comma separated files to be copied to the map reduce cluster
	-libjars <comma separated list of jars>         specify comma separated jar files to include in the classpath.
	-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

 

The general command line syntax is

bin/hadoop command [genericOptions] [commandOptions]

 

htf@ivygerman:~/hadoop/bin$ ./mapred job -list

14/01/02 11:29:54 INFO client.RMProxy: Connecting to ResourceManager at
ivy02.spa.sarc.sas/172.31.204.11:8032

Total jobs:1

                  JobId      State       StartTime   UserName      Queue   Priority   UsedContainers   RsvdContainers   UsedMem   RsvdMem   NeededMem   AM info
 job_1388162829887_0050    RUNNING   1388678456812        htf   root.htf     NORMAL              300               20   307712M    20480M     328192M   ivygerman-192-168-10-12.htfcluster.org:8088/proxy/application_1388162829887_0050/

htf@ivygerman-192-168-10-12:~/hadoop/bin$
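For scripting, the -list output can be parsed with standard Unix tools. A minimal sketch, using the sample output above as stand-in data (on a live cluster you would capture `mapred job -list` directly; the variable names here are illustrative):

```shell
# Count RUNNING jobs by parsing `mapred job -list` output.
# The here-doc stands in for live output; on a real cluster use:
#   list_output="$(mapred job -list 2>/dev/null)"
list_output=$(cat <<'EOF'
Total jobs:1
                  JobId      State       StartTime   UserName
 job_1388162829887_0050    RUNNING   1388678456812        htf
EOF
)
# The second whitespace-separated column of each data row is the job state.
running=$(printf '%s\n' "$list_output" | awk '$2 == "RUNNING" {n++} END {print n+0}')
echo "Running jobs: $running"
```

The same pattern works for any of the states the CLI reports (e.g. PREP for jobs that have been submitted but not yet started).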

 

 

 

From: Azuryy Yu [mailto:azury...@gmail.com] 
Sent: Monday, December 30, 2013 10:42 PM
To: user@hadoop.apache.org
Subject: Re: get job data in command line in MRv2

 

Generally, MRv2 indicates YARN. You can try:

yarn application

which prints the full help listing.

 

On Tue, Dec 31, 2013 at 12:32 PM, 小网客 smallnetvisi...@foxmail.com
wrote:

Use the web UI, or the hadoop job command, e.g. hadoop job -list

 

--

-

Best wishes!

小网客

Blog: http://snv.iteye.com/

Email: 1134687...@qq.com

 

 

 

-- Original --
From: xeon <psdc1...@gmail.com>
Date: Tue, Dec 31, 2013 03:19 AM
To: user <user@hadoop.apache.org>
Subject: get job data in command line in MRv2

 

Hi,

I would like to know if MRv2 provides the following commands through
the bash command line:
- get the number of jobs running?
- get the completion percentage of each job?
- get the number of jobs waiting to be submitted?

-- 
Thanks,


 



get job data in command line in MRv2

2013-12-30 Thread xeon

Hi,

I would like to know if MRv2 provides the following commands through
the bash command line:

- get the number of jobs running?
- get the completion percentage of each job?
- get the number of jobs waiting to be submitted?

--
Thanks,



Re: get job data in command line in MRv2

2013-12-30 Thread Azuryy Yu
Generally, MRv2 indicates YARN. You can try:

yarn application

which prints the full help listing.
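The thread's three questions can all be approximated from `yarn application -list` output. A sketch under stated assumptions: the column layout below is illustrative (modeled on the Hadoop 2.x yarn CLI, which reports a State and a Progress column per application), not captured from a real cluster; on a live system you would assign out="$(yarn application -list)" instead of using the here-doc:

```shell
# Parse `yarn application -list`-style output to answer the questions.
# Sample data below is illustrative; column layout assumed from Hadoop 2.x.
out=$(cat <<'EOF'
Total number of applications:2
                Application-Id    Application-Name       State    Progress
application_1388162829887_0050           wordcount     RUNNING         42%
application_1388162829887_0051                sort    ACCEPTED          0%
EOF
)
# 1) number of applications currently running
printf '%s\n' "$out" | grep -c ' RUNNING '
# 2) completion percentage of each application (last column is Progress)
printf '%s\n' "$out" | awk '/^application_/ {print $1, $NF}'
# 3) applications accepted by the scheduler but not yet running
printf '%s\n' "$out" | grep -c ' ACCEPTED '
```

Note that ACCEPTED counts applications already submitted and queued; jobs a client has not yet submitted are not visible to the ResourceManager at all.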


On Tue, Dec 31, 2013 at 12:32 PM, 小网客 smallnetvisi...@foxmail.com wrote:

 Use the web UI, or the hadoop job command, e.g. hadoop job -list

 --
 -
 Best wishes!
 小网客
 Blog: http://snv.iteye.com/
 Email: 1134687...@qq.com



 -- Original --
 From: xeon <psdc1...@gmail.com>
 Date: Tue, Dec 31, 2013 03:19 AM
 To: user <user@hadoop.apache.org>
 Subject: get job data in command line in MRv2

 Hi,

 I would like to know if MRv2 provides the following commands through
 the bash command line:
 - get the number of jobs running?
 - get the completion percentage of each job?
 - get the number of jobs waiting to be submitted?

 --
 Thanks,
