Re: kerberos security enabled and hadoop/hdfs/mapred users

2012-05-03 Thread Ravi Prakash
Yes Koert, that is correct!

On Thu, May 3, 2012 at 6:08 PM, Koert Kuipers wrote:
> Do I understand it correctly that with Kerberos enabled, the mappers and
> reducers will be "run as" the actual user that started them, as opposed to
> the user that runs the TaskTracker, which is mapred or hadoop?
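For anyone who wants to confirm this on their own secure cluster, here is a minimal sketch (the principal, example jar, and worker hostname are placeholders, not from the original thread): authenticate as the end user, submit a job, and check which Unix account owns the task JVMs on a worker node.

# Hypothetical check; assumes a Kerberos-enabled Hadoop 1.x-style cluster.
kinit koert@EXAMPLE.COM                      # authenticate as the submitting user
hadoop jar hadoop-examples.jar pi 10 1000    # submit any MapReduce job
# While the job runs, the task JVMs on a worker should be owned by "koert",
# not by the mapred/hadoop account that runs the TaskTracker:
ssh worker01 'ps -eo user,args | grep [o]rg.apache.hadoop.mapred.Child'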

Re: Debug MR tasks impossible.

2012-03-29 Thread Ravi Prakash
Hi Pedro,

I know it's sub-optimal, but you should be able to put in as many System.out.println / log messages as you want, and you should be able to see them in the tasks' stdout and syslog files. Which version of Hadoop are you using?

On Thu, Mar 29, 2012 at 10:33 AM, Pedro Costa wrote:
> Hi,
>
> I'm try
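For completeness, a rough sketch of where those messages usually end up on a TaskTracker node; the exact log layout varies by Hadoop version, so the directory and the attempt id below are assumptions, not taken from the thread.

# Assumes a 1.x-style install with logs under /var/log/hadoop; adjust to your HADOOP_LOG_DIR.
ATTEMPT=attempt_201203290001_0001_m_000000_0      # hypothetical task attempt id
ls /var/log/hadoop/userlogs/                      # per-job / per-attempt log directories
cat /var/log/hadoop/userlogs/*/$ATTEMPT/stdout    # System.out.println output
cat /var/log/hadoop/userlogs/*/$ATTEMPT/syslog    # log4j / commons-logging output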

Re: Other than hadoop

2012-01-30 Thread Ravi Prakash
Sector-Sphere

On Mon, Jan 30, 2012 at 4:24 PM, Ronald Petty wrote:
> R.V.,
>
> Are you looking for platforms that do distributed computation, or the
> larger ecosystems like programming APIs, etc.?
>
> Here are some platforms:
>
> C-Squared
> Globus
> Condor
>
> Here are some libraries:
>

Re: enabling capacity scheduler in 0.24

2012-01-11 Thread Ravi Prakash
You probably want to add a capacity-scheduler.xml to define the queues.

On Tue, Jan 10, 2012 at 2:46 PM, Ann Pal wrote:
> Hi,
> Are the following the only steps needed to turn on the capacity scheduler?
> [1] Edit conf/yarn-site.xml to include:
> yarn.resourcemanager.scheduler.class -
> org.apache.hadoop.yar
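A minimal sketch of such a file, assuming the YARN capacity-scheduler property names of that era (verify them against your release); the single "default" queue at 100% capacity is just a placeholder to get the scheduler to start, not a recommended layout.

# Write a bare-bones capacity-scheduler.xml into the conf directory.
cat > $HADOOP_CONF_DIR/capacity-scheduler.xml <<'EOF'
<configuration>
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>default</value>
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.default.capacity</name>
    <value>100</value>
  </property>
</configuration>
EOF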

Re: Yarn Container Limit

2012-01-10 Thread Ravi Prakash
From what I know, the number of containers will depend on the amount of resources your node has. If it has 8 GB of RAM and each container gets 2 GB, then there will be a maximum of 4 containers.

On Tue, Jan 10, 2012 at 5:44 AM, raghavendhra rahul <raghavendhrara...@gmail.com> wrote:
> Hi,
>
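As a back-of-the-envelope sketch of that arithmetic; the property named in the comment is my assumption about the relevant NodeManager setting, and the numbers simply mirror the 8 GB / 2 GB example.

NODE_MB=8192         # e.g. yarn.nodemanager.resource.memory-mb on the node
CONTAINER_MB=2048    # memory requested per container
echo $(( NODE_MB / CONTAINER_MB ))   # prints 4, i.e. at most 4 concurrent containers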

Re: Cannot start yarn daemons

2012-01-09 Thread Ravi Prakash
Hi,

Clearly the jar file containing the class "org/apache/hadoop/conf/Configuration" is not available on the CLASSPATH. Did you build Hadoop properly? The way I would usually check this is:

1. I have a shell script, findclass:

#!/bin/sh
LOOK_FOR=$1
if [ -z $LOOK_FOR ]; then
  echo -e "Usage:
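The rest of the script is cut off in the archive; a minimal sketch of what such a findclass helper might look like (reconstructed by assumption, not the original script) is:

#!/bin/sh
# Sketch of a findclass-style helper: greps every jar under the current
# directory for the class name passed as the first argument.
LOOK_FOR=$1
if [ -z "$LOOK_FOR" ]; then
  echo -e "Usage: $0 <class name, e.g. org/apache/hadoop/conf/Configuration>"
  exit 1
fi
for jar in $(find . -name '*.jar'); do
  # 'unzip -l' lists a jar's entries without extracting it.
  if unzip -l "$jar" 2>/dev/null | grep -q "$LOOK_FOR"; then
    echo "Found in: $jar"
  fi
done

Running it as "findclass org/apache/hadoop/conf/Configuration" from the top of the Hadoop install should then show which jar (if any) provides the missing class, and that jar is what needs to be on the CLASSPATH.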