how to run programs present in the test folder

2009-12-01 Thread Siddu
Hi all, I am interested in exploring the test folder, which is present in src/test/org/apache/hadoop/hdfs/*. Can someone please give me the command-line syntax to run those programs? Also, building the code with the ant command doesn't build the test folder by default. Is there any place(buil
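A hedged sketch of how such tests were typically run in the 0.20-era Ant build. The `test-core` target and `-Dtestcase` property are taken from that era's build.xml and may differ in other versions; `TestDFSShell` is just an illustrative class name:

```shell
# Run inside the Hadoop source root. The Ant invocations are shown as
# comments because they need the full source tree and toolchain:
#
#   ant test-core                          # compile and run the core unit tests
#   ant test-core -Dtestcase=TestDFSShell  # run a single test class by name
#
# Candidate class names to pass via -Dtestcase can be listed from the
# test folder the poster mentions:
SRC=src/test/org/apache/hadoop/hdfs
for f in "$SRC"/Test*.java; do
  [ -e "$f" ] || continue   # the glob matches nothing outside a source tree
  basename "$f" .java       # e.g. TestDFSShell
done
echo "scanned $SRC"
```

Note that the test targets are separate from the default build target, which is why a plain `ant` does not compile the tests.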

Re: "dfsadmin -report" says that i need "Superuser privilege". What? :)

2009-12-01 Thread Siddu
On Tue, Dec 1, 2009 at 7:34 PM, pavel kolodin wrote: > had...@hadoopmaster ~ $ ls -l hadoop-0.20.1 > total 4876 > -rw-rw-r-- 1 hadoop hadoop 344093 Sep 1 20:44 CHANGES.txt > -rw-rw-r-- 1 hadoop hadoop 13366 Sep 1 20:44 LICENSE.txt > -rw-rw-r-- 1 hadoop hadoop 101 Sep 1 20:44 NOTICE.tx

Re: "dfsadmin -report" says that i need "Superuser privilege". What? :)

2009-12-01 Thread Siddu
On Tue, Dec 1, 2009 at 7:17 PM, pavel kolodin wrote: > > Anyways, as per HADOOP-2659, dfsadmin are only meant for admin. Thanks! >> > > Who is "admin" for hadoop? (-; I have 2 users in OS: "hadoop" + "root" and > i am running all commands as "hadoop" (-; > Can you tell us the "ENTIRE command" yo
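For context on the question above: the HDFS "superuser" is not the OS root account, it is whichever OS account started the NameNode. A minimal sketch (the `hadoop` account name is an assumption taken from the thread, and the dfsadmin command is echoed rather than run because it needs a live cluster):

```shell
# The account that ran start-dfs.sh / started the NameNode is the HDFS
# superuser, regardless of what "root" can do in the OS.
namenode_user="hadoop"   # assumption: the user who started the NameNode
echo "namenode started by: $namenode_user"
echo "you are: $(whoami)"
echo "run as $namenode_user: bin/hadoop dfsadmin -report"
```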

Re: Please help me to understand this error messages (-;

2009-11-30 Thread Siddu
On Tue, Dec 1, 2009 at 2:55 AM, pavel kolodin wrote: > > Namenode won't start with these messages: > > hadoop-0.20.1/logs/hadoop-hadoop-namenode-hadoop_master.log: > > http://pastebin.com/m359b9e24 > Look carefully at the lines below; they could be of some help: 2009-11-30 16:57:36,102 ERROR org.ap

Re: call trace help

2009-11-29 Thread Siddu
y of California, Santa Cruz > > > On Sat, Nov 28, 2009 at 11:20 PM, Siddu wrote: > > > Hi all, > > > > I am interested to see the each and every call trace if i issue a command > > > > for ex : $bin/hadoop dfs -copyFromLocal /tmp/file.txt > /user/hadoop

call trace help

2009-11-28 Thread Siddu
Hi all, I am interested in seeing each and every call trace when I issue a command, for example: $bin/hadoop dfs -copyFromLocal /tmp/file.txt /user/hadoop/file.txt, or while running an M/R job. Is there any command to sprinkle logs at the beginning of each and every function and build the source
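Hadoop does not ship a per-function tracing switch, but one common approximation (an assumption about what would help here, not a method from the thread) is to raise the client-side log level to DEBUG via log4j, which makes a command like `dfs -copyFromLocal` log far more detail about each RPC and internal step:

```shell
# Lines one might append to conf/log4j.properties; the logger names are the
# standard package loggers, and DEBUG is the knob being demonstrated:
log4j_lines='log4j.logger.org.apache.hadoop=DEBUG
log4j.logger.org.apache.hadoop.hdfs=DEBUG'
printf '%s\n' "$log4j_lines"
```

For true call-level traces one would instead attach a profiler or add logging to the source and rebuild, which is closer to what the poster asks.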

Re: hadoop-0.20.1\src\hdfs\org\apache\hadoop\hdfs\server

2009-11-28 Thread Siddu
On Sun, Nov 29, 2009 at 5:41 AM, dzisaacs wrote: > > why is this folder called "server" > The NameNode and DataNode components basically act as the server; hence the name, is my guess! > > what is this folder for? > -- > View this message in context: > http://old.nabble.com/hadoop-0.20.1%5C

Re: Building Hadoop from Source ?

2009-11-28 Thread Siddu
On Tue, Nov 24, 2009 at 6:10 PM, Siddu wrote: > > > On Tue, Nov 24, 2009 at 6:01 PM, Steve Loughran wrote: > >> Siddu wrote: >> >>> On Thu, Nov 12, 2009 at 11:50 PM, Stephen Watt wrote: >>> >>> Hi Sid >>>> >>>> Check o

Re: log files on the cluster?

2009-11-26 Thread Siddu
On Fri, Nov 27, 2009 at 6:28 AM, Mark Kerzner wrote: > Hi, > > it is probably described somewhere in the manuals, but > > > 1. Where are the log files, especially those that show my > System.out.println() and errors; and > Look at the logs directory ... > 2. Do I need to log in to every ma

Re: Processing 10MB files in Hadoop

2009-11-26 Thread Siddu
On Thu, Nov 26, 2009 at 5:32 PM, Cubic wrote: > Hi list. > > I have small files containing data that has to be processed. A file > can be small, even down to 10MB (but it can also be 100-600MB large), > and contains at least 3 records to be processed. > Processing one record can take 30 second

Re: to get hadoop working around with multiple users on the same instance

2009-11-24 Thread Siddu
behavior is unchanged. >Switching from one parameter value to the other does not change the > mode, >owner or group of files or directories. > > > > On Fri, Nov 20, 2009 at 1:46 AM, Jeff Zhang wrote: > > > On Fri, Nov 20, 2009 at 4:39 PM, Siddu wrote: >

Re: Building Hadoop from Source ?

2009-11-24 Thread Siddu
On Tue, Nov 24, 2009 at 6:01 PM, Steve Loughran wrote: > Siddu wrote: > >> On Thu, Nov 12, 2009 at 11:50 PM, Stephen Watt wrote: >> >> Hi Sid >>> >>> Check out the "Building" section in this link - >>> http://wiki.apache.org/hado

Re: Building Hadoop from Source ?

2009-11-24 Thread Siddu
resolve] :: resolving dependencies :: org.apache.hadoop#Hadoop;work...@sushanth-laptop [ivy:resolve] confs: [common] Please help me out ! > Kind regards > Steve Watt > > > > From: > Siddu > To: > common-user@hadoop.apache.org > Date: > 11/12/2009 12:14 PM > Subject

Re: to get hadoop working around with multiple users on the same instance

2009-11-20 Thread Siddu
. It just one machine am experimenting on (pseudo distributed file system) then you can run dfs shell command in the client's machine > > > Jeff Zhang > > > > On Fri, Nov 20, 2009 at 3:38 PM, Siddu wrote: > > > Hello all, > > > > I am not sure if the qu

to get hadoop working around with multiple users on the same instance

2009-11-19 Thread Siddu
Hello all, I am not sure if the question is framed right! Let's say user1 launches an instance of hadoop on a *single node*, and hence he has permission to create and delete files on hdfs or launch M/R jobs. Now what should I do if user2 wants to use the same instance of hadoop which is launched by
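A minimal sketch of the usual arrangement, assuming user1 started the pseudo-distributed cluster and is therefore the HDFS superuser. The `user2` name and paths are illustrative, and the `hadoop fs` commands are echoed rather than executed because they need a live cluster:

```shell
u2="user2"          # hypothetical second OS account
home="/user/$u2"
# As user1 (the superuser), create user2's HDFS home directory and hand
# ownership over, so user2 is confined by normal HDFS permissions:
echo "bin/hadoop fs -mkdir $home"
echo "bin/hadoop fs -chown $u2 $home"
# user2 then points fs.default.name at the same NameNode in his client
# config and runs bin/hadoop fs / bin/hadoop jar as usual.
```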

Re: Hadoop Cluster Error

2009-11-15 Thread Siddu
On Fri, Nov 13, 2009 at 6:05 PM, radar.sxl wrote: > > When I run a hadoop project with Eclipse-plugin or Cygwin, it's OK. But > when > I run in Hadoop cluster, it's error. So strange questions, I do not kown > why! > > OK: > 09/11/13 15:03:59 INFO mapred.JobClient: Running job: job_200911131503_0

Re: common reasons a map task would fail on a distributed cluster but not locally?

2009-11-15 Thread Siddu
On Sun, Nov 15, 2009 at 1:03 AM, Mike Kendall wrote: > for some reason i never tried lowering my number of map and reduce tasks > until now. looks like i need to reconfigure my cluster since it runs fine > with only 3 map tasks and 3 reduce tasks. > > :X > > On Sat, Nov 14, 2009 at 11:22 AM, Mik
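One plausible reading of "runs fine with only 3 map tasks and 3 reduce tasks" is lowering the per-node task slots, which in the 0.20-era configuration is a mapred-site.xml change; a sketch of the relevant properties (names from that era, value 3 mirroring the thread):

```shell
# Fragment one might place in conf/mapred-site.xml; held in a variable
# here so the sketch is self-contained:
slots_xml='<property>
  <name>mapred.tasktracker.map.tasks.maximum</name>
  <value>3</value>
</property>
<property>
  <name>mapred.tasktracker.reduce.tasks.maximum</name>
  <value>3</value>
</property>'
printf '%s\n' "$slots_xml"
```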

Building Hadoop from Source ?

2009-11-12 Thread Siddu
Hi all, I want to build hadoop from source rather than downloading the already-built tarball. Can someone please give me the steps or a link to any pointers. Thanks in advance -- Regards, ~Sid~ I have never met a man so ignorant that I couldn't learn something from him
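A hedged outline of the 0.20-era build (later versions moved to Maven; the authoritative targets live in build.xml at the source root):

```shell
# The Ant steps are shown as comments because they need a JDK/Ant toolchain
# and network access for Ivy to pull dependencies:
#
#   cd hadoop-0.20.1   # an unpacked source release or SVN checkout
#   ant                # default target compiles the core classes
#   ant jar            # builds the core jar under build/
#
prereqs="JDK 1.6, Apache Ant, network access for Ivy dependency downloads"
echo "prerequisites: $prereqs"
```

The Ivy "resolving dependencies" output quoted later in this thread is the expected first phase of such a build.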

stdout logs ?

2009-11-10 Thread Siddu
Hi all, in src/contrib/data_join/src/java/org/apache/hadoop/contrib/utils/join/DataJoinJob.java I found a couple of println statements (shown below) which get executed when a job is submitted. I am not sure which stdout they are printing to. I searched in logs/* but didn't find it.

Re: how to output to stdout

2009-11-08 Thread Siddu
2009/11/8 Gang Luo : > Hi everyone, > To check whether my hadoop program runs as I expected, I added some "println" > statements in my program. But it seems they don't work. Can somebody suggest > how to output something to stdout? Thanks. > Look for a folder called logs/userlogs//attempt_20
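The reply's path can be made concrete; a sketch of the 0.20-era layout, where each task attempt gets its own directory under userlogs (the attempt id below is hypothetical, echoing the job id format seen elsewhere in this digest):

```shell
# Where a task's println output lands, relative to the Hadoop install dir:
HADOOP_LOG_DIR=${HADOOP_LOG_DIR:-logs}          # default when unset
attempt="attempt_200911131503_0001_m_000000_0"  # hypothetical task attempt
echo "logs/userlogs/$attempt/stdout"            # System.out.println output
echo "logs/userlogs/$attempt/stderr"            # System.err and stack traces
```

Note these files live on the node that ran the task, which is why searching only the submitting machine's logs/* can come up empty.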

Re: Project ideas !

2009-10-17 Thread Siddu
in hadoop > 7) Document classification > 8) Document Ranking > > Infact all batch applications that can be parallelised are suitable for hadoop. > G Sudha Sadasivam > > > > --- On Wed, 10/14/09, Siddu wrote: > > > From: Siddu > Subject: Project ideas ! > T

Project ideas !

2009-10-14 Thread Siddu
Hello Hadoop Users, a friend and I are looking for project ideas based on hadoop as part of our curriculum. Can you give us some pointers please? Thanks in advance! Regards, ~Sid~