Re: Simple MapReduce example failed

2008-10-28 Thread chaitanya krishna
Hi, I faced a similar problem sometime back. I think it's the network/communication latency between the master and slaves that is the issue in your case. Try increasing the timeout interval in hadoop-site.xml. V.V.Chaitanya Krishna IIIT,Hyderabad India On Thu, Oct 16, 2008 at 4:53 AM, Lucas Di Penti
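For context, the task timeout in 0.1x-era Hadoop was controlled by the mapred.task.timeout property, in milliseconds. A sketch of a hadoop-site.xml override: which timeout the reply means is my assumption, and the value here is illustrative (the default was 600000, i.e. 10 minutes).

```xml
<property>
  <name>mapred.task.timeout</name>
  <!-- 1200000 ms = 20 minutes before an unresponsive task is killed -->
  <value>1200000</value>
</property>
```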

Re: Expected disk space use for files in mapred/local?

2008-10-27 Thread chaitanya krishna
Besides this, we're also getting this error: "java.lang.OutOfMemoryError: GC overhead limit exceeded" For the above error, try increasing the heap size. It worked for me when I came across the same error in version 0.17.0. "We're only running a total of 20 reducers which I suspect is very low,
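On 0.17-era Hadoop, the heap available to map and reduce child JVMs was set through mapred.child.java.opts in hadoop-site.xml (the shipped default was -Xmx200m). A sketch of the override the reply suggests; the 1 GB value is illustrative and should be sized to the task's actual working set:

```xml
<property>
  <name>mapred.child.java.opts</name>
  <!-- JVM options passed to each spawned map/reduce task -->
  <value>-Xmx1024m</value>
</property>
```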

Re: HELP: Namenode Startup Failed with an OutofMemoryError

2008-10-27 Thread chaitanya krishna
hi, Thank you for the information, Steve. :) I never came across this and it is very new to me :) V.V.Chaitanya Krishna IIIT,Hyderabad India On Mon, Oct 27, 2008 at 4:10 PM, Steve Loughran <[EMAIL PROTECTED]> wrote: > chaitanya krishna wrote: > >> Hi, >> >> If the proble

Re: Help: How to change number of mappers in Hadoop streaming?

2008-10-26 Thread chaitanya krishna
I forgot to mention that although the number of map tasks is set in the code as I mentioned before, the actual number of map tasks is not necessarily exactly that number, but it is very close to it. V.V.Chaitanya Krishna IIIT,Hyderabad India On Sun, Oct 26, 2008 at 4:29 PM, chaitanya krishna

Re: Help: How to change number of mappers in Hadoop streaming?

2008-10-26 Thread chaitanya krishna
Hi, In order to have a different number of map tasks for each of the jobs, in the run method of the code, I had the following syntax: conf.setNumMapTasks(num); // for number of map tasks conf.setNumReduceTasks(num); // for number of reduce tasks conf is the JobConf object and num is the number
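Since this thread is about Hadoop streaming, the same hint can also be passed on the streaming command line instead of in a Java driver. A sketch assuming the 0.1x-era -jobconf flag; the input/output paths and the mapper/reducer script names are placeholders, and the map count remains only a hint to the framework:

```
bin/hadoop jar contrib/streaming/hadoop-streaming.jar \
    -input /user/data/in -output /user/data/out \
    -mapper mymap.py -reducer myreduce.py \
    -jobconf mapred.map.tasks=20 \
    -jobconf mapred.reduce.tasks=5
```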

Re: HELP: Namenode Startup Failed with an OutofMemoryError

2008-10-25 Thread chaitanya krishna
Hi, If the problem is due to the OS-level limit on the number of active threads, then why is the error showing an OutOfMemory exception? Is it an issue of the heap size available for Hadoop? Won't increasing the heap size fix this problem? Thanks V.V.Chaitanya Krishna On Fri, Oct 24, 2008 at 2:42 PM, S

Re: Two questions about hadoop

2008-07-16 Thread chaitanya krishna
Hi, Try setting the number of map tasks in the program itself. For example, in the WordCount example, you can set the number of map tasks in the run method via conf.setNumMapTasks. I hope this answers your first query. Regards, V.V.Chaitanya Krishna IIIT,Hyderabad On Wed, Jul 16, 2008 at 1:47 AM, Wei Ji
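The suggestion above, sketched as a fragment of a 0.1x-era driver's run method. The counts are illustrative; note that the old API treats the map count as a hint (the real number follows the input splits), while the reduce count is honored exactly:

```
JobConf conf = new JobConf(WordCount.class);
conf.setNumMapTasks(20);    // a hint: actual count depends on input splits
conf.setNumReduceTasks(5);  // honored exactly
```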

Re: Reg: Problem in Build Versions of Hadoop-0.17.0

2008-07-16 Thread chaitanya krishna
Thanks for the reply. It worked! :) On Wed, Jul 16, 2008 at 11:45 AM, Shengkai Zhu <[EMAIL PROTECTED]> wrote: > Replace the hadoop-*-core.jar in datanodes with your jar compiled under > "jobs" > > > On 7/16/08, chaitanya krishna <[EMAIL PROTECTED]> wrote:

Reg: Problem in Build Versions of Hadoop-0.17.0

2008-07-15 Thread chaitanya krishna
Hi, I'm using hadoop-0.17.0 and recently, when I stopped and restarted DFS, the datanodes are created but soon they are no longer present. The namenode logs show the following error: / SHUTDOWN_MSG: Shutting down NameNode at 172.16.4

Re: Compiling Word Count in C++ : Hadoop Pipes

2008-07-11 Thread chaitanya krishna
> due to permission issues. A chmod 755 will fix this. You'll need to do this > with any "permission denied" message that you get associated with this. > > Hope this helps! > > -SM > On Thu, Jul 10, 2008 at 10:03 PM, chaitanya krishna < > [EMAIL PROTECTED]&
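The chmod fix above, demonstrated on a placeholder file standing in for the compiled Pipes binary (the name 'wordcount' is illustrative):

```shell
# Create a stand-in for the compiled Pipes binary and make it executable.
touch wordcount
chmod 755 wordcount   # rwxr-xr-x: owner may write, everyone may read/execute
ls -l wordcount
```

Any file that triggers a "permission denied" from Hadoop can be treated the same way.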

Re: Compiling Word Count in C++ : Hadoop Pipes

2008-07-10 Thread chaitanya krishna
> Make sure you are using the latest version of hadoop. That actually fixed > > it for me. There was something wrong with the build.xml file in earlier > > versions that prevented me from being able to get it to work properly. > Once > > I upgraded to the latest, it w

Re: Compiling Word Count in C++ : Hadoop Pipes

2008-07-10 Thread chaitanya krishna
Hi, I faced a similar problem to Sandy's. But this time I even had the JDK set properly. When I executed: ant -Dcompile.c++=yes examples the following was displayed: Buildfile: build.xml clover.setup: clover.info: [echo] [echo] Clover not found. Code coverage reports disabled

what are the issues to be taken care of when the ip(s) of nodes are changed

2008-05-20 Thread chaitanya krishna
Hi, I had a cluster of nodes with a specific set of IPs assigned to them, and it was working fine. But after the IPs were changed, no datanodes are being started, although the tasktrackers start fine. When I tried to manually start a datanode at a specific node using " bin/hadoop datano

How to use font files in hadoop

2008-05-12 Thread chaitanya krishna
Hi, I have a font file with a .ttf extension that works well in Java using the following code: public void getFont(String fontfile, String text) { Font font; try { FileInputStream fis = new FileInputStream(fontfile); font = Font.cre

newbie how to get url paths of files in HDFS

2008-05-08 Thread chaitanya krishna
Hi, I want to get the "URL" paths of files that are stored in DFS. Is there any way to get it? Thank you
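One 0.1x-era way to get a fully qualified URL for a DFS path is to qualify it against the FileSystem. A sketch, assuming a configured 'conf' object; the path and the namenode address in the comment are illustrative:

```
FileSystem fs = FileSystem.get(conf);
Path p = new Path("/user/data/part-00000");
// makeQualified adds the filesystem's scheme and authority, yielding
// something like hdfs://namenode:9000/user/data/part-00000
Path qualified = p.makeQualified(fs);
URI uri = qualified.toUri();
```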

ClassNotFoundException while running jar file

2008-05-01 Thread chaitanya krishna
Hi, I wanted to run my own Java code in Hadoop. The following are the commands that I executed, and the errors that occurred. mkdir temp javac -Xlint -classpath hadoop-0.16.0-core.jar -d temp GetFeatures.java (GetFeatures.java is the code) jar -cvf temp.jar temp bin/hadoop jar
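A likely cause of a ClassNotFoundException with the commands quoted above is that `jar -cvf temp.jar temp` places the 'temp' directory itself at the jar root, so the class ends up under temp/ instead of at its package path. A sketch of the corrected packaging, reusing the file names from the post; the trailing arguments to bin/hadoop are placeholders:

```
mkdir temp
javac -Xlint -classpath hadoop-0.16.0-core.jar -d temp GetFeatures.java
# -C changes into the class directory before adding files, so the package
# directories (not 'temp' itself) sit at the jar root:
jar -cvf temp.jar -C temp .
bin/hadoop jar temp.jar GetFeatures <input> <output>
```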

How to append files in hadoop dfs?

2008-04-30 Thread chaitanya krishna
Hi, In one of my projects, which requires Hadoop, I need to constantly append certain data to files. Is there any way to do it?

Reg: How to pass the output path argument to the mapper

2008-04-29 Thread chaitanya krishna
Hi, Is there any way of finding out the output path argument (that is given as a command-line argument) in the mapper class?