Re: Eclipse plugin

2009-03-02 Thread Arijit Mukherjee
Hi All I'm having some trouble using the eclipse plugin on a Windows XP machine to connect to the HDFS (hadoop 0.19.0) on a linux server - I'm getting an error: null message, although the port number etc. are correct. Can this be related to the user information? I've set it to the hadoop user

Re: Could not reserve enough space for heap in JVM

2009-02-25 Thread Arijit Mukherjee
I was getting similar errors too while running the mapreduce samples. I fiddled with hadoop-env.sh (where the HEAPSIZE is specified) and hadoop-site.xml - and rectified it after some trial and error. But I would like to know if there is a rule of thumb for this. Right now, I've a core
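For reference, the heap knobs being fiddled with live in conf/hadoop-env.sh. A minimal sketch of the 0.19-era settings is below; the values are illustrative assumptions, not recommendations:

```shell
# conf/hadoop-env.sh (Hadoop 0.19 era) - illustrative values only.
# Maximum heap, in MB, for the daemons started by the bin/ scripts:
export HADOOP_HEAPSIZE=1000

# Note: map/reduce child JVMs are sized separately, via the
# mapred.child.java.opts property in hadoop-site.xml (default -Xmx200m).
```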

Re: Cannot execute the start-mapred script

2009-02-17 Thread Arijit Mukherjee
will exit. What might have been happening here? Regards Arijit

Cannot execute the start-mapred script

2009-02-16 Thread Arijit Mukherjee
Hi All I'm trying to create a tiny 2-node cluster (both on linux FC7) with Hadoop 0.19.0 - previously, I was able to install and run hadoop on a single node. Now I'm trying it on 2 nodes - my idea was to put the name node and the job tracker on separate nodes, and initially use these two as the
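A minimal sketch of the split described above, using the 0.19-era property names. The host names (nn-host, jt-host) and ports are assumptions; run from the hadoop conf/ directory:

```shell
# Sketch: NameNode on "nn-host", JobTracker on "jt-host" (hypothetical hosts).
cat > hadoop-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://nn-host:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jt-host:9001</value>
  </property>
</configuration>
EOF

# The slaves file lists the hosts that run DataNode/TaskTracker daemons;
# here both nodes double as workers:
printf 'nn-host\njt-host\n' > slaves
```

The same hadoop-site.xml goes on both machines; start-dfs.sh is then run from nn-host and start-mapred.sh from jt-host.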

Hadoop hardware specs

2008-11-04 Thread Arijit Mukherjee
node be sufficient? Why would you need an external storage in a hadoop cluster? How can I find out what other projects on hadoop are using? Cheers Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata

RE: Hadoop hardware specs

2008-11-04 Thread Arijit Mukherjee
One correction - the number 5 in the mail below is my estimation of the number of nodes we might need. Can this be too small a cluster? Arijit

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
Arijit Mukherjee wrote: Hi We've been thinking of using Hadoop for a decision-making system which will analyze telecom-related data from various sources to take certain decisions

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
Thanks again Enis. I'll have a look at Pig and Hive. Regards Arijit

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
That's a very good overview, Paco - thanks for that. I might get back to you with more queries about cascade etc. at some point - hope you don't mind. Regards Arijit

RE: NameNode formatting issues in 1.16.4 and higher

2008-08-26 Thread Arijit Mukherjee
Hi Most likely, it's due to login permissions. Have you set up ssh access to the nodes? This page might be helpful - http://tinyurl.com/6lz6o3 - it contains a detailed explanation of the steps you should follow. Hope this helps Cheers Arijit
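The ssh setup in question can be sketched roughly as follows. The key file name is arbitrary (chosen here to avoid clobbering an existing key), and the authorized_keys step has to be repeated for the same user on every node:

```shell
# Sketch: passwordless ssh for the user that runs the hadoop start scripts.
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Generate an RSA key pair with an empty passphrase (hypothetical file name):
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa_hadoop

# Authorize the public key locally; copy it into ~/.ssh/authorized_keys
# on each slave the same way (e.g. via scp).
cat ~/.ssh/id_rsa_hadoop.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

After this, `ssh <slave-host>` from the master should log in without a password prompt, which is what the start-dfs/start-mapred scripts rely on.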

Hadoop eclipse plugin

2008-08-25 Thread Arijit Mukherjee
, but thought someone here may be able to help as well. Regards Arijit

Hadoop configuration problem

2008-08-19 Thread Arijit Mukherjee
not create the Java virtual machine. Does the TaskTracker need more memory? The problem is that if I increase the heap size in HADOOP_OPTS, all of the other hadoop processes start throwing the same error. Can anyone point me in the right direction please? Thanx in advance Arijit
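Since raising HADOOP_OPTS applies to every daemon at once, one way around the problem described above is the per-daemon variables in hadoop-env.sh, which scope the extra heap to a single process. A sketch, with an assumed -Xmx value:

```shell
# conf/hadoop-env.sh: give only the TaskTracker a larger heap
# (the 512m figure is illustrative, not a recommendation).
export HADOOP_TASKTRACKER_OPTS="-Xmx512m"

# The other daemons keep their defaults; they have their own hooks, e.g.:
#   export HADOOP_NAMENODE_OPTS=...
#   export HADOOP_DATANODE_OPTS=...
#   export HADOOP_JOBTRACKER_OPTS=...
```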