Re: Eclipse plugin

2009-03-02 Thread Arijit Mukherjee
Hi All, I'm having some trouble using the Eclipse plugin on a Windows XP machine to connect to HDFS (Hadoop 0.19.0) on a Linux server - I'm getting an "error: null" message, although the port number etc. are correct. Can this be related to the user information? I've set it to the hadoop user o
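A quick sanity check (the install path below is a guess, not something from the thread): the host/port pairs entered in the plugin's Hadoop location dialog should match fs.default.name and mapred.job.tracker in the cluster's conf/hadoop-site.xml on the Linux server, and the user question usually corresponds to the hadoop.job.ugi entry on the plugin's Advanced tab.

    # Run on the Linux server; /opt/hadoop is only a hypothetical install path.
    # The DFS master and Map/Reduce master host:port fields in the Eclipse
    # plugin should match the values printed for these two properties.
    grep -A 1 -E 'fs.default.name|mapred.job.tracker' /opt/hadoop/conf/hadoop-site.xml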

Re: Could not reserve enough space for heap in JVM

2009-02-25 Thread Arijit Mukherjee
Thanx guys - I have a clearer picture now :-) Cheers Arijit 2009/2/26 souravm > On a 32-bit machine you are limited to a 4 GB heap per JVM > > - Original Message - > From: Arijit Mukherjee > To: core-user@hadoop.apache.org > Sent: Wed Feb 25 21:2
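In practice a 32-bit JVM rarely manages to reserve anywhere near 4 GB of contiguous heap; the usable limit is more like 1.5-3 GB depending on the OS. A quick way to probe the actual limit on a given box - the sizes below are examples only, not recommendations - is:

    # Try progressively larger heaps; the first size that fails reproduces
    # the "Could not reserve enough space" error from this thread.
    for m in 1024 2048 3072 4096; do
        printf 'trying -Xmx%sm: ' "$m"
        java -Xmx${m}m -version > /dev/null 2>&1 && echo ok || echo failed
    done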

Re: Could not reserve enough space for heap in JVM

2009-02-25 Thread Arijit Mukherjee
APSIZE to 4GB, it threw the > exception referred to in this thread. How can I make full use of my memory? Thanks. > > 2009/2/26 Arijit Mukherjee > > > I was getting similar errors too while running the MapReduce samples. I > > fiddled with hadoop-env.sh (where the HEAPSIZE is

Re: Could not reserve enough space for heap in JVM

2009-02-25 Thread Arijit Mukherjee
. My machine's > memory size is 16 GB, but when I set HADOOP_HEAPSIZE to 4GB, it threw the > exception referred to in this thread. How can I make full use of my memory? Thanks. > > 2009/2/26 Arijit Mukherjee > > > I was getting similar errors too while running the MapReduce samples

Re: Could not reserve enough space for heap in JVM

2009-02-25 Thread Arijit Mukherjee
I was getting similar errors too while running the MapReduce samples. I fiddled with hadoop-env.sh (where the HEAPSIZE is specified) and the hadoop-site.xml files - and rectified it after some trial and error. But I would like to know if there is a rule of thumb for this. Right now, I've a core du
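One detail that trips people up: HADOOP_HEAPSIZE in conf/hadoop-env.sh is interpreted as megabytes (the default is 1000), so "4GB" would be written as 4096 - and on a 32-bit JVM that value alone can trigger the error in this thread. A minimal, illustrative hadoop-env.sh sketch (paths and sizes are assumptions, not a rule of thumb):

    # conf/hadoop-env.sh -- illustrative values only
    export JAVA_HOME=/usr/lib/jvm/java-6-sun   # hypothetical JDK path
    # Heap for the Hadoop daemons, in MB; ~2 GB is usually safe on a 32-bit
    # JVM, whereas 4096 often is not.
    export HADOOP_HEAPSIZE=2000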

Re: Cannot execute the start-mapred script

2009-02-17 Thread Arijit Mukherjee
ClassInternal(ClassLoader.java:320) > blueberry: Could not find the main class: > Could_not_reserve_enough_space_for_the_card_marking_array. Program will > exit. What might have been happening here? Regards Arijit 2009/2/16 Arijit Mukherjee > Hi All > > I'm trying to create a tiny 2-node cluster (both on linux FC7

Cannot execute the start-mapred script

2009-02-16 Thread Arijit Mukherjee
Hi All, I'm trying to create a tiny 2-node cluster (both on Linux FC7) with Hadoop 0.19.0 - previously, I was able to install and run Hadoop on a single node. Now I'm trying it on 2 nodes - my idea was to put the name node and the job tracker on separate nodes, and initially use these two as the da
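For that layout the moving parts are small; a hedged sketch with made-up hostnames (node1 = NameNode, node2 = JobTracker, both also acting as DataNode/TaskTracker):

    # On both machines, conf/slaves lists the worker nodes, one per line.
    printf 'node1\nnode2\n' > conf/slaves
    # In conf/hadoop-site.xml, fs.default.name points at node1 and
    # mapred.job.tracker points at node2 (same files on both machines).
    # Then start HDFS from node1 and MapReduce from node2:
    bin/start-dfs.sh      # run on node1
    bin/start-mapred.sh   # run on node2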

RE: Hadoop hardware specs

2008-11-04 Thread Arijit Mukherjee
One correction - the number 5 in the mail below is my estimation of the number of nodes we might need. Can this be too small a cluster? Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091

Hadoop hardware specs

2008-11-04 Thread Arijit Mukherjee
ks on each node be sufficient? Why would you need external storage in a Hadoop cluster? How can I find out what other Hadoop projects are using? Cheers Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt La

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
That's a very good overview, Paco - thanx for that. I might get back to you with more queries about cascade etc. at some point - I hope you won't mind. Regards Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sect

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
Thanx again Enis. I'll have a look at Pig and Hive. Regards Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091, India Phone: +91 (0)33 23577531/32 x 107 http://www.connectivasystem

RE: Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
Sent: Wednesday, September 24, 2008 2:57 PM To: core-user@hadoop.apache.org Subject: Re: Questions about Hadoop Hi, Arijit Mukherjee wrote: > Hi > > We've been thinking of using Hadoop for a decision making system which > will analyze telecom-related data from various s

Questions about Hadoop

2008-09-24 Thread Arijit Mukherjee
o create workflow-like functionality with MapReduce? Regards Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091, India Phone: +91 (0)33 23577531/32 x 107 http://www.connectivasystems.com
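One simple way to get workflow-like behaviour, before reaching for Cascading, Pig, or Hive, is to chain jobs so that one job's output directory becomes the next one's input. A hedged shell sketch with made-up jar, class, and HDFS path names:

    # Hypothetical two-stage pipeline; stage 2 only runs if stage 1 succeeds.
    bin/hadoop jar analytics.jar com.example.ExtractJob   /data/raw    /data/stage1 && \
    bin/hadoop jar analytics.jar com.example.AggregateJob /data/stage1 /data/stage2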

RE: NameNode formatting issues in 1.16.4 and higher

2008-08-26 Thread Arijit Mukherjee
Hi Most likely, it's due to login permissions. Have you set up SSH access to the nodes? This page might be helpful - http://tinyurl.com/6lz6o3 - it contains a detailed explanation of the steps you should follow. Hope this helps. Cheers Arijit Dr. Arijit Mukherjee Principal Member of Tech
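For reference, passwordless SSH for the hadoop user typically boils down to something like the following (hostnames are placeholders; run as the hadoop user on the master):

    # Generate a passphrase-less key and authorize it, so the start/stop
    # scripts can ssh to every node without prompting.
    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys   # allows ssh to localhost
    ssh-copy-id hadoop@node2                          # repeat for each slave
    ssh node2 true                                    # should succeed without a password prompt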

RE: Hadoop eclipse plugin

2008-08-25 Thread Arijit Mukherjee
ipse. Something to do with the parameters on the advanced tab? Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091, India Phone: +91 (0)33 23577531/32 x 107 http://www.connectivasystems.com

RE: Hadoop eclipse plugin

2008-08-25 Thread Arijit Mukherjee
e 552 spam score (5.6) exceeded threshold error every time I replied to this message. Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091, India Phone: +91 (0)33 23577531/32 x 107 http://www.connectivas

Hadoop eclipse plugin

2008-08-25 Thread Arijit Mukherjee
raised these questions to the plugin forum, but thought someone here may be able to help as well. Regards Arijit Dr. Arijit Mukherjee Principal Member of Technical Staff, Level-II Connectiva Systems (I) Pvt. Ltd. J-2, Block GP, Sector V, Salt Lake Kolkata 700 091, India Phone: +91 (0)33 23577531/32 x 107 http://www.connectivasystems.com

Hadoop configuration problem

2008-08-19 Thread Arijit Mukherjee
p localhost: Could not create the Java virtual machine. Does the TaskTracker need more memory? The problem is that if I increase the heap size in HADOOP_OPTS, all of the other Hadoop processes start throwing the same error. Can anyone point me in the right direction, please? Thanx in advance Arijit Dr
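Because HADOOP_OPTS is passed to every daemon's JVM, putting -Xmx there changes all of them at once. The per-daemon variables in conf/hadoop-env.sh avoid that; a hedged sketch that raises only the TaskTracker's heap (the size is illustrative):

    # conf/hadoop-env.sh -- raise only the TaskTracker heap instead of the
    # global HADOOP_OPTS; the other daemons keep their defaults.
    export HADOOP_TASKTRACKER_OPTS="-Xmx1024m $HADOOP_TASKTRACKER_OPTS"

(The map/reduce child tasks themselves are sized separately, via mapred.child.java.opts in hadoop-site.xml.)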