Hi All
I'm having some trouble using the Eclipse plugin on a Windows XP machine to connect to HDFS (Hadoop 0.19.0) on a Linux server - I'm getting an "error: null" message, although the port number etc. are correct. Can this be related to the user information? I've set it to the hadoop user.
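A quick sanity check before blaming the plugin or the user setting is to confirm the NameNode RPC port is actually reachable from the client machine. The hostname and port below are placeholders - substitute the values from fs.default.name in your hadoop-site.xml:

```shell
# 'namenode-host' and port 9000 are placeholders - take them from fs.default.name.
# From the Windows XP (or any client) machine, check that the port is open:
telnet namenode-host 9000

# From a machine with Hadoop installed, try listing HDFS directly:
bin/hadoop fs -fs hdfs://namenode-host:9000/ -ls /
```

If telnet cannot connect, the problem is network or firewall rather than the plugin configuration.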
I was getting similar errors too while running the MapReduce samples. I fiddled with hadoop-env.sh (where HADOOP_HEAPSIZE is specified) and the hadoop-site.xml files, and rectified it after some trial and error. But I would like to know if there is a rule of thumb for this. Right now, I've a core [...] will exit.
What might have been happening here?
Regards
Arijit
2009/2/16 Arijit Mukherjee ariji...@gmail.com
Hi All
I'm trying to create a tiny 2-node cluster (both on Linux FC7) with Hadoop 0.19.0 - previously, I was able to install and run Hadoop on a single node. Now I'm trying it on 2 nodes - my idea was to put the NameNode and the JobTracker on separate nodes, and initially use these two as the [...] node [...] be sufficient? Why would you need external storage in a Hadoop cluster? How can I find out what other projects on Hadoop are using?
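For reference, a minimal two-node layout in the 0.19 era can be sketched as follows - the hostnames node1/node2 and the ports are made up for illustration, and the same conf/ directory would need to be copied to both machines:

```shell
# Sketch of a minimal 2-node Hadoop 0.19 setup; node1/node2 are hypothetical hosts.
mkdir -p conf

# conf/masters and conf/slaves tell start-all.sh where to launch daemons:
echo "node1" > conf/masters        # SecondaryNameNode runs here
echo "node2" > conf/slaves         # DataNode and TaskTracker run here

# conf/hadoop-site.xml points both HDFS and MapReduce at the master:
cat > conf/hadoop-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://node1:9000</value>   <!-- NameNode -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>node1:9001</value>          <!-- JobTracker; could sit on node2 instead -->
  </property>
</configuration>
EOF
```

With this in place, start-all.sh on node1 would launch the NameNode and JobTracker locally and the worker daemons on node2.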
Cheers
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt Lake
Kolkata
One correction - the number 5 in the mail below is my estimate of the number of nodes we might need. Could this be too small a cluster?
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt Lake
Kolkata 700 091
Sent: Wednesday, September 24, 2008 2:57 PM
To: core-user@hadoop.apache.org
Subject: Re: Questions about Hadoop
Hi,
Arijit Mukherjee wrote:
Hi
We've been thinking of using Hadoop for a decision-making system which will analyze telecom-related data from various sources to take certain decisions
Thanx again Enis. I'll have a look at Pig and Hive.
Regards
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt Lake
Kolkata 700 091, India
Phone: +91 (0)33 23577531/32 x 107
http://www.connectivasystems.com
That's a very good overview, Paco - thanx for that. I might get back to you with more queries about cascade etc. at some point - hope you wouldn't mind.
Regards
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt
Hi
Most likely, it's due to login permissions. Have you set up ssh access to the nodes? This page might be helpful - http://tinyurl.com/6lz6o3 - it contains a detailed explanation of the steps you should follow.
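The usual passwordless-ssh setup the start scripts rely on looks roughly like this - 'node2' and the hadoop user are placeholders for whatever your cluster actually uses:

```shell
# Run as the hadoop user on the master; 'node2' is a placeholder worker hostname.
# Generate a passphrase-less keypair (skip if ~/.ssh/id_rsa already exists):
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Copy the public key to each worker so ssh works without a password:
ssh-copy-id hadoop@node2

# Verify - this should print the worker's hostname without prompting:
ssh hadoop@node2 hostname
```

Repeat the ssh-copy-id step for every node listed in conf/slaves, and also for localhost on the master itself.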
Hope this helps
Cheers
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical
, but thought someone here may be
able to help as well.
Regards
Arijit
Dr. Arijit Mukherjee
Principal Member of Technical Staff, Level-II
Connectiva Systems (I) Pvt. Ltd.
J-2, Block GP, Sector V, Salt Lake
Kolkata 700 091, India
Phone: +91 (0)33 23577531/32 x 107
http://www.connectivasystems.com
Could not create the Java virtual machine.
Does the TaskTracker need more memory? The problem is that if I increase the heap size in HADOOP_OPTS, all of the other Hadoop processes start throwing the same error.
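One way around this (a sketch against the stock 0.19 hadoop-env.sh; the values are illustrative, not recommendations) is to raise the heap only for the daemon that needs it, rather than pushing -Xmx into HADOOP_OPTS, which every process inherits:

```shell
# conf/hadoop-env.sh - illustrative values, not recommendations.
export HADOOP_HEAPSIZE=1000                 # default max heap in MB for all daemons

# Per-daemon variables are appended after HADOOP_OPTS by bin/hadoop,
# so their -Xmx wins for that daemon only:
export HADOOP_TASKTRACKER_OPTS="-Xmx512m"   # TaskTracker only
export HADOOP_NAMENODE_OPTS="-Xmx512m"      # NameNode only
```

Note that the task JVMs the TaskTracker spawns are sized separately, by mapred.child.java.opts in hadoop-site.xml, not by these variables.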
Can anyone point me in the right direction please?
Thanx in advance
Arijit
Dr. Arijit Mukherjee