Hi,
I faced a similar problem some time back. I think it's the network/communication
latency between the master and slaves that is the issue in your case. Try
increasing the timeout interval in hadoop-site.xml.
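For example, something like this in hadoop-site.xml (a sketch;
mapred.task.timeout is the 0.17/0.18-era property name with a default of
600000 ms, so please check hadoop-default.xml for your version):

<property>
  <name>mapred.task.timeout</name>
  <!-- milliseconds before a task that neither reads input, writes output,
       nor reports status is killed; raising it tolerates more latency -->
  <value>1200000</value>
</property>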
V.V.Chaitanya Krishna
IIIT,Hyderabad
India
On Thu, Oct 16, 2008 at 4:53 AM, Lucas Di Penti
Besides this, we're also getting this error: "java.lang.OutOfMemoryError: GC
overhead limit exceeded"
For the above error, try increasing the heap size. It worked for me when I
came across the same error in version 0.17.0.
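For example, the heap of each task's child JVM can be raised in
hadoop-site.xml (again a sketch; mapred.child.java.opts is the 0.17/0.18-era
property, with a default of -Xmx200m):

<property>
  <name>mapred.child.java.opts</name>
  <!-- JVM options passed to every map and reduce child task -->
  <value>-Xmx512m</value>
</property>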
"We're only running a total of 20 reducers which
I suspect is very low,
Hi,
Thank you for the information, Steve. :) I never came across this before; it
is very new to me. :)
V.V.Chaitanya Krishna
IIIT,Hyderabad
India
On Mon, Oct 27, 2008 at 4:10 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
> chaitanya krishna wrote:
>
>> Hi,
>>
>> If the proble
I forgot to mention that although the number of map tasks is set in the
code as I mentioned before, the actual number of map tasks is not
necessarily exactly that number, but it is very close to it.
V.V.Chaitanya Krishna
IIIT,Hyderabad
India
On Sun, Oct 26, 2008 at 4:29 PM, chaitanya krishna
Hi,
In order to have a different number of map tasks for each of the jobs, I had
the following in the run method of the code:
conf.setNumMapTasks(num); // for the number of map tasks
conf.setNumReduceTasks(num); // for the number of reduce tasks
conf is the JobConf object and num is the desired number of tasks.
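For reference, a minimal sketch of such a driver (old org.apache.hadoop.mapred
API of the 0.17 era; the class name MyJob, the paths, and the task counts are
placeholders, not from the original code):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MyJob {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(MyJob.class);
    conf.setJobName("myjob");
    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));
    conf.setNumMapTasks(40);    // a hint: the actual count may differ slightly
    conf.setNumReduceTasks(10); // taken literally by the framework
    JobClient.runJob(conf);     // uses the identity mapper/reducer by default
  }
}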
Hi,
If the problem is due to the OS-level limit on the number of active
threads, then why is the error showing an OutOfMemoryError? Is it an issue
of the heap size available for Hadoop? Won't increasing the heap size fix this
problem?
Thanks
V.V.Chaitanya Krishna
On Fri, Oct 24, 2008 at 2:42 PM, S
Hi,
Try setting the number of map tasks in the program itself. For example, in the
WordCount example, you can set the number of map tasks in the run method as
conf.setNumMapTasks
I hope this answers your first query.
Regards,
V.V.Chaitanya Krishna
IIIT,Hyderabad
On Wed, Jul 16, 2008 at 1:47 AM, Wei Ji
Thanks for the reply. It worked! :)
On Wed, Jul 16, 2008 at 11:45 AM, Shengkai Zhu <[EMAIL PROTECTED]> wrote:
> Replace the hadoop-*-core.jar on the datanodes with your jar compiled under
> "jobs"
>
>
> On 7/16/08, chaitanya krishna <[EMAIL PROTECTED]> wrote:
Hi,
I'm using hadoop-0.17.0, and recently, when I stopped and restarted DFS,
the datanodes are created but soon they are no longer present. The namenode
log shows the following error:
/
SHUTDOWN_MSG: Shutting down NameNode at 172.16.4
> due to permission issues. A chmod 755 will fix this. You'll need to do this
> with any "permission denied" message that you get associated with this.
>
> Hope this helps!
>
> -SM
> On Thu, Jul 10, 2008 at 10:03 PM, chaitanya krishna <[EMAIL PROTECTED]>
> > Make sure you are using the latest version of hadoop. That actually fixed
> > it for me. There was something wrong with the build.xml file in earlier
> > versions that prevented me from being able to get it to work properly.
> > Once I upgraded to the latest, it w
Hi,
I faced a similar problem to Sandy's, but this time I even had the JDK set
properly.
When I executed:
ant -Dcompile.c++=yes examples
the following was displayed:
Buildfile: build.xml
clover.setup:
clover.info:
[echo]
[echo] Clover not found. Code coverage reports disabled
Hi,
I had a cluster of nodes with a specific set of IPs assigned to them, and they
were working fine. But when the IPs were changed, no datanodes come up,
although the tasktrackers come up fine.
When I tried to manually start a datanode on a specific node using "bin/hadoop
datano
Hi,
I have a font file with a .ttf extension that works fine in Java using the
following code:
public void getFont(String fontfile, String text)
{
  Font font;
  try
  {
    FileInputStream fis = new FileInputStream(fontfile);
    font = Font.createFont(Font.TRUETYPE_FONT, fis); // load the TrueType font
    fis.close();
  }
  catch (Exception e)
  {
    e.printStackTrace();
  }
}
Hi,
I want to get the URL paths of files that are stored in DFS. Is there any way
to get them?
Thank you
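One way, as a minimal sketch with the FileSystem API (the class name DfsUrl
and the command-line argument are placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DfsUrl {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf); // the default FS from hadoop-site.xml
    Path p = new Path(args[0]);           // a path inside DFS
    // qualify the path with the filesystem's scheme and authority,
    // yielding a full URI such as hdfs://namenode:9000/user/foo/file
    System.out.println(p.makeQualified(fs).toUri());
  }
}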
Hi,
I wanted to run my own Java code in Hadoop. The following are the commands
that I executed and the errors that occurred.
mkdir temp
javac -Xlint -classpath hadoop-0.16.0-core.jar -d temp
GetFeatures.java (GetFeatures.java is the code)
jar -cvf temp.jar temp
bin/hadoop jar
Hi,
In one of my projects that requires Hadoop, I need to constantly append
certain data to files. Is there any way to do it?
Hi,
Is there any way of finding out the output path argument (that is given as a
command-line argument) in the mapper class?
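One way, as a minimal sketch (old org.apache.hadoop.mapred API; the property
key "mapred.output.dir" is where the output path was stored in the 0.17/0.18
era, so treat it as an assumption for your version, and the class name is a
placeholder):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class PathAwareMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {
  private String outputDir;

  public void configure(JobConf job) {
    // the framework passes each task the job configuration,
    // which carries the output directory set on the JobConf
    outputDir = job.get("mapred.output.dir");
  }

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    output.collect(new Text(outputDir), value); // illustration only
  }
}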