Re: Map works well, but Reduce failed
I did the following steps:

1. stop-all.sh
2. Delete the tmp folder
3. Format the namenode
4. start-all.sh

The problem is gone, though I am not sure what the root cause was.

Best Regards,

2012/6/16 Abhishek:
> Hi Raj,
>
> I think you should increase the reducer worker threads that fetch the map
> output.
>
> Regards,
> Abhishek

--
Welcome to my ET Blog http://www.jdxyw.com
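For anyone who lands on this thread later, the reset sequence above can be sketched as the commands below. This assumes a Hadoop 1.x-style pseudo-distributed setup, and the tmp path shown is only the assumed default hadoop.tmp.dir; check your own conf/core-site.xml. Be aware that formatting the NameNode destroys all existing HDFS data:

```shell
# Sketch of the reset sequence above (Hadoop 1.x layout assumed).
# WARNING: formatting the NameNode wipes all HDFS metadata and data.
stop-all.sh                       # stop all HDFS and MapReduce daemons
rm -rf /tmp/hadoop-$(whoami)      # assumed default hadoop.tmp.dir; adjust to your config
hadoop namenode -format           # re-initialize the NameNode metadata
start-all.sh                      # bring the daemons back up
```

This is a blunt fix: it recovers a broken pseudo-distributed cluster at the cost of all stored data, so it is only sensible on a test setup like the one in this thread.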
Re: Map works well, but Reduce failed
Hi Raj,

I think you should increase the reducer worker threads that fetch the map
output.

Regards,
Abhishek

Sent from my iPhone

On Jun 15, 2012, at 9:42 PM, Raj Vishwanathan wrote:
> Most probably you have a network problem. Check your hostname and IP
> address mapping.
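In Hadoop 1.x terms, the knobs this suggestion most likely maps to are the reduce-side parallel copier threads and the TaskTracker's HTTP server threads that serve map output. A hedged mapred-site.xml sketch (the doubled values are illustrative, not recommendations; defaults are 5 and 40 respectively):

```xml
<!-- mapred-site.xml fragment: a sketch to tune, not a drop-in fix -->
<property>
  <name>mapred.reduce.parallel.copies</name>
  <value>10</value> <!-- reduce-side threads copying map output (default 5) -->
</property>
<property>
  <name>tasktracker.http.threads</name>
  <value>80</value> <!-- TaskTracker HTTP threads serving map output (default 40) -->
</property>
```

Note that more fetch threads only help if the shuffle is genuinely saturated; they will not fix the 403 errors in the log below, which point at a connectivity or permissions problem rather than thread starvation.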
Re: Map works well, but Reduce failed
Most probably you have a network problem. Check your hostname and IP address
mapping.

> From: Yongwei Xing
> To: common-user@hadoop.apache.org
> Sent: Thursday, June 14, 2012 10:15 AM
> Subject: Map works well, but Reduce failed
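The hostname/IP check suggested above can be done with a few commands on each node. The hostname "hadoop-node1" and the address 192.168.1.106 below are examples, not values from this cluster:

```shell
# Check how this node's hostname resolves; reducers fetch map output from the
# address the TaskTracker advertises, so a bad mapping causes fetch failures.
hostname                      # the name this node reports, e.g. hadoop-node1
cat /etc/hosts                # each node should map its real LAN IP
                              # (e.g. "192.168.1.106 hadoop-node1") to its
                              # hostname; a 127.0.1.1 entry for the hostname
                              # makes other nodes connect to an unreachable address
getent hosts "$(hostname)" || echo "warning: hostname does not resolve"
```

If the hostname resolves to a loopback address, other nodes' reducers cannot reach this TaskTracker's map output, which matches the "Too many fetch-failures" symptom in the original report.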
Map works well, but Reduce failed
Hi all,

I ran a simple sort program, but I hit the error below:

12/06/15 01:13:17 WARN mapred.JobClient: Error reading task output
Server returned HTTP response code: 403 for URL:
http://192.168.1.106:50060/tasklog?plaintext=true&attemptid=attempt_201206150102_0002_m_01_1&filter=stdout
12/06/15 01:13:18 WARN mapred.JobClient: Error reading task output
Server returned HTTP response code: 403 for URL:
http://192.168.1.106:50060/tasklog?plaintext=true&attemptid=attempt_201206150102_0002_m_01_1&filter=stderr
12/06/15 01:13:20 INFO mapred.JobClient:  map 50% reduce 0%
12/06/15 01:13:23 INFO mapred.JobClient:  map 100% reduce 0%
12/06/15 01:14:19 INFO mapred.JobClient: Task Id : attempt_201206150102_0002_m_00_2, Status : FAILED
Too many fetch-failures
12/06/15 01:14:20 WARN mapred.JobClient: Error reading task output
Server returned HTTP response code: 403 for URL:
http://192.168.1.106:50060/tasklog?plaintext=true&attemptid=attempt_201206150102_0002_m_00_2&filter=stdout

Does anyone know the reason and how to resolve it?

Best Regards,

--
Welcome to my ET Blog http://www.jdxyw.com