hadoop file system error

2008-06-18 Thread
Dears, I use hadoop-0.16.4 to do some work and found an error whose cause I can't figure out. The scenario is like this: in the reduce step, instead of using OutputCollector to write results, I use FSDataOutputStream to write results to files on HDFS (because I want to split the results by some rules).
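A minimal sketch of the pattern described, using the old (pre-0.20) `org.apache.hadoop.mapred` API that hadoop-0.16.4 shipped with. The output path, key/value types, and class names are hypothetical, and real code would likely keep one stream per "split rule" rather than a single stream. Note the explicit `close()`, which matters for the follow-up reply in this thread:

```java
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

// Sketch: write reduce output to HDFS directly instead of via OutputCollector.
public class SplitWriterReducer extends MapReduceBase
        implements Reducer<Text, IntWritable, Text, IntWritable> {

    private FSDataOutputStream out;  // one stream here; splitting rules would need several

    @Override
    public void configure(JobConf job) {
        try {
            FileSystem fs = FileSystem.get(job);
            // Hypothetical output path; a real job would derive it from its split rule.
            out = fs.create(new Path("/user/example/split-output/part-custom"));
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public void reduce(Text key, Iterator<IntWritable> values,
                       OutputCollector<Text, IntWritable> collector,
                       Reporter reporter) throws IOException {
        int sum = 0;
        while (values.hasNext()) {
            sum += values.next().get();
        }
        // Bypass the collector and write to our own HDFS file.
        out.writeBytes(key.toString() + "\t" + sum + "\n");
    }

    @Override
    public void close() throws IOException {
        // Important: an FSDataOutputStream that is never closed can leave an
        // empty file on HDFS, which is the symptom raised in the reply.
        if (out != null) {
            out.close();
        }
    }
}
```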

Re: hadoop file system error

2008-06-18 Thread
I'm sure I closed all the files in the reduce step. Are there any other reasons that could cause this problem? 2008/6/18 Konstantin Shvachko <[EMAIL PROTECTED]>: > Did you close those files? > If not they may be empty. > > > > 晋光峰 wrote: > >> Dears, >> >> I use hadoop-0.16.4 to do some work and found an error which I c

How to control the map and reduce step sequentially

2008-07-28 Thread
Dear All, When using Hadoop, I noticed that the reduce step starts while the mappers are still running. According to my project requirements, the reduce step should not start until all the mappers finish their execution. Does anybody know how to use the Hadoop API to achieve this? W

Re: How to control the map and reduce step sequentially

2008-07-28 Thread
I got it. Thanks! 2008/7/28 Shengkai Zhu <[EMAIL PROTECTED]> > The real reduce logic is actually started when all map tasks are finished. > > Is it still unexpected? > > > On 7/28/08, 晋光峰 <[EMAIL PROTECTED]> wrote: > > > > Dear All, > > > > W
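As the reply notes, reducers launched early are only shuffling and sorting map output; the user's `reduce()` calls only begin once all maps have completed. For anyone who still wants to delay reducer launch itself, later 0.x releases expose a scheduling knob; this name and placement should be verified against your Hadoop version, since 0.18-era releases may not support it:

```xml
<!-- hadoop-site.xml: fraction of maps that must complete before reducers launch -->
<property>
  <name>mapred.reduce.slowstart.completed.maps</name>
  <value>1.00</value>
</property>
```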

How to run hadoop without DNS server?

2008-08-06 Thread
Dear all, While configuring and using the Hadoop framework, it seems that a DNS server must be used for hostname resolution (even if I configure IP addresses rather than hostnames in the conf/slaves and conf/masters files). Because we don't have a local DNS server on our local Ethernet, I have to a
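A common workaround, assuming a small static cluster, is to skip DNS entirely and give every node an identical /etc/hosts file, then refer to those hostnames in the masters and slaves files. The addresses and hostnames below are illustrative:

```
# /etc/hosts on every node (example addresses and names)
192.168.0.10   hadoop-master
192.168.0.11   hadoop-slave1
192.168.0.12   hadoop-slave2
```

Each node must also resolve its own hostname to its LAN address (not 127.0.0.1), since the daemons advertise the address their hostname resolves to.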

Some error happened while writing large size data in reduce step

2008-08-12 Thread
Dear all, I use Hadoop to do map-reduce work in our project. When the program's data size is very large (we need to write about 100 GB of data in the reduce step), some reducers (but not all of them) failed during the reduce step, which caused part of the data to fail to be written to HDFS. When I check th

Too many open files IOException while running hadoop mapper phase

2008-10-21 Thread
Dear all, I use Hadoop 0.18.0 to execute a job which outputs a huge number of key-value pairs in the mapper phase. While running the job in the mapper phase, the Hadoop framework throws an exception like the one below: java.io.FileNotFoundException: /home/guangfeng/bin/hadoop-0.18.0/tmp/hadoop-guangfeng/mapred/local/task

Re: Too many open files IOException while running hadoop mapper phase

2008-10-21 Thread
Can you give me a detailed explanation of how to deal with this issue? I haven't found related archives of this list. Regards Guangfeng On Tue, Oct 21, 2008 at 6:19 PM, Karl Anderson <[EMAIL PROTECTED]> wrote: > > On 20-Oct-08, at 11:59 PM, 晋光峰 wrote: > > Dear
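The "Too many open files" symptom is usually the per-user file-descriptor limit rather than a Hadoop bug, so the common fix discussed on this list is raising that limit for the user running the daemons. The value below is illustrative, and "guangfeng" is taken from the paths in the original post:

```
# Check the current limit for the user running Hadoop
ulimit -n

# Raise it persistently in /etc/security/limits.conf (example value)
guangfeng  soft  nofile  16384
guangfeng  hard  nofile  16384
```

A job that writes huge numbers of spill files can also hit this during the sort/merge phase, so lowering the number of simultaneously open spill files (or raising the limit cluster-wide) are both worth trying.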

How to integrate hadoop framework with web application

2008-11-23 Thread
Dear all, Does anyone know how to integrate Hadoop with web applications? I want to start a Hadoop job from a Java Servlet (in the web server's servlet container), then get the result and send it back to the browser. Is this possible? How do I connect the web server with the Hadoop framework? Please giv
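One way to sketch this, assuming the 0.18-era `mapred` API: server-side code builds a `JobConf` pointing at the cluster and submits it with `JobClient`. The host names, ports, and paths below are hypothetical, and the mapper/reducer classes are omitted:

```java
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

// Sketch: submit a Hadoop job from server-side code (e.g. a servlet's doGet).
public class JobLauncher {

    public static void launch() throws IOException {
        JobConf conf = new JobConf(JobLauncher.class);
        conf.setJobName("web-triggered-job");

        // Point the client at the cluster; addresses are hypothetical.
        conf.set("fs.default.name", "hdfs://hadoop-master:9000");
        conf.set("mapred.job.tracker", "hadoop-master:9001");

        // Hypothetical input/output paths; mapper/reducer classes omitted here.
        FileInputFormat.setInputPaths(conf, new Path("/user/web/input"));
        FileOutputFormat.setOutputPath(conf, new Path("/user/web/output"));

        // runJob blocks until the job completes. A responsive servlet would
        // submit from a background thread (or use JobClient.submitJob and poll),
        // then read /user/web/output from HDFS and stream it into the response.
        JobClient.runJob(conf);
    }
}
```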

Re: How to integrate hadoop framework with web application

2008-11-24 Thread
> > Alexander > > 2008/11/24 柳松 <[EMAIL PROTECTED]> > > > Dear 晋光峰: > > Glad to see another Chinese name here. It sounds possible, but could > you > > give us a little more detail? > > Best Regards. > > > > > > > > On 2008