Re: hadoop input buffer size

2011-10-05 Thread Yang Xiaoliang
Hi, Hadoop neither reads one line at a time, nor fetches dfs.block.size worth of lines into a buffer. Actually, for TextInputFormat, it reads io.file.buffer.size bytes of text into a buffer each time; this can be seen in the Hadoop source file LineReader.java 2011/10/5 Mark question > Hello, >
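A minimal sketch (not the exact Hadoop source) of the point above: the read buffer used for splitting text into lines is sized by io.file.buffer.size, not by dfs.block.size or by a line count. The class and constructor shown exist in Hadoop 0.20-era releases; the default of 4096 is what the framework falls back to when the property is unset.

    import java.io.InputStream;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.util.LineReader;

    public class LineReaderBufferExample {
        public static void readLines(InputStream in, Configuration conf) throws Exception {
            // LineReader's internal buffer is sized from io.file.buffer.size (default 4096).
            int bufferSize = conf.getInt("io.file.buffer.size", 4096);
            LineReader reader = new LineReader(in, bufferSize);
            Text line = new Text();
            while (reader.readLine(line) > 0) {
                // each call returns one line, refilling the byte buffer as needed
            }
            reader.close();
        }
    }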

Re: Run hadoop Map/Reduce app from another machine

2011-10-05 Thread Yang Xiaoliang
Install Hadoop on your local machine, copy the configuration files from the remote Hadoop cluster to your local machine (including the hosts file), then you can just submit a *.jar locally as before. 2011/10/5 oleksiy > > Hello, > > I'm trying to find a way how to run hadoop MapReduce ap
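A hedged sketch of what "submit locally as before" amounts to once the copied configuration points at the remote cluster. The hostnames (namenode-host, jobtracker-host) are placeholders; in practice these values come from the copied core-site.xml and mapred-site.xml rather than being set in code.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class RemoteSubmitExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Normally picked up from the copied cluster config files;
            // shown inline only to make the idea explicit (placeholder hosts).
            conf.set("fs.default.name", "hdfs://namenode-host:9000");
            conf.set("mapred.job.tracker", "jobtracker-host:9001");

            Job job = new Job(conf, "remote-submit-example");
            job.setJarByClass(RemoteSubmitExample.class);
            // ... set mapper, reducer, key/value classes as usual ...
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }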

TestDFSIO error: libhdfs.so.1 does not exist

2011-07-28 Thread Yang Xiaoliang
Hi all, I am benchmarking a Hadoop cluster with the hadoop-*-test.jar TestDFSIO, but the following error is returned: File /usr/hadoop-0.20.2/libhdfs/libhdfs.so.1 does not exist. How can I solve this problem? Thanks!

Re: Current available Memory

2011-02-24 Thread Yang Xiaoliang
Thanks a lot! Yang Xiaoliang 2011/2/25 maha > Hi Yang, > > The problem could be solved using the following link: > http://www.roseindia.net/java/java-get-example/get-memory-usage.shtml > You need to use other memory managers like the Garbage collector and its > finalize
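A minimal sketch of the approach behind the link above: estimating currently available JVM memory through Runtime. As noted in the thread, the numbers are approximate and shift as the garbage collector runs; the "available" figure here is an assumption about how to combine the three values, not a Hadoop API.

    public class AvailableMemoryExample {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long max = rt.maxMemory();       // upper bound the JVM may ever use
            long total = rt.totalMemory();   // memory currently reserved by the JVM
            long free = rt.freeMemory();     // unused memory within 'total'
            long available = max - (total - free); // rough estimate of what is still usable
            System.out.printf("max=%dMB total=%dMB free=%dMB available=%dMB%n",
                    max >> 20, total >> 20, free >> 20, available >> 20);
        }
    }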

Re: Current available Memory

2011-02-23 Thread Yang Xiaoliang
I had also encountered the same problem a few days ago. Does anyone have another method? 2011/2/24 maha > Based on the Java function documentation, it gives approximately the > available memory, so I need to tweak it with other functions. > So it's a Java issue not Hadoop. > > Thanks anyways, > Maha