Re: decommissioning node woes

2011-03-19 Thread Ted Dunning
Unfortunately this doesn't help much, because it is hard to get the ports to balance the load.

On Fri, Mar 18, 2011 at 8:30 PM, Michael Segel michael_se...@hotmail.com wrote: With a 1 GbE port, you could go 100 Mb/s for the bandwidth limit. If you bond your ports, you could go higher.
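(For context, the limit Michael mentions is presumably the HDFS balancer/replication throttle, set in hdfs-site.xml via dfs.balance.bandwidthPerSec, which takes bytes per second. A hedged sketch for roughly 100 Mb/s:)

  <property>
    <name>dfs.balance.bandwidthPerSec</name>
    <!-- ~100 Mb/s, expressed as bytes per second -->
    <value>12500000</value>
  </property>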

File formats in Hadoop

2011-03-19 Thread Weishung Chung
I am browsing through the hadoop.io package and was wondering: what other file formats are available in Hadoop besides SequenceFile and TFile? Is all data written through Hadoop, including data from HBase, saved in the above formats? It seems like SequenceFile stores records in key-value pair format. Thanks.
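(For illustration, a minimal sketch of SequenceFile's key/value API in the old org.apache.hadoop.io style; the path and record values below are invented:)

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.SequenceFile;
  import org.apache.hadoop.io.Text;

  public class SeqFileDemo {
    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      FileSystem fs = FileSystem.get(conf);
      Path path = new Path("/tmp/demo.seq"); // illustrative path

      // A SequenceFile stores records as typed key/value pairs.
      SequenceFile.Writer writer =
          SequenceFile.createWriter(fs, conf, path, Text.class, IntWritable.class);
      try {
        writer.append(new Text("hello"), new IntWritable(1));
        writer.append(new Text("world"), new IntWritable(2));
      } finally {
        writer.close();
      }

      // Read the pairs back in insertion order.
      SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
      try {
        Text key = new Text();
        IntWritable value = new IntWritable();
        while (reader.next(key, value)) {
          System.out.println(key + " -> " + value);
        }
      } finally {
        reader.close();
      }
    }
  }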

Re: File formats in Hadoop

2011-03-19 Thread Harsh J
Hello,

On Sat, Mar 19, 2011 at 9:31 PM, Weishung Chung weish...@gmail.com wrote: I am browsing through the hadoop.io package and was wondering what other file formats are available in hadoop other than SequenceFile and TFile?

Additionally, on Hadoop, there are MapFiles/SetFiles (derivatives of SequenceFile).
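(To illustrate the MapFile variant: it is essentially a sorted SequenceFile plus an index that allows random lookup by key. A hedged sketch; the directory name and records are invented:)

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.io.MapFile;
  import org.apache.hadoop.io.Text;

  public class MapFileDemo {
    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      FileSystem fs = FileSystem.get(conf);
      String dir = "/tmp/demo.map"; // a MapFile is a directory (data + index files)

      // Keys must be appended in sorted order; MapFile builds an index over them.
      MapFile.Writer writer = new MapFile.Writer(conf, fs, dir, Text.class, Text.class);
      try {
        writer.append(new Text("apple"), new Text("1"));
        writer.append(new Text("banana"), new Text("2"));
      } finally {
        writer.close();
      }

      // Random access by key via the index.
      MapFile.Reader reader = new MapFile.Reader(fs, dir, conf);
      try {
        Text value = new Text();
        if (reader.get(new Text("banana"), value) != null) {
          System.out.println("banana -> " + value);
        }
      } finally {
        reader.close();
      }
    }
  }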

can't find lzo headers when ant compile hadoop package

2011-03-19 Thread Shi Yu
I am trying to install LZO and compile the Hadoop package following the instructions at http://sudhirvn.blogspot.com/2010/07/installing-hadoop-native-libraries.html. I don't have root privileges, thus no sudo, and no RPM installation is possible. So I built and installed LZO from source in my home folder. The
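(One approach that generally works without root, offered as a hedged sketch since the exact ant error isn't shown; $HOME/local is a placeholder prefix: install LZO under your home directory and point gcc at it through its standard search-path environment variables before re-running the build:)

  ./configure --prefix=$HOME/local            # from inside the unpacked LZO source tree
  make && make install
  export C_INCLUDE_PATH=$HOME/local/include   # header search path gcc honors
  export LIBRARY_PATH=$HOME/local/lib         # link-time search path gcc honors
  # then re-run the same ant build command as before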

RE: decommissioning node woes

2011-03-19 Thread Michael Segel
Usually the port bonding is done at a lower level, so that you and your applications see this as a single port. So you don't have to worry about load balancing between the ports. (Or am I missing something?)

thx,
-Mike

From: tdunn...@maprtech.com Date: Sat, 19 Mar 2011 09:00:30 -0700

Re: WritableName can't load class ... for custom WritableClasses

2011-03-19 Thread Simon
It is hard to judge without the code. But my guess is that your TermFreqArrayWritable is not properly compiled or imported into your job control file. HTH.

Simon

On Fri, Mar 18, 2011 at 7:23 PM, maha m...@umail.ucsb.edu wrote: Hi, The following was working fine with Hadoop Writables. Now,
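(For reference, a minimal shape a custom Writable needs in order to be loadable by reflection. This is a hedged sketch: only the class name comes from the thread; the int-array payload below is invented for illustration:)

  import java.io.DataInput;
  import java.io.DataOutput;
  import java.io.IOException;
  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.Writable;

  // Must be public, on the job's classpath (i.e., packaged into the job jar),
  // and must keep a public no-argument constructor for reflection.
  public class TermFreqArrayWritable implements Writable {
    private IntWritable[] freqs = new IntWritable[0];

    public TermFreqArrayWritable() {}  // required by ReflectionUtils

    public void write(DataOutput out) throws IOException {
      out.writeInt(freqs.length);      // length prefix, then each element
      for (IntWritable f : freqs) {
        f.write(out);
      }
    }

    public void readFields(DataInput in) throws IOException {
      int n = in.readInt();
      freqs = new IntWritable[n];
      for (int i = 0; i < n; i++) {
        freqs[i] = new IntWritable();
        freqs[i].readFields(in);
      }
    }
  }

(In the driver, something like conf.setJarByClass(TermFreqArrayWritable.class) on the JobConf helps ensure the class actually ships inside the job jar.)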

Re: running local hadoop job in windows

2011-03-19 Thread Simon
As far as I know, Hadoop can currently only run under *nix-like systems. Correct me if I am wrong. If you want to run it under Windows, you can try Cygwin as the environment.

Thanks,
Simon

On Fri, Mar 18, 2011 at 7:11 PM, Mark Kerzner markkerz...@gmail.com wrote: No, I hoped that it is not

Re: decommissioning node woes

2011-03-19 Thread M. C. Srivas
All trunking/bonding at the switch (e.g., LACP) gives only one NIC's worth of bandwidth point-to-point, even if your boxes all have multiple NICs. It chooses a NIC at connection initiation (via round-robin, load, or whatever), but once the TCP connection is established, there is no load-balancing

Re: running local hadoop job in windows

2011-03-19 Thread Mark Kerzner
Now I AM running under Cygwin, and I get the same error, as you can see from the attached screenshot.

Thank you,
Mark

On Sat, Mar 19, 2011 at 9:16 PM, Simon gsmst...@gmail.com wrote: As far as I know, currently hadoop can only run under *nix-like systems. Correct me if I am wrong. And if you

Re: WritableName can't load class ... for custom WritableClasses

2011-03-19 Thread maha
That's absolutely correct :) Thanks, Simon.

Maha

On Mar 19, 2011, at 7:13 PM, Simon wrote: It is hard to judge without the code. But my guess is that your TermFreqArrayWritable is not properly compiled or imported into your job control file. HTH. Simon On Fri, Mar 18, 2011 at 7:23 PM,