Re: Can I write to a compressed file which is located in HDFS?

2012-02-06 Thread David Sinclair
Hi, you may want to have a look at the Flume project from Cloudera. I use it for writing data into HDFS. https://ccp.cloudera.com/display/SUPPORT/Downloads

dave

2012/2/6 Xiaobin She
> hi Bejoy,
>
> thank you for your reply.
>
> actually I have set up a test cluster which has one namenode/jo…
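For context, the non-Flume route is the plain Hadoop API: open a stream on HDFS and wrap it in a compression codec. A minimal sketch, assuming a Gzip codec and an illustrative output path (note that HDFS files of this era are write-once, so you create a new compressed file rather than updating one in place):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.GzipCodec;
    import org.apache.hadoop.util.ReflectionUtils;

    import java.io.OutputStream;

    public class CompressedHdfsWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // GzipCodec ships with Hadoop; other codecs plug in the same way.
            CompressionCodec codec = ReflectionUtils.newInstance(GzipCodec.class, conf);

            Path out = new Path("/tmp/example" + codec.getDefaultExtension());
            OutputStream stream = codec.createOutputStream(fs.create(out));
            try {
                stream.write("hello, hdfs\n".getBytes("UTF-8"));
            } finally {
                stream.close(); // also flushes the codec's internal buffer
            }
        }
    }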

Re: Error: INFO ipc.Client: Retrying connect to server: /192.168.100.11:8020. Already tried 0 time(s).

2011-04-19 Thread David Sinclair
Can you give a little more detail about your problem and Hadoop setup? Is this a single machine running NN, DN, etc., or is this a cluster?

On Tue, Apr 19, 2011 at 9:42 AM, endho wrote:
> hi,
> I'm having the same problem with the bin/hadoop dfs command. And I haven't
> figured out the proble…
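A retry loop like this usually means nothing is listening at the configured NameNode address. As a quick sanity check, a hypothetical snippet (assuming the 0.20-era configuration key) that prints what the client thinks it should connect to:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class CheckNameNode {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The NameNode address comes from fs.default.name in core-site.xml;
            // endless "Retrying connect" messages usually mean the NN is not
            // running at that host:port, or the client is reading the wrong config.
            System.out.println("fs.default.name = " + conf.get("fs.default.name"));
            FileSystem fs = FileSystem.get(conf); // retries, then fails, if the NN is down
            System.out.println("connected to " + fs.getUri());
        }
    }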

Re: custom writable classes

2011-02-02 Thread David Sinclair
> …methods… if I have a bunch of strings in my writable, then what
> should be the read method implementation?
>
> I really appreciate the help from all you guys.
>
> On Wed, Feb 2, 2011 at 12:52 PM, David Sinclair <dsincl...@chariotsolutions.com> wrote:
> > So…

Re: custom writable classes

2011-02-02 Thread David Sinclair
> … instead of sending a bunch of delimited text, I want to send an actual object to my reducer.
>
> On Wed, Feb 2, 2011 at 12:33 PM, David Sinclair <dsincl...@chariotsolutions.com> wrote:
> > Are you storing your data as text or binary?

Re: custom writable classes

2011-02-02 Thread David Sinclair
Are you storing your data as text or binary? If you are storing as text, your mapper is going to get keys of type LongWritable and values of type Text. Inside your mapper you would parse out the strings and wouldn't be using your custom writable; that is, unless you wanted your mapper/reducer to pr…
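To make the thread's question concrete: a Writable carrying a bunch of strings just needs write and readFields to be exact mirrors of each other. A minimal sketch (the class and field names are hypothetical), using Text's length-prefixed string helpers:

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.Writable;

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    public class PersonWritable implements Writable {
        private String firstName = "";
        private String lastName = "";

        public void write(DataOutput out) throws IOException {
            // Text.writeString length-prefixes each string on the wire.
            Text.writeString(out, firstName);
            Text.writeString(out, lastName);
        }

        public void readFields(DataInput in) throws IOException {
            // Read back in exactly the order written; Hadoop reuses Writable
            // instances, so every field must be overwritten here.
            firstName = Text.readString(in);
            lastName = Text.readString(in);
        }
    }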

Re: Losing Records with Block Compressed Sequence File

2011-01-21 Thread David Sinclair
…d it turned out I was neglecting to close/flush my output.

> On 01/21/2011 01:04 PM, David Sinclair wrote:
>> Hi, I am seeing an odd problem when writing block compressed sequence files.
>> If I write 400,000 records into a sequence file w/o compression, all 40…

Losing Records with Block Compressed Sequence File

2011-01-21 Thread David Sinclair
Hi, I am seeing an odd problem when writing block compressed sequence files. If I write 400,000 records into a sequence file w/o compression, all 400K end up in the file. If I write with block compression, regardless of whether it is bz2 or deflate, I start losing records. Not a ton, but a couple hundred. Here are th…
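The resolution posted in the follow-up above (a writer that was never closed) fits the symptom: with CompressionType.BLOCK, records are buffered in memory and only hit the file when a block fills or the writer closes, so the last partial block vanishes silently. A minimal sketch of the safe pattern, with an illustrative path and record contents:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;

    public class BlockCompressedWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            SequenceFile.Writer writer = SequenceFile.createWriter(
                    fs, conf, new Path("/tmp/records.seq"),
                    IntWritable.class, Text.class,
                    SequenceFile.CompressionType.BLOCK);
            try {
                for (int i = 0; i < 400000; i++) {
                    writer.append(new IntWritable(i), new Text("record-" + i));
                }
            } finally {
                // Without close(), the final partially filled block is never
                // flushed and its records are silently dropped.
                writer.close();
            }
        }
    }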