Re: 1 file per record

2009-03-16 Thread Sean Arietta
> The FileInputFormats (TextInputFormat and SequenceFileInputFormat) basically divide input files by blocks, unless the requested number of mappers is really high. > -- Owen

Re: 1 file per record

2008-10-03 Thread chandravadana
> The FileInputFormats (TextInputFormat and SequenceFileInputFormat) basically divide input files by blocks, unless the requested number of mappers is really high. > -- Owen

Re: 1 file per record

2008-10-02 Thread Owen O'Malley
On Oct 2, 2008, at 1:50 AM, chandravadana wrote: If we don't specify numSplits in getSplits(), then what is the default number of splits taken? The getSplits() is either library or user code, so it depends which class you are using as your InputFormat. The FileInputFormats (TextInputFormat and SequenceFileInputFormat) basically divide input files by blocks, unless the requested number of mappers is really high. -- Owen
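
A minimal sketch of the split-size arithmetic behind Owen's answer, assuming the old org.apache.hadoop.mapred FileInputFormat behaviour; the class, numbers, and config defaults below are illustrative, not copied from the shipped code:

// SplitSizeSketch.java -- approximates how the FileInputFormats pick a split
// size: goalSize comes from the requested number of mappers, but the result
// is capped at the HDFS block size, so files are normally divided by blocks.
public class SplitSizeSketch {
    static long splitSize(long totalSize, int numSplits, long blockSize, long minSize) {
        long goalSize = totalSize / Math.max(numSplits, 1);
        return Math.max(minSize, Math.min(goalSize, blockSize));
    }

    public static void main(String[] args) {
        long blockSize = 64L * 1024 * 1024;   // 64 MB HDFS block (illustrative)
        long totalSize = 10 * blockSize;      // e.g. a 640 MB input file
        long minSize   = 1;                   // assumed minimum split size

        // Small or default numSplits: goalSize >= blockSize, so split == block.
        System.out.println(splitSize(totalSize, 2, blockSize, minSize));
        // Only a really high numSplits pushes the split size below a block.
        System.out.println(splitSize(totalSize, 1000, blockSize, minSize));
    }
}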

Re: 1 file per record

2008-10-02 Thread chandravadana
hi all... I have a doubt. If we don't specify numSplits in getSplits(), then what is the default number of splits taken? -- Best Regards, S. Chandravadana

RE: 1 file per record

2008-09-26 Thread Goel, Ankur
... records. I suggest you override an existing RecordReader implementation or create your own to fit your case.
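
A hedged sketch of the kind of RecordReader override Ankur suggests, written against the old org.apache.hadoop.mapred API; the class name WholeFileRecordReader and the Text/BytesWritable key-value choice are assumptions for illustration, not an existing Hadoop class:

import java.io.IOException;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;

// Emits exactly one record per input file: key = file path, value = file bytes.
public class WholeFileRecordReader implements RecordReader<Text, BytesWritable> {
    private final FileSplit split;
    private final JobConf conf;
    private boolean processed = false;

    public WholeFileRecordReader(FileSplit split, JobConf conf) {
        this.split = split;
        this.conf = conf;
    }

    public boolean next(Text key, BytesWritable value) throws IOException {
        if (processed) return false;               // only one record per file
        Path file = split.getPath();
        FileSystem fs = file.getFileSystem(conf);
        byte[] contents = new byte[(int) split.getLength()];
        FSDataInputStream in = fs.open(file);
        try {
            IOUtils.readFully(in, contents, 0, contents.length);
        } finally {
            IOUtils.closeStream(in);
        }
        key.set(file.toString());
        value.set(contents, 0, contents.length);
        processed = true;
        return true;
    }

    public Text createKey() { return new Text(); }
    public BytesWritable createValue() { return new BytesWritable(); }
    public long getPos() { return processed ? split.getLength() : 0; }
    public float getProgress() { return processed ? 1.0f : 0.0f; }
    public void close() throws IOException { }
}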

Re: 1 file per record

2008-09-26 Thread chandravadana

Re: 1 file per record

2008-09-25 Thread chandravadana

Re: 1 file per record

2008-09-24 Thread Enis Soztutar
Nope, not right now. But this has come up before. Perhaps you will contribute one? chandravadana wrote: thanks. is there any built-in record reader which performs this function? Enis Soztutar wrote: Yes, you can use MultiFileInputFormat. You can extend the MultiFileInputFormat to return a RecordReader, which reads a record for each file in the MultiFileSplit.

Re: 1 file per record

2008-09-24 Thread chandravadana

Re: 1 file per record

2008-09-24 Thread Enis Soztutar
Yes, you can use MultiFileInputFormat. You can extend the MultiFileInputFormat to return a RecordReader, which reads a record for each file in the MultiFileSplit. Enis. chandra wrote: hi.. By setting isSplitable false, we can send 1 file with n records to 1 mapper. Is there any way to set 1 complete file per record?
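
A minimal sketch of the extension Enis describes, against the old org.apache.hadoop.mapred API (MultiFileInputFormat and MultiFileSplit); the class names FilePerRecordInputFormat and MultiFileRecordReader are made up for illustration:

import java.io.IOException;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.InputSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MultiFileInputFormat;
import org.apache.hadoop.mapred.MultiFileSplit;
import org.apache.hadoop.mapred.RecordReader;
import org.apache.hadoop.mapred.Reporter;

// One record per file in the MultiFileSplit: key = path, value = file contents.
public class FilePerRecordInputFormat extends MultiFileInputFormat<Text, BytesWritable> {

    public RecordReader<Text, BytesWritable> getRecordReader(
            InputSplit split, JobConf job, Reporter reporter) throws IOException {
        return new MultiFileRecordReader((MultiFileSplit) split, job);
    }

    static class MultiFileRecordReader implements RecordReader<Text, BytesWritable> {
        private final MultiFileSplit split;
        private final JobConf conf;
        private int index = 0;          // which file in the split we are on

        MultiFileRecordReader(MultiFileSplit split, JobConf conf) {
            this.split = split;
            this.conf = conf;
        }

        public boolean next(Text key, BytesWritable value) throws IOException {
            if (index >= split.getNumPaths()) return false;
            Path file = split.getPath(index);
            byte[] contents = new byte[(int) split.getLength(index)];
            FileSystem fs = file.getFileSystem(conf);
            FSDataInputStream in = fs.open(file);
            try {
                IOUtils.readFully(in, contents, 0, contents.length);
            } finally {
                IOUtils.closeStream(in);
            }
            key.set(file.toString());
            value.set(contents, 0, contents.length);
            index++;
            return true;
        }

        public Text createKey() { return new Text(); }
        public BytesWritable createValue() { return new BytesWritable(); }
        public long getPos() { return index; }
        public float getProgress() {
            return split.getNumPaths() == 0 ? 1.0f : (float) index / split.getNumPaths();
        }
        public void close() throws IOException { }
    }
}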

1 file per record

2008-09-24 Thread chandra
hi.. By setting isSplitable to false, we can send 1 file with n records to 1 mapper. Is there any way to set 1 complete file per record? Thanks in advance, Chandravadana S
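
For the single-file-split route the question starts from, a hedged sketch of an InputFormat that turns off splitting and pairs with the WholeFileRecordReader sketched earlier in this listing; again the old org.apache.hadoop.mapred API, and the class name WholeFileInputFormat is illustrative:

import java.io.IOException;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.InputSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;
import org.apache.hadoop.mapred.Reporter;

// Keeps each file in one split and hands the whole file to the mapper as a
// single record via the WholeFileRecordReader sketched above.
public class WholeFileInputFormat extends FileInputFormat<Text, BytesWritable> {

    protected boolean isSplitable(FileSystem fs, Path filename) {
        return false;   // never split a file across mappers
    }

    public RecordReader<Text, BytesWritable> getRecordReader(
            InputSplit split, JobConf job, Reporter reporter) throws IOException {
        return new WholeFileRecordReader((FileSplit) split, job);
    }
}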