Re: Regarding loading a big XML file to HDFS

2011-11-22 Thread Joey Echeverria
If your file is bigger than a block size (typically 64 MB or 128 MB), then it will be split into more than one block. The blocks may or may not be stored on different datanodes. If you're using a default InputFormat, then the input will be split between two tasks. Since you said you need the whole
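The block-and-split behavior Joey describes can be sketched with simple arithmetic. This is an illustrative Python sketch of how HDFS divides a file into fixed-size blocks and how FileInputFormat-style logic derives one split per block by default; it is not Hadoop code, and the sizes are examples only.

```python
def block_offsets(file_size: int, block_size: int):
    """Return (offset, length) pairs, one per HDFS-style block."""
    offsets = []
    pos = 0
    while pos < file_size:
        length = min(block_size, file_size - pos)
        offsets.append((pos, length))
        pos += length
    return offsets

# A 100 MB file with a 64 MB block size occupies two blocks, so a
# default InputFormat would hand the file to two map tasks.
MB = 1024 * 1024
splits = block_offsets(100 * MB, 64 * MB)
print(len(splits))   # 2
print(splits[0])     # (0, 67108864)
print(splits[1])     # (67108864, 37748736)
```

The second split starts mid-file, which is exactly why a record that spans the block boundary (such as one large XML document) needs special handling.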

Re: Regarding loading a big XML file to HDFS

2011-11-22 Thread Mridul Muralidharan
Date: Tue, 22 Nov 2011 03:08:20 + From: mahesw...@huawei.com Subject: RE: Regarding loading a big XML file to HDFS To: common-user@hadoop.apache.org; core-u...@hadoop.apache.org Also, I am wondering how you are writing a MapReduce application here. Map and reduce will work with key-value pairs
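The key-value contract mentioned in the quoted reply can be shown with a minimal in-memory sketch: map emits (key, value) pairs, the framework groups values by key, and reduce folds each group. This is plain Python for illustration, not Hadoop; the word-count logic is a stand-in example.

```python
from collections import defaultdict

def map_fn(record):
    # Emit each word as a key with count 1 (word-count style).
    for word in record.split():
        yield (word, 1)

def reduce_fn(key, values):
    return (key, sum(values))

def run_mapreduce(records):
    groups = defaultdict(list)
    for record in records:                  # map phase
        for key, value in map_fn(record):
            groups[key].append(value)       # shuffle: group by key
    return dict(reduce_fn(k, v) for k, v in sorted(groups.items()))

print(run_mapreduce(["big xml file", "big file"]))
# {'big': 2, 'file': 2, 'xml': 1}
```

The point the reply is making: whatever the input looks like on disk, the mapper must be able to see it as discrete records it can turn into key-value pairs, which is the crux of the whole-XML-file problem.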

RE: Regarding loading a big XML file to HDFS

2011-11-21 Thread Uma Maheswara Rao G
: RE: Regarding loading a big XML file to HDFS __ From: hari708 [hari...@gmail.com] Sent: Tuesday, November 22, 2011 6:50 AM To: core-u...@hadoop.apache.org Subject: Regarding loading a big XML file to HDFS Hi, I have a big file consisting of XML data. The XML

RE: Regarding loading a big XML file to HDFS

2011-11-21 Thread Michael Segel
that make sense? -Mike Date: Tue, 22 Nov 2011 03:08:20 + From: mahesw...@huawei.com Subject: RE: Regarding loading a big XML file to HDFS To: common-user@hadoop.apache.org; core-u...@hadoop.apache.org Also, I am wondering how you are writing a MapReduce application here. Map and reduce

Re: Regarding loading a big XML file to HDFS

2011-11-21 Thread Bejoy Ks
: RE: Regarding loading a big XML file to HDFS To: common-user@hadoop.apache.org; core-u...@hadoop.apache.org Also, I am wondering how you are writing a MapReduce application here. Map and reduce will work with key-value pairs. From: Uma
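The recurring problem in this thread, recovering whole XML records from a large file, is usually solved by treating the text between a start tag and an end tag as one record (the approach taken by Mahout's XmlInputFormat). Below is a plain-Python sketch of that idea, not a Hadoop RecordReader; the tag names are hypothetical.

```python
def xml_records(text: str, start_tag: str, end_tag: str):
    """Yield each substring spanning start_tag..end_tag, inclusive."""
    pos = 0
    while True:
        start = text.find(start_tag, pos)
        if start == -1:
            return                     # no more records
        end = text.find(end_tag, start)
        if end == -1:
            return                     # truncated record: skip it
        end += len(end_tag)
        yield text[start:end]
        pos = end

data = "<doc><item>a</item>junk<item>b</item></doc>"
print(list(xml_records(data, "<item>", "</item>")))
# ['<item>a</item>', '<item>b</item>']
```

In a real Hadoop job the same scan runs inside a RecordReader, which also has to read past its split boundary to finish a record that straddles two blocks.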