I have run into a problem that I am unable to solve. I need to filter out
the _SUCCESS file when using the FileSystem.listStatus method. Can someone
please guide me on how to filter out _SUCCESS files? Thanks
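
One way to do this is to pass a PathFilter to FileSystem.listStatus; both the
interface and that overload exist in hadoop 0.20 and 1.0.3. A minimal sketch
(the class name and the "out" directory are placeholders, not from the
original job):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.PathFilter;

    public class ListWithoutSuccess {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            // List the job output directory ("out" is a placeholder),
            // skipping the _SUCCESS marker file.
            FileStatus[] statuses = fs.listStatus(new Path("out"),
                new PathFilter() {
                    public boolean accept(Path p) {
                        return !p.getName().equals("_SUCCESS");
                    }
                });
            for (FileStatus status : statuses) {
                System.out.println(status.getPath());
            }
        }
    }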

On Tue, May 29, 2012 at 1:42 PM, waqas latif <waqas...@gmail.com> wrote:

> So my question is: do hadoop 0.20 and 1.0.3 differ in their support for
> writing or reading SequenceFiles? The same code works fine with hadoop 0.20,
> but the problem occurs when it is run under hadoop 1.0.3.
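
For reference, both versions expose the same SequenceFile.Reader constructor,
so the read side can be written identically for 0.20 and 1.0.3. A minimal
sketch of such a read loop (the class name, path, and IntWritable key/value
types are illustrative, not taken from the example code):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.SequenceFile;

    public class ReadSeqFile {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            // The constructor reads the file header immediately, so an
            // empty file in the output directory (such as a _SUCCESS
            // marker) fails here with java.io.EOFException.
            SequenceFile.Reader reader = new SequenceFile.Reader(
                fs, new Path("out/part-00000"), conf);
            try {
                IntWritable key = new IntWritable();
                IntWritable value = new IntWritable();
                while (reader.next(key, value)) {
                    System.out.println(key + "\t" + value);
                }
            } finally {
                reader.close();
            }
        }
    }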
>
>
> On Sun, May 27, 2012 at 6:15 PM, waqas latif <waqas...@gmail.com> wrote:
>
>> But the thing is, it works with hadoop 0.20, even with 100x100 (and even
>> bigger) matrices, but with hadoop 1.0.3 there is a problem even with a
>> 3x3 matrix.
>>
>>
>> On Sun, May 27, 2012 at 12:00 PM, Prashant Kommireddi <
>> prash1...@gmail.com> wrote:
>>
>>> I have seen this issue with large file writes using the SequenceFile
>>> writer. I have not found the same issue when testing with fairly small
>>> files (< 1GB).
>>>
>>> On Fri, May 25, 2012 at 10:33 PM, Kasi Subrahmanyam
>>> <kasisubbu...@gmail.com>wrote:
>>>
>>> > Hi,
>>> > If you are using a custom Writable object to pass data from the
>>> > mapper to the reducer, make sure that readFields and write handle
>>> > the same number of fields. It is possible that you wrote data to a
>>> > file using a custom Writable but later modified that Writable (for
>>> > example, by adding a new attribute) which the old data doesn't have.
>>> >
>>> > That is one possibility; please check it.
>>> >
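
A minimal sketch of that write/readFields symmetry (the MatrixEntry class and
its field names are made up for illustration, not from Mr. Norstadt's code):

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    public class MatrixEntry implements Writable {
        private int row;
        private int col;
        private double value;

        // write() and readFields() must serialize exactly the same
        // fields in exactly the same order; adding a field to one but
        // not regenerating old data is a classic cause of EOFException.
        public void write(DataOutput out) throws IOException {
            out.writeInt(row);
            out.writeInt(col);
            out.writeDouble(value);
        }

        public void readFields(DataInput in) throws IOException {
            row = in.readInt();
            col = in.readInt();
            value = in.readDouble();
        }
    }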
>>> > On Friday, May 25, 2012, waqas latif wrote:
>>> >
>>> > > Hi Experts,
>>> > >
>>> > > I am fairly new to hadoop MapReduce, and I was trying to run the
>>> > > matrix multiplication example presented by Mr. Norstadt at the
>>> > > following link: http://www.norstad.org/matrix-multiply/index.html.
>>> > > I can run it successfully with hadoop 0.20.2, but when I run it
>>> > > with hadoop 1.0.3 I get the following error. Is it a problem with
>>> > > my hadoop configuration, or is it a compatibility problem in the
>>> > > code, which the author wrote for hadoop 0.20? Please also guide me
>>> > > on how I can fix this error in either case. Here is the error I am
>>> > > getting.
>>> > >
>>> > > Exception in thread "main" java.io.EOFException
>>> > >        at java.io.DataInputStream.readFully(DataInputStream.java:180)
>>> > >        at java.io.DataInputStream.readFully(DataInputStream.java:152)
>>> > >        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)
>>> > >        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1486)
>>> > >        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1475)
>>> > >        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1470)
>>> > >        at TestMatrixMultiply.fillMatrix(TestMatrixMultiply.java:60)
>>> > >        at TestMatrixMultiply.readMatrix(TestMatrixMultiply.java:87)
>>> > >        at TestMatrixMultiply.checkAnswer(TestMatrixMultiply.java:112)
>>> > >        at TestMatrixMultiply.runOneTest(TestMatrixMultiply.java:150)
>>> > >        at TestMatrixMultiply.testRandom(TestMatrixMultiply.java:278)
>>> > >        at TestMatrixMultiply.main(TestMatrixMultiply.java:308)
>>> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> > >        at java.lang.reflect.Method.invoke(Method.java:597)
>>> > >        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>> > >
>>> > > Thanks in advance
>>> > >
>>> > > Regards,
>>> > > waqas
>>> > >
>>> >
>>>
>>
>>
>
