Glad to hear that :)

Warm Regards,
Tariq
cloudfront.blogspot.com


On Fri, Jul 19, 2013 at 1:10 PM, Anit Alexander <anitama...@gmail.com> wrote:

> Hello Tariq,
> I solved the problem. There must have been some problem in the custom
> input format I created, so I took a sample custom input format that was
> working in the CDH4 environment and applied the changes as per my
> requirements. It is working now, but I haven't tested that code in an
> Apache Hadoop environment yet :)
>
> Regards,
> Anit
>
>
> On Thu, Jul 18, 2013 at 1:22 AM, Mohammad Tariq <donta...@gmail.com> wrote:
>
>> Hello Anit,
>>
>> Could you show me the exact error log?
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>>
>> On Tue, Jul 16, 2013 at 8:45 AM, Anit Alexander <anitama...@gmail.com> wrote:
>>
>>> Yes, I did recompile, but I seem to face the same problem. I am running
>>> the MapReduce job with a custom input format. I am not sure if there is
>>> some change in the API needed to get the splits correct.
>>>
>>> Regards
>>>
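Since the error involves reading past a 64 MB block and a suspected change in how splits are computed: in the new `org.apache.hadoop.mapreduce` API, `FileInputFormat.computeSplitSize(blockSize, minSize, maxSize)` derives each split as `max(minSize, min(maxSize, blockSize))`. A minimal, self-contained sketch of that arithmetic (the class name `SplitSizeSketch` is hypothetical, not from the thread):

```java
// Sketch: replicates the split-size arithmetic used by FileInputFormat
// in Hadoop's new (org.apache.hadoop.mapreduce) API.
public class SplitSizeSketch {

    // splitSize = max(minSize, min(maxSize, blockSize))
    static long computeSplitSize(long blockSize, long minSize, long maxSize) {
        return Math.max(minSize, Math.min(maxSize, blockSize));
    }

    public static void main(String[] args) {
        long blockSize = 64L * 1024 * 1024; // 64 MB, as in the thread

        // With default min/max settings, one split covers exactly one block.
        System.out.println(computeSplitSize(blockSize, 1L, Long.MAX_VALUE));

        // A smaller maxSize caps the split below the block size.
        System.out.println(computeSplitSize(blockSize, 1L, 32L * 1024 * 1024));
    }
}
```

If a custom input format overrides `getSplits()` with assumptions from the old `org.apache.hadoop.mapred` API, split boundaries can fall in the wrong place on CDH4, which would match the "cannot read the next block" symptom.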
>>>
>>> On Tue, Jul 16, 2013 at 6:24 AM, 闫昆 <yankunhad...@gmail.com> wrote:
>>>
>>>> I think you should recompile the program and then run it again.
>>>>
>>>>
>>>> 2013/7/13 Anit Alexander <anitama...@gmail.com>
>>>>
>>>>> Hello,
>>>>>
>>>>> I am encountering a problem in a CDH4 environment.
>>>>> I can successfully run the MapReduce job in the Hadoop cluster, but
>>>>> when I migrated the same MapReduce job to my CDH4 environment it raises
>>>>> an error stating that it cannot read the next block (each block is 64
>>>>> MB). Why is that so?
>>>>>
>>>>> Hadoop environment: Hadoop 1.0.3
>>>>> Java version 1.6
>>>>>
>>>>> CDH4 environment: CDH4.2.0
>>>>> Java version 1.6
>>>>>
>>>>> Regards,
>>>>> Anit Alexander
>>>>>
>>>>
>>>>
>>>
>>
>