Mike,

Thanks for your response. I was able to fix it. The problem turned out to be the
same as I guessed in my last email, i.e. a directory access issue was causing
the failures.
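
For anyone hitting the same error, a quick way to confirm a directory access problem is to check whether the task-log directory is actually writable by the user running Hadoop. This is only a sketch; the path below is the one from this thread, so adjust it to your own install:

```shell
# check_writable: print whether the current user can write to a directory.
# Hadoop task attempts fail in a similar way when they cannot create their
# log files under logs/userlogs/.
check_writable() {
  if [ -w "$1" ]; then
    echo "writable"
  else
    echo "not writable -- try: chmod -R u+rwx $1"
  fi
}

# path taken from the error in this thread
check_writable /home/ahmad/hadoop-dev/logs/userlogs
```

If it reports the directory is not writable, fixing the permissions (or ownership) for the user that runs the Hadoop daemons should clear the error.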

Cheers,
Ahmad

On Thu, Nov 12, 2009 at 1:03 PM, Ahmad Ali Iqbal
<[email protected]> wrote:

> Thank you mike,
>
> In fact, running any of the sample code gives the same error. Yes, I already
> realized that the output folder MUST be deleted before running the job, and I
> do that on every run. As for the tmp directory, I issue the following command
> and get the output below:
>
> ah...@aai:/usr/local/share/hadoop-0.20.1$ *bin/hadoop dfs -ls*
> Found 1 items
> drwxr-xr-x   - ahmad supergroup          0 2009-11-12 12:50
> /user/ahmad/input
> ah...@aai:/usr/local/share/hadoop-0.20.1$
>
> Also, I can see a /tmp directory with subdirectories in it. Please let me
> know if I am not checking it correctly. To my understanding, the job tries to
> write some log files and is unable to create them at
> /home/ahmad/hadoop-dev/logs/
> userlogs/attempt_200911111450_0001_m_000004_0/
>
> Is there any parameter I can set, or a directory permission issue I should check?
>
> Thanks,
>
> --
> Ahmad
>
> On Thu, Nov 12, 2009 at 12:40 PM, Mike Kendall <[email protected]> wrote:
>
>> My first guess is that your tmp directory isn't set up correctly.  Also, I
>> don't know about WordCountv2, but the original wordcount needed an output
>> directory passed along with the input directory (the output directory must
>> not exist when you start the job).
>>
>> -mike
>>
>>
