Re: Bad connection to FS. command aborted.

2008-12-04 Thread elangovan anbalahan
Please tell me why I am getting this error; it is becoming hard for me to find a solution.

put: java.io.IOException: failed to create file
/user/nutch/urls/urls/.urllist.txt.crc on client 127.0.0.1 because
target-length is 0, below MIN_REPLICATION (1)

I am getting this when I do:
bin/hadoop dfs -put urls urls


bash-3.2$ bin/start-all.sh
starting namenode, logging to
/nutch/search/logs/hadoop-nutch-namenode-elan.out
localhost: starting datanode, logging to
/nutch/search/logs/hadoop-nutch-datanode-elan.out
cat: /nutch/search/bin/../conf/masters: No such file or directory
starting jobtracker, logging to
/nutch/search/logs/hadoop-nutch-jobtracker-elan.out
localhost: starting tasktracker, logging to
/nutch/search/logs/hadoop-nutch-tasktracker-elan.out
bash-3.2$ mkdir urls
bash-3.2$ vi urls/urllist.txt
bash-3.2$ bin/hadoop dfs -put urls urls
put: java.io.IOException: failed to create file
/user/nutch/urls/.urllist.txt.crc on client 127.0.0.1 because target-length
is 0, below MIN_REPLICATION (1)
bash-3.2$ bin/hadoop dfs -put urls urls
put: java.io.IOException: failed to create file
/user/nutch/urls/urls/.urllist.txt.crc on client 127.0.0.1 because
target-length is 0, below MIN_REPLICATION (1)
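
Given the log above, a cleanup-and-retry sequence might look like this (a sketch for a 0.12-era single-node setup, not a confirmed fix; it assumes the master is localhost and that the datanode is otherwise healthy):

```shell
# The failed first -put left a half-created 'urls' entry in DFS, so the
# retry nested the copy (/user/nutch/urls/urls). Remove it before retrying.
bin/hadoop dfs -rmr urls

# start-all.sh complained that conf/masters is missing; recreate it
# (assumption: a pseudo-distributed setup with the master on localhost).
echo localhost > conf/masters

# The MIN_REPLICATION (1) failure usually means no datanode accepted the
# block. Check the datanode log under logs/ before retrying the upload.
bin/hadoop dfs -put urls urls
```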



On Thu, Dec 4, 2008 at 2:10 PM, elangovan anbalahan
<[EMAIL PROTECTED]> wrote:

> Hadoop 0.12.2
>
>
>
> On Thu, Dec 4, 2008 at 1:54 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:
>
>> hadoop version ?
>> command : bin/hadoop version
>>
>> -Sagar
>>
>>
>>
>> elangovan anbalahan wrote:
>>
>>> I tried that, but nothing happened:
>>>
>>> bash-3.2$ bin/hadoop dfs -put urll urll
>>> put: java.io.IOException: failed to create file
>>> /user/nutch/urll/.urls.crc
>>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>>> (1)
>>> bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
>>> bash-3.2$ bin/hadoop dfs -ls urls
>>> Found 0 items
>>> bash-3.2$ bin/hadoop dfs -ls urll
>>> Found 0 items
>>> bash-3.2$ bin/hadoop dfs -ls
>>> Found 2 items
>>> /user/nutch/$
>>> /user/nutch/urll
>>>
>>>
>>> how do I get rid of the following error:
>>> put: java.io.IOException: failed to create file
>>> /user/nutch/urll/.urls.crc
>>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>>> (1)
>>>
>>> On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
>>> <[EMAIL PROTECTED]> wrote:
>>>
>>>> you didn't say what the error was?
>>>>
>>>> but you can try this; it should do the same thing:
>>>>
>>>> bin/hadoop dfs -cat urls/part-0* > urls
>>>>
>>>> elangovan anbalahan wrote:
>>>>
>>>>> I'm getting this error message when I am doing:
>>>>>
>>>>> bash-3.2$ bin/hadoop dfs -put urls urls
>>>>>
>>>>> Please let me know the resolution; I have a project submission in a few
>>>>> hours.
>>
>>
>


Re: Bad connection to FS. command aborted.

2008-12-04 Thread elangovan anbalahan
Hadoop 0.12.2


On Thu, Dec 4, 2008 at 1:54 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:

> hadoop version ?
> command : bin/hadoop version
>
> -Sagar
>
>
>
> elangovan anbalahan wrote:
>
>> I tried that, but nothing happened:
>>
>> bash-3.2$ bin/hadoop dfs -put urll urll
>> put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>> (1)
>> bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
>> bash-3.2$ bin/hadoop dfs -ls urls
>> Found 0 items
>> bash-3.2$ bin/hadoop dfs -ls urll
>> Found 0 items
>> bash-3.2$ bin/hadoop dfs -ls
>> Found 2 items
>> /user/nutch/$
>> /user/nutch/urll
>>
>>
>> how do I get rid of the following error:
>> put: java.io.IOException: failed to create file
>> /user/nutch/urll/.urls.crc
>> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION
>> (1)
>>
>> On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
>> <[EMAIL PROTECTED]> wrote:
>>
>>
>>
>>> you didn't say what the error was?
>>>
>>> but you can try this; it should do the same thing:
>>>
>>> bin/hadoop dfs -cat urls/part-0* > urls
>>>
>>>
>>> elangovan anbalahan wrote:
>>>
>>>
>>>> I'm getting this error message when I am doing:
>>>>
>>>> bash-3.2$ bin/hadoop dfs -put urls urls
>>>>
>>>> Please let me know the resolution; I have a project submission in a few
>>>> hours.
>>>
>>>
>>
>>
>>
>
>


Re: Bad connection to FS. command aborted.

2008-12-04 Thread Sagar Naik

hadoop version ?
command : bin/hadoop version

-Sagar


elangovan anbalahan wrote:

> I tried that, but nothing happened:
>
> bash-3.2$ bin/hadoop dfs -put urll urll
> put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)
> bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
> bash-3.2$ bin/hadoop dfs -ls urls
> Found 0 items
> bash-3.2$ bin/hadoop dfs -ls urll
> Found 0 items
> bash-3.2$ bin/hadoop dfs -ls
> Found 2 items
> /user/nutch/$
> /user/nutch/urll
>
> how do I get rid of the following error:
> put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
> on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)
>
> On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
> <[EMAIL PROTECTED]> wrote:
>
>> you didn't say what the error was?
>>
>> but you can try this; it should do the same thing:
>>
>> bin/hadoop dfs -cat urls/part-0* > urls
>>
>> elangovan anbalahan wrote:
>>
>>> I'm getting this error message when I am doing:
>>>
>>> bash-3.2$ bin/hadoop dfs -put urls urls
>>>
>>> Please let me know the resolution; I have a project submission in a few
>>> hours.



Re: Bad connection to FS. command aborted.

2008-12-04 Thread elangovan anbalahan
The Namenode is running, but I did not understand: what should I check in the classpath?
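
For the classpath question: the DFS client reads its configuration, including the Namenode address, from `hadoop-site.xml` in the conf directory on its classpath. A minimal sketch for a single machine follows; `localhost:9000` is an assumption and must match the address the Namenode was actually started on:

```shell
mkdir -p conf  # already present in a normal Hadoop checkout

# Point the DFS client at the Namenode; in the 0.12 era the value is
# a plain host:port pair.
cat > conf/hadoop-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
</configuration>
EOF
```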

On Thu, Dec 4, 2008 at 1:34 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:

> Check your conf in the classpath.
> Check if the Namenode is running.
> You are not able to connect to the intended Namenode.
>
> -Sagar
>
> elangovan anbalahan wrote:
>
>> I'm getting this error message when I am doing:
>>
>> bash-3.2$ bin/hadoop dfs -put urls urls
>>
>> Please let me know the resolution; I have a project submission in a few
>> hours.
>>
>>
>>
>
>


Re: Bad connection to FS. command aborted.

2008-12-04 Thread elangovan anbalahan
I tried that, but nothing happened:

bash-3.2$ bin/hadoop dfs -put urll urll
put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)
bash-3.2$ bin/hadoop dfs -cat urls/part-0* > urls
bash-3.2$ bin/hadoop dfs -ls urls
Found 0 items
bash-3.2$ bin/hadoop dfs -ls urll
Found 0 items
bash-3.2$ bin/hadoop dfs -ls
Found 2 items
/user/nutch/$
/user/nutch/urll


how do I get rid of the following error:
put: java.io.IOException: failed to create file /user/nutch/urll/.urls.crc
on client 192.168.1.6 because target-length is 0, below MIN_REPLICATION (1)

On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
<[EMAIL PROTECTED]> wrote:

>
> you didn't say what the error was?
>
> but you can try this; it should do the same thing:
>
> bin/hadoop dfs -cat urls/part-0* > urls
>
>
> elangovan anbalahan wrote:
>
>> I'm getting this error message when I am doing:
>>
>> bash-3.2$ bin/hadoop dfs -put urls urls
>>
>> Please let me know the resolution; I have a project submission in a few
>> hours.
>>
>>
>>
>
>


Re: Bad connection to FS. command aborted.

2008-12-04 Thread Sagar Naik

Check your conf in the classpath.
Check if the Namenode is running.
You are not able to connect to the intended Namenode.

-Sagar
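
Both checks can be done from the shell (a sketch; `jps` ships with the JDK, and `dfsadmin -report` queries the Namenode and requires a running cluster):

```shell
# Both daemons must appear here; a missing NameNode explains "Bad
# connection to FS", and a missing DataNode explains the MIN_REPLICATION
# failure on -put.
jps

# Ask the Namenode for cluster status, including the live datanode count.
bin/hadoop dfsadmin -report
```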
elangovan anbalahan wrote:

> I'm getting this error message when I am doing:
>
> bash-3.2$ bin/hadoop dfs -put urls urls
>
> Please let me know the resolution; I have a project submission in a few hours.




Re: Bad connection to FS. command aborted.

2008-12-04 Thread Elia Mazzawi


you didn't say what the error was?

but you can try this; it should do the same thing:

bin/hadoop dfs -cat urls/part-0* > urls
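
Note that `-cat` streams the DFS part files to stdout, so the redirect writes the merged result to a *local* file, not back into DFS. A sketch of the full round trip (the `urls.merged` name is illustrative, not from the thread):

```shell
# Merge the DFS part files into one local file, then upload that single
# file back into DFS under a new name.
bin/hadoop dfs -cat urls/part-0* > urls.merged
bin/hadoop dfs -put urls.merged urls.merged
```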

elangovan anbalahan wrote:

> I'm getting this error message when I am doing:
>
> bash-3.2$ bin/hadoop dfs -put urls urls
>
> Please let me know the resolution; I have a project submission in a few hours.