I am not able to perform distributed search with two machines having indexed
data.
I have crawled data on two machines, one on the server itself and the other
on another Linux laptop.
These are the changes I have made:
1) /etc/hosts
# /etc/hosts (for master AND slave)
192.168.1.106   master
192
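For reference, each machine needs one line per node in `/etc/hosts`, with whitespace between the IP and the hostname; the run-together `192.168.1.106master` above would not resolve. A minimal sketch of what a two-node file might look like (the slave IP `192.168.1.107` and the hostname `slave` are illustrative assumptions; the original message truncates after "192"):

```shell
# Sketch of a two-node /etc/hosts, identical on master AND slave.
# 192.168.1.107 / "slave" are made-up placeholders -- substitute the
# laptop's real IP and hostname. Written to /tmp here for illustration.
cat > /tmp/hosts.example <<'EOF'
127.0.0.1       localhost
192.168.1.106   master
192.168.1.107   slave
EOF
# Note the whitespace between IP and hostname; "192.168.1.106master"
# (no space) is silently ignored by the resolver.
cat /tmp/hosts.example
```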
Hadoop 0.12.2
On Thu, Dec 4, 2008 at 1:54 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:
> hadoop version ?
> command : bin/hadoop version
>
> -Sagar
>
>
>
> elangovan anbalahan wrote:
>
>> I tried that but nothing happened
>>
>> bash-3.2$ bin/ha
The namenode is running,
but I did not understand: what should I check in the classpath?
On Thu, Dec 4, 2008 at 1:34 PM, Sagar Naik <[EMAIL PROTECTED]> wrote:
> Check your conf in the classpath.
> Check if the Namenode is running.
> You are not able to connect to the intended Namenode.
>
>
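To act on that checklist, the usual thing to verify is that `conf/hadoop-site.xml` is on the classpath of every node and that `fs.default.name` points all of them at the same Namenode. A sketch, following the Hadoop 0.12-era config layout (the `master:9000` / `master:9001` host:port values are illustrative assumptions):

```xml
<!-- conf/hadoop-site.xml (sketch; host/port values are assumptions) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- every node, master and slave, must point at the same Namenode -->
    <value>master:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>master:9001</value>
  </property>
</configuration>
```

If a node has a stale or default copy of this file on its classpath, `bin/hadoop dfs` commands will try to talk to the wrong (or no) Namenode, which matches the "not able to connect to the intended Namenode" symptom above.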
use target-length is 0, below MIN_REPLICATION (1)
On Thu, Dec 4, 2008 at 1:29 PM, Elia Mazzawi
<[EMAIL PROTECTED]> wrote:
>
> you didn't say what the error was?
>
> but you can try this; it should do the same thing
>
> bin/hadoop dfs -cat urls/part-0* > urls
>
I'm getting this error message when I am doing
*bash-3.2$ bin/hadoop dfs -put urls urls*
Please let me know the resolution; I have a project submission in a few hours.
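A "below MIN_REPLICATION (1)" failure on a `-put` generally means the Namenode could not place even a single replica of the block, which most often means no Datanodes are live and registered. A few checks worth running on the master (a sketch assuming the 0.12-era `bin/` and `logs/` layout; if `dfsadmin` is not present in this version, `bin/hadoop dfs -report` is the older spelling):

```shell
# Should list at least one live Datanode; zero live nodes explains the error.
bin/hadoop dfsadmin -report

# Should show NameNode and DataNode JVM processes on the expected machines.
jps

# (Re)start the HDFS daemons if none are running.
bin/start-dfs.sh

# Datanode logs often show why a node failed to register with the master
# (bad hostname resolution, wrong fs.default.name, port blocked, etc.).
tail logs/hadoop-*-datanode-*.log
```

Given the malformed `/etc/hosts` entry earlier in the thread, a Datanode on the laptop failing to resolve `master` would be a plausible cause of zero live Datanodes.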