I have also noticed another issue when starting the Hadoop cluster with
the start-all.sh command.

The namenode and datanode daemons are starting, but sometimes one of the
datanodes drops the connection and shows the message "connection closed
by (192.168.2.x - datanode)". Every time I restart the Hadoop cluster,
the datanode that drops keeps changing.

For example, the 1st time I start the Hadoop cluster, 192.168.2.1
reports "connection closed"; the 2nd time, 192.168.2.2 reports
"connection closed", and at that point 192.168.2.1 starts successfully
without any errors.

I haven't been able to figure out the issue exactly. Is the issue
related to the network or to the Hadoop configuration?
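
Since start-all.sh launches the remote daemons over SSH, a "connection
closed by ..." message at startup often points at SSH or host-level
trouble rather than HDFS itself. Below is a minimal diagnostic sketch;
the IP address, log path, and dfsadmin variant are assumptions based on
a default installation, so adjust them for your cluster:

    # 1. Confirm passwordless SSH to the datanode that failed last
    #    (start-all.sh needs this to reach every slave node):
    ssh 192.168.2.1 hostname

    # 2. Ask the namenode which datanodes actually registered
    #    (on older 1.x releases the command is "hadoop dfsadmin -report"):
    hdfs dfsadmin -report

    # 3. Tail that datanode's own log for the real error
    #    (assumes the default log directory under $HADOOP_HOME/logs):
    ssh 192.168.2.1 'tail -n 50 $HADOOP_HOME/logs/hadoop-*-datanode-*.log'

If the SSH check passes on every node, the datanode log usually names
the actual cause of the disconnect.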



On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) <huat.l...@ontario.ca>
wrote:

>  hadoop fs -put <source> <destination> copies from the local file system to HDFS
>
>
>
> *From:* sandeep vura [mailto:sandeepv...@gmail.com]
> *Sent:* April 8, 2015 2:24 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: Unable to load file from local to HDFS cluster
>
>
>
> Sorry Liaw, I tried the same command but it didn't resolve the issue.
>
>
>
> Regards,
>
> Sandeep.V
>
>
>
> On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) <huat.l...@ontario.ca>
> wrote:
>
> Should be hadoop dfs -put
>
>
>
> *From:* sandeep vura [mailto:sandeepv...@gmail.com]
> *Sent:* April 8, 2015 1:53 PM
> *To:* user@hadoop.apache.org
> *Subject:* Unable to load file from local to HDFS cluster
>
>
>
> Hi,
>
>
>
> When loading a file from the local filesystem to the HDFS cluster using the command below:
>
>
>
> hadoop fs -put sales.txt /sales_dept
>
>
>
> I am getting the following exception. Please let me know how to resolve
> this issue ASAP. Attached are the logs displayed on the namenode.
>
>
>
> Regards,
>
> Sandeep.v
>
>
>
