Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
Hi, when loading a file from local to the HDFS cluster using the command hadoop fs -put sales.txt /sales_dept, I am getting the following exception. Please let me know how to resolve this issue ASAP. Attached are the logs being displayed on the namenode. Regards, Sandeep.v hadoop26@opt
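For reference, a minimal sketch of the copy-and-verify sequence (paths taken from the message above; assumes the target directory /sales_dept may not exist yet):

    # create the target directory if it does not already exist
    hadoop fs -mkdir -p /sales_dept
    # copy the local file into HDFS
    hadoop fs -put sales.txt /sales_dept/
    # verify the file landed in HDFS
    hadoop fs -ls /sales_dept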

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
Sorry Liaw, I tried the same command but it didn't resolve the issue. Regards, Sandeep.V On Wed, Apr 8, 2015 at 11:37 PM, Liaw, Huat (MTO) wrote: > Should be hadoop dfs -put > > > > *From:* sandeep vura [mailto:sandeepv...@gmail.com] > *Sent:* April 8, 2015 1:53 PM > *To:* user@ha
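For context, "hadoop fs", "hadoop dfs", and "hdfs dfs" all accept -put; "hadoop dfs" is deprecated in favour of "hdfs dfs", so against HDFS either of the following should behave the same (a sketch only, not a fix for the exception itself):

    # generic FsShell, works with whatever filesystem is configured
    hadoop fs -put sales.txt /sales_dept/
    # HDFS-specific shell, replaces the deprecated "hadoop dfs"
    hdfs dfs -put sales.txt /sales_dept/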

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
e out the issue exactly. Is the issue related to the network or to the Hadoop configuration? On Wed, Apr 8, 2015 at 11:54 PM, Liaw, Huat (MTO) wrote: > hadoop fs -put Copy from remote location to HDFS > > > > *From:* sandeep vura [mailto:sandeepv...@gmail.com] > *Sent:* April 8, 20

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
We have been using this setup for a very long time. We were able to run all the jobs successfully, but something suddenly went wrong with the namenode. On Thu, Apr 9, 2015 at 12:06 AM, sandeep vura wrote: > I have also noticed another issue when starting hadoop cluster > start-all.sh command > >

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
: > You can not start 192.168.2.84:50010…. closed by ((192.168.2.x > -datanode)) > > > > *From:* sandeep vura [mailto:sandeepv...@gmail.com] > *Sent:* April 8, 2015 2:39 PM > > *To:* user@hadoop.apache.org > *Subject:* Re: Unable to load file from local to HDFS cluster &

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
Our issue has been resolved. Root cause: a network-related issue. Thanks to each and every one of you who spent some time and replied to my questions. Regards, Sandeep.v On Thu, Apr 9, 2015 at 10:45 AM, sandeep vura wrote: > Can anyone give a solution for my issue? > > On Thu, Apr 9, 2015 at

Re: Unable to load file from local to HDFS cluster

2015-04-09 Thread sandeep vura
30 PM, 杨浩 wrote: > Root cause: Network related issue? > Can you tell us in more detail? Thank you > > 2015-04-09 13:51 GMT+08:00 sandeep vura : > >> Our issue has been resolved. >> >> Root cause: Network related issue. >> >> Thanks for each and e

Re: Unable to load file from local to HDFS cluster

2015-04-08 Thread sandeep vura
Can anyone give a solution for my issue? On Thu, Apr 9, 2015 at 12:48 AM, sandeep vura wrote: > Exactly, but every time it picks one randomly. Our datanodes are > 192.168.2.81, 192.168.2.82, 192.168.2.83, 192.168.2.84, 192.168.2.85 > > Namenode: 192.168.2.80 > > If I restart the cl

Re: Unable to load file from local to HDFS cluster

2015-04-13 Thread sandeep vura
It was not a conflict; our network team had changed settings in the core switch of the VLAN. On Sun, Apr 12, 2015 at 8:26 AM, 杨浩 wrote: > Oh, I see. Had you configured a conflicting port before? > > 2015-04-09 18:36 GMT+08:00 sandeep vura : > >> Hi Yanghaogn, >> >> Sure

Re: Question on configuring Hadoop 2.6.0 with a different filesystem

2015-04-16 Thread sandeep vura
Hi Silvan, please put the configuration below in core-site.xml and start the cluster: fs.default.name = quobyte://prod.corp.quobyte.com:7861/users/kaisers/hadoop-test/ and fs.quobyte.impl = com.quobyte.hadoop.QuobyteFileSystem. Regards, Sandeep.v On Thu, Apr 16, 2015 at 4:23 PM, Silv
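For readability, the same properties laid out as they would appear in core-site.xml (names and values copied from the message above; verify the quobyte:// URI against your own Quobyte deployment):

    <configuration>
      <!-- default filesystem URI pointing at the Quobyte registry -->
      <property>
        <name>fs.default.name</name>
        <value>quobyte://prod.corp.quobyte.com:7861/users/kaisers/hadoop-test/</value>
      </property>
      <!-- implementation class for the quobyte:// scheme -->
      <property>
        <name>fs.quobyte.impl</name>
        <value>com.quobyte.hadoop.QuobyteFileSystem</value>
      </property>
    </configuration>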

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
Hi Anand, comment out the IP address 127.0.1.1 in /etc/hosts and add the following entry: 127.0.0.1 localhost. Restart your Hadoop cluster after making the changes in /etc/hosts. Regards, Sandeep.v On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali wrote: > Dear All: > > Has anyone encoun
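A minimal sketch of what /etc/hosts would look like after that change (the hostname "ubuntu" is only an assumed example; the 127.0.1.1 line is commented out rather than deleted):

    127.0.0.1   localhost
    # 127.0.1.1   ubuntu   <- commented out; this entry commonly causes Connection Refused in pseudo-distributed mode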

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
The hosts file will be available in the /etc directory; please check once. On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali wrote: > I don't seem to have etc/host > > > Sent from my iPhone > > On 22-Apr-2015, at 2:30 pm, sandeep vura wrote: > > Hi Anand, > > comment

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
c/hadoop$ > > Thanks. > > Regards, > > > > Anand Murali > 11/7, 'Anand Vihar', Kandasamy St, Mylapore > Chennai - 600 004, India > Ph: (044)- 28474593/ 43526162 (voicemail) > > > > On Wednesday, April 22, 2015 2:41 PM, Anand Murali < > anand_

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
dasamy St, Mylapore >> Chennai - 600 004, India >> Ph: (044)- 28474593/ 43526162 (voicemail) >> >> >> >> On Wednesday, April 22, 2015 4:43 PM, sandeep vura < >> sandeepv...@gmail.com> wrote: >> >> >> Hi Anand, >> >> You should s

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
wrote: > Sudo what my friend. There are so many options to sudo > > Sent from my iPhone > > On 23-Apr-2015, at 8:20 am, sandeep vura wrote: > > Anand, > > Try sudo, it will work > > On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus > wrote: > >> Can yo

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-23 Thread sandeep vura
Ph: (044)- 28474593/ 43526162 (voicemail) > > > > On Thursday, April 23, 2015 11:22 AM, Anand Murali < > anand_vi...@yahoo.com> wrote: > > > Many thanks my friend. Shall try it right away. > > Anand Murali > 11/7, 'Anand Vihar', Kandasamy St, M

Re: Apache Hadoop tests fail with UnknownHostException

2015-06-05 Thread sandeep vura
You can try the following steps. Step 1: go to /etc/hosts. Step 2: edit the "hosts" file with the entry IP 127.0.0.1 [space/tab] localhost [space/tab] HostName (e.g. static.98.35.ebonenet.com). Step 3: save the file and try again. On Sat, Jun 6, 2015 at 2:56 AM, rongzheng yan wrote: > Hell
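A sketch of the resulting /etc/hosts line, using the example hostname from the message (substitute your machine's actual hostname):

    # maps both localhost and the machine's hostname to the loopback address
    127.0.0.1    localhost    static.98.35.ebonenet.com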

Re: how to quickly fs -cp dir with thousand files?

2016-01-10 Thread sandeep vura
Hi Chris, instead of copying the files, use the mv command: hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2. Sandeep.v On Sat, Jan 9, 2016 at 9:55 AM, Chris Nauroth wrote: > DistCp is capable of running large copies like this in distributed > fashion, implemented as a MapReduce job. > >
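Note that fs -mv is a metadata-only rename on the NameNode, so it is fast but does not leave the original behind; when an actual copy of a directory with thousands of files is needed, Chris's DistCp suggestion runs the copy as a MapReduce job. A sketch of both (the namenode:8020 URIs and paths are illustrative):

    # rename within HDFS: no block data moves between datanodes
    hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
    # distributed copy of a large directory, executed as a MapReduce job
    hadoop distcp hdfs://namenode:8020/user/hadoop/dir1 hdfs://namenode:8020/user/hadoop/dir2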

Re: CapacityScheduler vs. FairScheduler

2016-06-19 Thread sandeep vura
Hi, I have the same doubt as well! Please clarify. Regards, sandeep.v On Fri, Jun 10, 2016 at 6:08 AM, Alvin Chyan wrote: > I have the same question. > > Thanks! > > > *Alvin Chyan* Lead Software Engineer, Data > 901 Marshall St, Suite 200, Redwood City, CA 94063 > > > turn.com

Subscribe

2016-07-17 Thread sandeep vura
Hi Team, please add my email ID to the subscriber list. Regards, Sandeep.v

Standby Namenode getting RPC latency alerts

2016-07-17 Thread sandeep vura
Hi Team, we are getting RPC latency alerts from the standby namenode. What does this mean? Where should I check the logs for the root cause? I have already checked the standby namenode logs but didn't find any specific error. Regards, Sandeep.v

Re: telegraf plugin

2016-08-12 Thread sandeep vura
Hi Mohan, did you check this link? https://github.com/influxdata/telegraf Hope this helps you. Regards, Sandeep.v On Fri, Aug 12, 2016 at 12:41 AM, rammohan ganapavarapu < rammohanga...@gmail.com> wrote: > Hi, > > Any one have telegraf (influxdb) plugin for hadoop components to collect > m

Pause duration alert on Name Node

2016-09-13 Thread sandeep vura
Hi Hadoop experts, we are getting continuous pause-duration alerts from the standby NN. The current configured Java heap size of the NN is 7.8 GB. When we checked the NN web UI, heap memory used was 7.5 GB of the 7.8 GB heap. Please suggest how to fix this. Thanks, sandeep
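In case it helps narrow things down, the NameNode heap is typically set through HADOOP_NAMENODE_OPTS in hadoop-env.sh; a hedged sketch of raising it above the current 7.8 GB (the 12g figure is only an illustration, size it to your file and block counts):

    # hadoop-env.sh -- illustrative value only; the right size depends on the namespace size
    export HADOOP_NAMENODE_OPTS="-Xmx12g -Xms12g ${HADOOP_NAMENODE_OPTS}"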

How to clear block count alert on hdfs

2016-09-16 Thread sandeep vura
Hi Hadoop experts, we are getting block count alerts on the datanodes. Please find the DFS admin report below:
Configured Capacity: 58418139463680 (53.13 TB)
Present Capacity: 55931103011017 (50.87 TB)
DFS Remaining: 55237802565632 (50.24 TB)
DFS Used: 693300445385 (645.69 GB)
DFS Used%: 1.24%
Under r
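For reference, a sketch of the standard commands for gathering the numbers behind a block count alert (the report above looks like dfsadmin output; fsck adds per-namespace file and block totals):

    # per-datanode block counts and capacity
    hdfs dfsadmin -report
    # total files, directories and blocks in the namespace; a high block count with low DFS Used% usually points at many small files
    hdfs fsck /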