Re: Unable to Find S3N Filesystem Hadoop 2.6

2015-04-22 Thread Billy Watson
Chris and Sato, Thanks a bunch! I've been so swamped by these and other issues we've been having in scrambling to upgrade our cluster that I forgot to file a bug. I certainly complained aloud that the docs were insufficient, but I didn't do anything to help the community so thanks a bunch for

Re: Unable to Find S3N Filesystem Hadoop 2.6

2015-04-22 Thread Chris Nauroth
Hello Billy, I think your experience indicates that our documentation is insufficient for discussing how to configure and use the alternative file systems. I filed issue HADOOP-11863 to track a documentation enhancement. https://issues.apache.org/jira/browse/HADOOP-11863 Please feel free to

RE: rolling upgrade(2.4.1 to 2.6.0) problem

2015-04-22 Thread Han-Cheol Cho
Hi, The first warning shows an out-of-memory error in the JVM. Did you give enough max heap memory to the DataNode daemons? DN daemons, by default, use a max heap size of 1 GB, so if your DN requires more than that, it will be in trouble. You can check the memory consumption of your DN daemons (e.g., top
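
A minimal sketch of the check and fix described above; the 4g value and the `jps`/`top` invocation are illustrative assumptions, not from the original message:

```shell
# Find the DataNode's PID and check its memory usage (RES column in top).
DN_PID=$(jps | awk '/DataNode/ {print $1}')
top -b -n 1 -p "$DN_PID" | tail -n 2

# To raise the DataNode max heap above the 1 GB default, add a line like
# this to $HADOOP_CONF_DIR/hadoop-env.sh (4g is only an example value),
# then restart the DataNode:
export HADOOP_DATANODE_OPTS="-Xmx4g $HADOOP_DATANODE_OPTS"
```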

Downgrade without downtime is not possible?

2015-04-22 Thread 조주일
http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-hdfs/HdfsRollingUpgrade.html Rolling upgrade: 2.4.1 to 2.6.0. Downgrade without downtime: 2.6.0 to 2.4.1. Is downgrade without downtime not possible? org.apache.hadoop.hdfs.server.datanode.DataNode: Reported NameNode version

Re: Unable to Find S3N Filesystem Hadoop 2.6

2015-04-22 Thread Chris Nauroth
I agree with Sato's statement that the service loader mechanism should be able to find the S3N file system classes via the service loader metadata embedded in hadoop-aws.jar. I expect setting fs.s3n.impl wouldn't be required. Billy, if you find otherwise in your testing, please let us know.

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Sudo what, my friend? There are so many options to sudo Sent from my iPhone On 23-Apr-2015, at 8:20 am, sandeep vura sandeepv...@gmail.com wrote: Anand, Try sudo, it will work On Wed, Apr 22, 2015 at 5:58 PM, Shahab Yunus shahab.yu...@gmail.com wrote: Can you try sudo?

Sqoop2 error when I run the jobs through hue.

2015-04-22 Thread Kumar Jayapal
Hi, I am getting this error when I run the job in Sqoop2 from Hue. I see lots of people talking about this error but no proper resolution. Was anyone able to resolve this issue? Any help is appreciated. 2015-04-22 21:36:07,281 ERROR

Re: Is there any way to limit the concurrent running mappers per job?

2015-04-22 Thread Harsh J
This has been introduced as a 2.7.0 feature, see MAPREDUCE-5583. On Tue, Apr 21, 2015 at 4:32 AM, Zhe Li allenlee...@gmail.com wrote: Hi, after upgraded to Hadoop 2 (yarn), I found that 'mapred.jobtracker.taskScheduler.maxRunningTasksPerJob' no longer worked, right? One workaround is to use
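
As a hedged sketch of how the 2.7.0 feature might be used: the property name is taken from MAPREDUCE-5583, and the jar, class, and paths below are placeholders:

```shell
# Cap a single job at 10 concurrently running map tasks (Hadoop 2.7.0+,
# per MAPREDUCE-5583); wordcount.jar, WordCount, and the paths are
# placeholders for your own job.
hadoop jar wordcount.jar WordCount \
  -D mapreduce.job.running.map.limit=10 \
  /input /output
```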

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
Run this command in the terminal from the root directory: $ sudo nano /etc/hosts (it will prompt for the root password). Then comment out the 127.0.1.1 line in the hosts file (#127.0.1.1) and add this line: 127.0.0.1 localhost. Save the hosts file and exit. On Thu, Apr 23, 2015 at 8:39 AM, Anand Murali
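
The steps above amount to the following (the hostname shown is a placeholder):

```shell
# Open the hosts file as root; you will be prompted for your password.
sudo nano /etc/hosts

# Inside the editor, make the loopback entries read:
#   127.0.0.1   localhost
#   # 127.0.1.1  my-hostname     <- commented out
# Save, exit, then restart HDFS so the change takes effect:
stop-dfs.sh && start-dfs.sh
```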

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Many thanks my friend. Shall try it right away. Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)- 28474593/ 43526162 (voicemail) On Thursday, April 23, 2015 10:51 AM, sandeep vura sandeepv...@gmail.com wrote: run this command in the

Re: Unable to Find S3N Filesystem Hadoop 2.6

2015-04-22 Thread Takenori Sato
Hi Billy, Chris, Let me share a couple of my findings. I believe this was introduced by HADOOP-10893, which first appeared in 2.6.0 (HDP 2.2). 1. fs.s3n.impl We added a property to the core-site.xml file: You don't need to explicitly set this. It was never necessary in previous versions.
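
In Hadoop 2.6.0 the S3N classes live in hadoop-aws.jar, which is not on the default classpath, so a fix consistent with this thread can be sketched as follows; the jar path and bucket name are illustrative:

```shell
# Put hadoop-aws.jar, which carries the service-loader metadata for S3N,
# on the Hadoop classpath; the exact path varies by distribution.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/tools/lib/hadoop-aws-2.6.0.jar"

# Then S3N URIs should resolve without setting fs.s3n.impl explicitly:
hadoop fs -ls s3n://example-bucket/
```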

Problem with Hadoop 2.6 native build Windows

2015-04-22 Thread yves callaert
Hi All, I am currently trying to build Hadoop 2.6 for Windows from the source code, but I encountered a problem in the libwinutils.c file. The problem is with the following line of code: const WCHAR* wsceConfigRelativePath = WIDEN_STRING(STRINGIFY(WSCE_CONFIG_DIR)) L\\

Re: Is there any way to limit the concurrent running mappers per job?

2015-04-22 Thread Zhe Li
Thanks Naga for your reply. Does the community have a plan to support a per-job limit in the future? Thanks. On Tue, Apr 21, 2015 at 3:49 PM, Naganarasimha G R (Naga) garlanaganarasi...@huawei.com wrote: Hi Sanjeev, YARN already supports mapping the deprecated configuration name to the new

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
Hi Anand, You should search the /etc directory under root, not the Hadoop directory. On Wed, Apr 22, 2015 at 2:57 PM, Anand Murali anand_vi...@yahoo.com wrote: Dear All: I don't see an etc/host. Find below. anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0 anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al

RE: Is there any way to limit the concurrent running mappers per job?

2015-04-22 Thread Naganarasimha G R (Naga)
Hi Zhe, AFAIK there is no explicit support for MR clients limiting the number of containers/tasks for a given job at any given point of time. In fact, as explained earlier, an admin can control this through queue capacity, max capacity, and user-specific capacity configurations. Is there
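
The queue-level knobs the reply refers to live in capacity-scheduler.xml; a sketch with illustrative values for the default queue:

```shell
# capacity-scheduler.xml properties (values are examples only):
#   yarn.scheduler.capacity.root.default.capacity           40
#   yarn.scheduler.capacity.root.default.maximum-capacity   60
#   yarn.scheduler.capacity.root.default.user-limit-factor  1
# After editing, refresh the queues without restarting the ResourceManager:
yarn rmadmin -refreshQueues
```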

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
I don't seem to have etc/host Sent from my iPhone On 22-Apr-2015, at 2:30 pm, sandeep vura sandeepv...@gmail.com wrote: Hi Anand, comment the ip address - 127.0.1.1 in /etc/hosts add the following ip address - 127.0.0.1 localhost in /etc/hosts. Restart your hadoop cluster after

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
The hosts file will be available in the /etc directory; please check once. On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali anand_vi...@yahoo.com wrote: I don't seem to have etc/host Sent from my iPhone On 22-Apr-2015, at 2:30 pm, sandeep vura sandeepv...@gmail.com wrote: Hi Anand, comment the ip

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Dear All: I don't see an etc/host. Find below. anand_vihar@Latitude-E5540:~$ cd hadoop-2.6.0 anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ ls -al total 76 drwxr-xr-x 12 anand_vihar anand_vihar  4096 Apr 21 13:23 . drwxrwxr-x 26 anand_vihar anand_vihar  4096 Apr 22 14:05 .. drwxr-xr-x  2 anand_vihar

Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Dear All: Has anyone encountered this error, and if so, how have you fixed it, other than re-installing Hadoop or re-running start-dfs.sh when you have already started it after boot? Find below: anand_vihar@Latitude-E5540:~$ ssh localhost Welcome to Ubuntu 14.10 (GNU/Linux 3.16.0-34-generic x86_64)

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread sandeep vura
Hi Anand, Comment out the IP address 127.0.1.1 in /etc/hosts and add the following line: 127.0.0.1 localhost. Restart your hadoop cluster after making the changes in /etc/hosts. Regards, Sandeep.v On Wed, Apr 22, 2015 at 2:16 PM, Anand Murali anand_vi...@yahoo.com wrote: Dear All:

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Ok thanks will do Sent from my iPhone On 22-Apr-2015, at 2:39 pm, sandeep vura sandeepv...@gmail.com wrote: hosts file will be available in /etc directory please check once. On Wed, Apr 22, 2015 at 2:36 PM, Anand Murali anand_vi...@yahoo.com wrote: I don't seem to have etc/host Sent

RE: rolling upgrade(2.4.1 to 2.6.0) problem

2015-04-22 Thread 조주일
I allocated 5 GB. I don't think OOM is the essential cause. -----Original Message----- From: Han-Cheol Cho <hancheol@nhn-playart.com> To: <user@hadoop.apache.org> Cc: Sent: 2015-04-22 (Wed) 15:32:35 Subject: RE: rolling upgrade(2.4.1 to 2.6.0) problem Hi, The first warning

RE: Problem with Hadoop 2.6 native build Windows

2015-04-22 Thread Kiran Kumar.M.R
Hi Yves, For 64-bit compilation, it should work out of the box. What command are you using to build? The build command below works for me: $mvn install -Pnative-win -DskipTests Also ensure you have CMake installed. Regards, Kiran

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Anand Murali
Dear Sandeep: Many thanks. I did find hosts, but I do not have write privileges, even though I am administrator. This is strange. Can you please advise? Thanks. Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)- 28474593/ 43526162 (voicemail) On

Re: Connection Refused error on Hadoop-2.6.0 on Ubuntu 14.10 desktop running Pseudo Mode

2015-04-22 Thread Shahab Yunus
Can you try sudo? https://www.linux.com/learn/tutorials/306766:linux-101-introduction-to-sudo Regards, Shahab On Wed, Apr 22, 2015 at 8:26 AM, Anand Murali anand_vi...@yahoo.com wrote: Dear Sandeep: many thanks. I did find hosts, but I do not have write priveleges, eventhough I am