Though I agree with others that it would probably be easier to get Hadoop
up and running on a Unix-based system, I couldn't help noticing that this path:

 \tmp \hadoop-upendyal\mapred\staging\upendyal-1075683580\.staging

seems to have a space in its first component, i.e. '\tmp ' rather than '\tmp'. Is
that a copy-paste artifact, or is the space really there? I'm not sure whether it
could cause the specific error you're seeing, but it's worth removing the
space if it does exist. I'm also assuming that you've set up Cygwin etc., if you
still want to try this out on Windows.
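
If the space turns out to be real, one way to rule it out is to point the staging
area somewhere explicit: in Hadoop 1.x the staging path is derived from
hadoop.tmp.dir, so overriding that in conf/core-site.xml changes where the \tmp
component comes from. The value below is only a hypothetical example path, not
something from your setup:

```xml
<!-- conf/core-site.xml: explicit tmp dir (example value only) -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/cygdrive/c/hadoop-tmp</value>
  </property>
</configuration>
```

Worth knowing, too, that this exact "Failed to set permissions of path ... to
0700" failure is widely reported for Hadoop 1.0.x on Windows even with clean
paths, since FileUtil.checkReturnValue (FileUtil.java:689 in your trace) treats
the false return of Java's file-permission calls as fatal.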

Thanks
hemanth

On Wed, Sep 5, 2012 at 12:12 AM, Marcos Ortiz <mlor...@uci.cu> wrote:

>
> On 09/04/2012 02:35 PM, Udayini Pendyala wrote:
>
>   Hi Bejoy,
>
> Thanks for your response. I first started to install on Ubuntu Linux and
> ran into a bunch of problems. So, I wanted to back off a bit and try
> something simple first. Hence, my attempt to install on my Windows 7 Laptop.
>
> Well, if you tell us about the problems you ran into on Ubuntu, we can give
> you a hand.
> Michael Noll has great tutorials for this:
>
> Running Hadoop on Ubuntu Linux (Single node cluster)
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>
> Running Hadoop on Ubuntu Linux (Multi node cluster)
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/
>
>
> I am doing the "standalone" mode - as per the documentation (link in my
> original email), I don't need ssh unless I am doing the distributed mode.
> Is that not correct?
>
> Yes, but I'll give you the same recommendation that Bejoy did: use a
> Unix-based platform for Hadoop; it is better tested there and performs
> better than on Windows.
>
> Best wishes
>
>
> Thanks again for responding
> Udayini
>
>
> --- On *Tue, 9/4/12, Bejoy Ks <bejoy.had...@gmail.com>* wrote:
>
>
> From: Bejoy Ks <bejoy.had...@gmail.com>
> Subject: Re: Exception while running a Hadoop example on a standalone
> install on Windows 7
> To: user@hadoop.apache.org
> Date: Tuesday, September 4, 2012, 11:11 AM
>
> Hi Udayani
>
>  Hadoop is primarily built and tested for Linux and Linux-based OSes. Since
> you are on Windows, you need to install and configure ssh using Cygwin
> before you start the Hadoop daemons.
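
The Cygwin sequence usually described is roughly the following; the service-setup
commands are Cygwin-specific (names may vary by version), and the key-generation
part is shown against a scratch directory so it can be tried safely anywhere:

```shell
# Cygwin-only service setup (run from a Cygwin shell with admin rights):
#   ssh-host-config -y     # configure the sshd service
#   cygrunsrv -S sshd      # start the service
# Passwordless-ssh key setup, demonstrated here against a scratch directory:
tmp=$(mktemp -d)
ssh-keygen -q -t rsa -N "" -f "$tmp/id_rsa"
cat "$tmp/id_rsa.pub" >> "$tmp/authorized_keys"
test -s "$tmp/authorized_keys" && echo "key installed"
```

For a real setup the key pair goes in ~/.ssh, and you would confirm it works
with `ssh localhost`.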
>
>  On Tue, Sep 4, 2012 at 6:16 PM, Udayini Pendyala <
> udayini_pendy...@yahoo.com> wrote:
>
>   Hi,
>
>
>  Following is a description of what I am trying to do and the steps I
> followed.
>
>
>  GOAL:
>
> a). Install Hadoop 1.0.3
>
> b). Hadoop in a standalone (or local) mode
>
> c). OS: Windows 7
>
>
>  STEPS FOLLOWED:
>
> 1. I followed instructions from:
> http://www.oreillynet.com/pub/a/other-programming/excerpts/hadoop-tdg/installing-apache-hadoop.html.
> Listing the steps I did:
>
> a.       I went to: http://hadoop.apache.org/core/releases.html.
>
> b.      I installed hadoop-1.0.3 by downloading “hadoop-1.0.3.tar.gz” and
> unzipping/untarring the file.
>
> c.       I installed JDK 1.6 and set up JAVA_HOME to point to it.
>
> d.      I set up HADOOP_INSTALL to point to my Hadoop install location. I
> updated my PATH variable to have $HADOOP_INSTALL/bin
>
> e.      After the above steps, I ran the command: “hadoop version” and
> got the following information:
>
> $ hadoop version
>
> Hadoop 1.0.3
>
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1335192
>
> Compiled by hortonfo on Tue May 8 20:31:25 UTC 2012
>
> From source with checksum e6b0c1e23dcf76907c5fecb4b832f3be
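
For what it's worth, steps c and d can be sketched as shell lines like these
(the two directories are placeholders, not your actual locations):

```shell
# Hypothetical locations; substitute the real JDK and Hadoop directories.
export JAVA_HOME="$HOME/jdk1.6.0"
export HADOOP_INSTALL="$HOME/hadoop-1.0.3"
export PATH="$PATH:$HADOOP_INSTALL/bin"

# Sanity check that the bin directory made it onto PATH:
case ":$PATH:" in
  *":$HADOOP_INSTALL/bin:"*) echo "PATH updated" ;;
  *)                         echo "PATH missing hadoop bin" ;;
esac
```

Putting the exports in ~/.bashrc (or the Cygwin equivalent) makes them stick
across shells.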
>
>
>
> 2. The standalone install was very easy, as described above. Then I
> tried to run a sample command as given in:
>
> http://hadoop.apache.org/common/docs/r0.17.2/quickstart.html#Local
>
> Specifically, the steps followed were:
>
> a.       cd $HADOOP_INSTALL
>
> b.      mkdir input
>
> c.       cp conf/*.xml input
>
> d.      bin/hadoop jar hadoop-examples-1.0.3.jar grep input output
> 'dfs[a-z.]+'
>
> and got the following error:
>
>
>
> $ bin/hadoop jar hadoop-examples-1.0.3.jar grep input output 'dfs[a-z.]+'
>
> 12/09/03 15:41:57 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 12/09/03 15:41:57 ERROR security.UserGroupInformation:
> PriviledgedActionException as:upendyal cause:java.io.IOException: Failed
> to set permissions of path:
> \tmp\hadoop-upendyal\mapred\staging\upendyal-1075683580\.staging to 0700
>
> java.io.IOException: Failed to set permissions of path:
> \tmp\hadoop-upendyal\mapred\staging\upendyal-1075683580\.staging to 0700
>     at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
>     at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
>     at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>     at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>     at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
>     at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>     at java.lang.reflect.Method.invoke(Unknown Source)
>     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>     at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>     at java.lang.reflect.Method.invoke(Unknown Source)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>
>
>
> 3. I googled the problem and found the following links, but none of
> these suggestions helped. Most people seem to get a resolution when
> they change the version of Hadoop.
>
> a.
> http://mail-archives.apache.org/mod_mbox/hadoop-common-user/201105.mbox/%3cbanlktin-8+z8uybtdmaa4cvxz4jzm14...@mail.gmail.com%3E
>
> b.
> http://comments.gmane.org/gmane.comp.jakarta.lucene.hadoop.user/25837
>
>
>  Is this a problem in the version of Hadoop I selected OR am I doing
> something wrong? I would appreciate any help with this.
>
> Thanks
>
> Udayini
>
>
>
> --
>
> Marcos Luis Ortíz Valmaseda
> *Data Engineer && Sr. System Administrator at UCI*
> about.me/marcosortiz
> My Blog <http://marcosluis2186.posterous.com>
> @marcosluis2186 <http://twitter.com/marcosluis2186>