Note: I have built Hadoop not in my home directory but rather on a
different volume.
--
Best Regards,
Karim Ahmed Awara
On Wed, Dec 18, 2013 at 10:41 AM, Karim Awara karim.aw...@kaust.edu.sa wrote:
Still the same problem. If you notice, the unit test actually created the
directories up to
Have you set umask to 022 ?
See https://issues.apache.org/jira/browse/HDFS-2556
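For anyone hitting this, HDFS-2556 tracks test failures caused by a non-default umask; a minimal sketch (nothing Hadoop-specific) of why the tests care:

```shell
# Directory permissions are derived from the process umask. Under
# umask 022 a new directory comes out as 755, which is what the HDFS
# unit tests expect; under 002 it would be 775 and the permission
# checks fail.
# Note: 'stat -c' is GNU coreutils; on Mac OS X use 'stat -f %Lp'.
umask 022
workdir=$(mktemp -d)
mkdir "$workdir/dfs"
stat -c '%a' "$workdir/dfs"   # prints 755
```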
Cheers
On Tue, Dec 17, 2013 at 3:12 PM, Karim Awara karim.aw...@kaust.edu.sa wrote:
Hi,
I am running a JUnit test on Hadoop 2.2.0 in Eclipse on Mac OS X. Whenever I
run the test, I am faced with the following
Yes. Nothing yet. I should mention I compiled Hadoop 2.2 from source using
Maven on a single machine (Mac OS X). It seems that whatever I do with the
permissions, the error persists.
On Wed, Dec 18, 2013 at 2:24 AM, Ted Yu yuzhih...@gmail.com wrote:
Have you set
What is weird is that it has the right permissions in the directories
below, but for some reason build/test/data/dfs/data ended up with
-- permission access
On Wed, Dec 18, 2013 at 2:35 AM, Karim Awara karim.aw...@kaust.edu.sa wrote:
Yes.
You have to start Eclipse from an environment that has the correct umask
set, otherwise it will not inherit the settings.
Open a terminal, run umask 022, start eclipse from that same terminal, and
re-try running the tests.
- André
On Wed, Dec 18, 2013 at 12:35 AM, Karim Awara karim.aw...@kaust.edu.sa wrote:
Still the same problem. If you notice, the unit test actually created the
directories up to $HOME_HDFS/build/test/data/dfs without problems at all.
I think that because MiniDFSCluster is emulating a cluster of one namenode
and two datanodes, it tries to create directories for the datanodes, and this is
I have relaxed it even further, so now it is 775:
kevin@devUbuntu05:/var/log/hadoop-0.20-mapreduce$ hadoop fs -ls -d /
Found 1 items
drwxrwxr-x - hdfs supergroup 0 2013-04-29 15:43 /
But I still get this error:
2013-04-30 07:43:02,520 FATAL
the
permission to 775 so that the group would also have write permission, but
that didn't seem to help.
From: Mohammad Tariq [mailto:donta...@gmail.com]
Sent: Tuesday, April 30, 2013 8:20 AM
To: Kevin Burton
Subject: Re: Permission problem
user? ls shows hdfs and the log says mapred..
Warm
for hadoop hdfs and mr. Ideas?
From: Kevin Burton [mailto:rkevinbur...@charter.net]
Sent: Tuesday, April 30, 2013 8:31 AM
To: user@hadoop.apache.org
Cc: 'Mohammad Tariq'
Subject: RE: Permission problem
That is what I perceive as the problem. The hdfs file system was created with
the user ‘hdfs’ owning the root (‘/’) but for some reason with a M/R job the
user ‘mapred’ needs to have write permission to the root. I don’t know
user=mapred, access=WRITE, inode=/:hdfs:supergroup:drwxrwxr-x
. . . . . .
From: Arpit Gupta [mailto:ar...@hortonworks.com]
Sent: Tuesday, April 30, 2013 9:25 AM
To: user@hadoop.apache.org
Subject: Re: Permission problem
what is your mapred.system.dir set to in mapred-site.xml?
By default it will write to /tmp on hdfs.
So you can do the following: create /tmp on hdfs and chmod it to 777 as
user hdfs, and then restart the jobtracker.
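Spelled out as commands, that suggestion might look like the following (a sketch, not tested against this cluster; the service name is a guess based on the hadoop-0.20-mapreduce log path seen earlier):

```shell
# Create /tmp on HDFS and open it to all users, acting as the 'hdfs'
# superuser, then restart the JobTracker so it re-creates its system
# directory under the now-writable /tmp.
sudo -u hdfs hadoop fs -mkdir /tmp
sudo -u hdfs hadoop fs -chmod 777 /tmp
# Service name is an assumption for a CDH-style hadoop-0.20-mapreduce
# install; adjust to however your JobTracker is managed.
sudo service hadoop-0.20-mapreduce-jobtracker restart
```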
To: Kevin Burton
Cc: user@hadoop.apache.org
Subject: Re: Permission problem
Based on the logs your system dir is set to
hdfs://devubuntu05:9000/data/hadoop/tmp/hadoop-mapred/mapred/system
what is your fs.default.name and hadoop.tmp.dir in core-site.xml set to?
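For reference, a core-site.xml of the shape Arpit is asking about might look like this; the values are inferred from the system dir in the logs above and are placeholders, not Kevin's actual config:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://devubuntu05:9000</value>
  </property>
  <property>
    <!-- If hadoop.tmp.dir is set like this, mapred.system.dir
         (${hadoop.tmp.dir}/mapred/system) resolves to a path under
         it, matching the system dir reported in the logs. -->
    <name>hadoop.tmp.dir</name>
    <value>/data/hadoop/tmp/hadoop-${user.name}</value>
  </property>
</configuration>
```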
--
Arpit Gupta
To: user@hadoop.apache.org
Subject: Re: Permission problem
ah
this is what mapred.system.dir defaults to:
<property>
  <name>mapred.system.dir</name>
  <value>${hadoop.tmp.dir}/mapred/system</value>
  <description>The directory where MapReduce stores control files.
  </description>
</property>
So that's why it's
From: Arpit Gupta [mailto:ar...@hortonworks.com]
Sent: Tuesday, April 30, 2013 10:48 AM
To: Kevin Burton
Cc: user@hadoop.apache.org
Subject: Re: Permission problem
It looks like hadoop.tmp.dir is being used both for local and hdfs
directories. Can you create a jira for this?
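Arpit's concrete recommendation is cut off in the archive, but a common workaround for this collision was to stop deriving mapred.system.dir from hadoop.tmp.dir and set it explicitly; a hypothetical mapred-site.xml fragment:

```xml
<!-- Hypothetical mapred-site.xml fragment: give MapReduce its own
     HDFS system dir instead of deriving it from hadoop.tmp.dir, so
     the local tmp dir and the HDFS dir no longer share a path. -->
<property>
  <name>mapred.system.dir</name>
  <value>/mapred/system</value>
</property>
```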
What I recommended is that you create
tried failed.
From: Arpit Gupta [mailto:ar...@hortonworks.com]
Sent: Tuesday, April 30, 2013 11:02 AM
To: user@hadoop.apache.org
Subject: Re: Permission problem
https://issues.apache.org/jira/browse/HADOOP and select create issue.
Set the affect version to the release you are testing and add some basic
description.
Here