Re: Hadoop setup questions

2009-02-13 Thread Rasit OZDAS
With this configuration, any user in that group will be able
to write to any location.
(I've only tried this on a local network, though.)


2009/2/14 Rasit OZDAS :
> I agree with Amar and James,
>
> if you require permissions for your project, then:
> 1. Create a group in Linux for your users.
> 2. Give the group write access to all files in HDFS (something like
> hadoop dfs -chmod -R g+w /  - I'm not totally sure of the exact command).
> 3. Change the group ownership of all files in HDFS (something like
> hadoop dfs -chgrp -R <group> /  - again, I'm not totally sure).
>
> cheers,
> Rasit
>
>
> 2009/2/12 james warren :
>> Like Amar said.  Try adding
>>
>> <property>
>>   <name>dfs.permissions</name>
>>   <value>false</value>
>> </property>
>>
>>
>> to your conf/hadoop-site.xml file (or flip the value in hadoop-default.xml),
>> restart your daemons and give it a whirl.
>>
>> cheers,
>> -jw
>>
>> On Wed, Feb 11, 2009 at 8:44 PM, Amar Kamat  wrote:
>>
>>> bjday wrote:
>>>
 Good morning everyone,

 I have a question about the correct setup for Hadoop.  I have 14 Dell
 computers in a lab.  Each is connected to the internet and independent of
 the others.  All run CentOS.  Logins are handled by NIS.  If UserA logs into
 the master and starts the daemons, and UserB logs into the master and wants
 to run a job while UserA's daemons are still running, the following
 error occurs:

 copyFromLocal: org.apache.hadoop.security.AccessControlException:
 Permission denied: user=UserB, access=WRITE,
 inode="user":UserA:supergroup:rwxr-xr-x

>>> Looks like one of your files (input or output) is owned by a different user.
>>> It seems your DFS has permissions enabled. If you don't require permissions,
>>> disable them; otherwise make sure that the input/output paths are somewhere
>>> you have permission to write (/user/UserB is the home directory for UserB).
>>> Amar
>>>
>>>
 What needs to be changed to allow UserB-UserZ to run their jobs?  Does
 there need to be a local user that everyone logs in as and runs from?
 Should Hadoop be run on an actual cluster instead of independent
 computers?  Any idea what the correct configuration settings are to allow
 this?

 I followed Ravi Phulari's suggestions and worked through:

 http://hadoop.apache.org/core/docs/current/quickstart.html

 http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Multi-Node_Cluster)

 http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)


 These allowed me to get Hadoop running on the 14 computers when I log in,
 and everything works fine; thank you, Ravi.  The problem occurs when
 additional people attempt to run jobs simultaneously.

 Thank you,

 Brian


>>>
>>
>
>
>
> --
> M. Raşit ÖZDAŞ
>



-- 
M. Raşit ÖZDAŞ


Re: Hadoop setup questions

2009-02-13 Thread Rasit OZDAS
I agree with Amar and James,

if you require permissions for your project, then:
1. Create a group in Linux for your users.
2. Give the group write access to all files in HDFS (something like
hadoop dfs -chmod -R g+w /  - I'm not totally sure of the exact command).
3. Change the group ownership of all files in HDFS (something like
hadoop dfs -chgrp -R <group> /  - again, I'm not totally sure; see the
sketch below).
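
A rough sketch of those three steps, in case it helps. The group name
"hadoopusers" is made up, the HDFS commands use the old "hadoop dfs"
syntax from the 0.18/0.19 docs, and they should be run by the user who
started the daemons (that account is the HDFS superuser). Since logins
here come from NIS, the group would really be created in the NIS maps;
the local groupadd/usermod lines are only illustrative.

  # 1. Create a group and put the lab users in it (with NIS you would add
  #    the group and its members to the NIS maps instead of doing it locally).
  groupadd hadoopusers
  usermod -a -G hadoopusers UserA
  usermod -a -G hadoopusers UserB    # ...and so on for the other users

  # 2. Give that group write access to everything in HDFS.
  hadoop dfs -chmod -R g+w /

  # 3. Change the group ownership of everything in HDFS to that group.
  hadoop dfs -chgrp -R hadoopusers /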

cheers,
Rasit


2009/2/12 james warren :
> Like Amar said.  Try adding
>
> <property>
>   <name>dfs.permissions</name>
>   <value>false</value>
> </property>
>
>
> to your conf/hadoop-site.xml file (or flip the value in hadoop-default.xml),
> restart your daemons and give it a whirl.
>
> cheers,
> -jw
>
> On Wed, Feb 11, 2009 at 8:44 PM, Amar Kamat  wrote:
>
>> bjday wrote:
>>
>>> Good morning everyone,
>>>
>>> I have a question about the correct setup for Hadoop.  I have 14 Dell
>>> computers in a lab.  Each is connected to the internet and independent of
>>> the others.  All run CentOS.  Logins are handled by NIS.  If UserA logs into
>>> the master and starts the daemons, and UserB logs into the master and wants
>>> to run a job while UserA's daemons are still running, the following
>>> error occurs:
>>>
>>> copyFromLocal: org.apache.hadoop.security.AccessControlException:
>>> Permission denied: user=UserB, access=WRITE,
>>> inode="user":UserA:supergroup:rwxr-xr-x
>>>
>> Looks like one of your files (input or output) is owned by a different user.
>> It seems your DFS has permissions enabled. If you don't require permissions,
>> disable them; otherwise make sure that the input/output paths are somewhere
>> you have permission to write (/user/UserB is the home directory for UserB).
>> Amar
>>
>>
>>> What needs to be changed to allow UserB-UserZ to run their jobs?  Does
>>> there need to be a local user that everyone logs in as and runs from?
>>> Should Hadoop be run on an actual cluster instead of independent computers?
>>> Any idea what the correct configuration settings are to allow this?
>>>
>>> I followed Ravi Phulari's suggestions and worked through:
>>>
>>> http://hadoop.apache.org/core/docs/current/quickstart.html
>>>
>>> http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Multi-Node_Cluster)
>>>
>>> http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)
>>>
>>>
>>> These allowed me to get Hadoop running on the 14 computers when I log in,
>>> and everything works fine; thank you, Ravi.  The problem occurs when
>>> additional people attempt to run jobs simultaneously.
>>>
>>> Thank you,
>>>
>>> Brian
>>>
>>>
>>
>



-- 
M. Raşit ÖZDAŞ


Re: Hadoop setup questions

2009-02-11 Thread james warren
Like Amar said.  Try adding


<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>


to your conf/hadoop-site.xml file (or flip the value in hadoop-default.xml),
restart your daemons and give it a whirl.
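
If it helps, here is roughly what that looks like on a 0.18/0.19-era
install. The stop-all.sh/start-all.sh scripts are the stock ones in the
Hadoop bin/ directory; the test path /user/UserA/permtest is just an
example.

  # On the master, after adding the property to conf/hadoop-site.xml:
  bin/stop-all.sh
  bin/start-all.sh

  # Quick check as a second user (e.g. UserB): with dfs.permissions set to
  # false this write should now succeed instead of throwing
  # AccessControlException.
  hadoop dfs -put /etc/hosts /user/UserA/permtest
  hadoop dfs -rm /user/UserA/permtest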

cheers,
-jw

On Wed, Feb 11, 2009 at 8:44 PM, Amar Kamat  wrote:

> bjday wrote:
>
>> Good morning everyone,
>>
>> I have a question about the correct setup for Hadoop.  I have 14 Dell
>> computers in a lab.  Each is connected to the internet and independent of
>> the others.  All run CentOS.  Logins are handled by NIS.  If UserA logs into
>> the master and starts the daemons, and UserB logs into the master and wants
>> to run a job while UserA's daemons are still running, the following
>> error occurs:
>>
>> copyFromLocal: org.apache.hadoop.security.AccessControlException:
>> Permission denied: user=UserB, access=WRITE,
>> inode="user":UserA:supergroup:rwxr-xr-x
>>
> Looks like one of your files (input or output) is owned by a different user.
> It seems your DFS has permissions enabled. If you don't require permissions,
> disable them; otherwise make sure that the input/output paths are somewhere
> you have permission to write (/user/UserB is the home directory for UserB).
> Amar
>
>
>> What needs to be changed to allow UserB-UserZ to run their jobs?  Does
>> there need to be a local user that everyone logs in as and runs from?
>> Should Hadoop be run on an actual cluster instead of independent computers?
>> Any idea what the correct configuration settings are to allow this?
>>
>> I followed Ravi Phulari's suggestions and worked through:
>>
>> http://hadoop.apache.org/core/docs/current/quickstart.html
>>
>> http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Multi-Node_Cluster)
>>
>> http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)
>>
>>
>> These allowed me to get Hadoop running on the 14 computers when I log in,
>> and everything works fine; thank you, Ravi.  The problem occurs when
>> additional people attempt to run jobs simultaneously.
>>
>> Thank you,
>>
>> Brian
>>
>>
>


Re: Hadoop setup questions

2009-02-11 Thread Amar Kamat

bjday wrote:

Good morning everyone,

I have a question about the correct setup for Hadoop.  I have 14 Dell
computers in a lab.  Each is connected to the internet and independent
of the others.  All run CentOS.  Logins are handled by NIS.  If UserA
logs into the master and starts the daemons, and UserB logs into the
master and wants to run a job while UserA's daemons are still running,
the following error occurs:


copyFromLocal: org.apache.hadoop.security.AccessControlException: 
Permission denied: user=UserB, access=WRITE, 
inode="user":UserA:supergroup:rwxr-xr-x
Looks like one of your files (input or output) is owned by a different
user. It seems your DFS has permissions enabled. If you don't require
permissions, disable them; otherwise make sure that the input/output
paths are somewhere you have permission to write (/user/UserB is the
home directory for UserB).
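
If you go the keep-permissions route, a small sketch of the setup on a
0.18/0.19-era cluster: run these as the user who started the HDFS daemons
(that account is the HDFS superuser). Only /user/UserB and the group name
supergroup are taken from the error message above; the rest is
illustrative.

  # Give UserB a home directory in HDFS and hand over ownership, so UserB's
  # input/output paths live under /user/UserB instead of under UserA's tree.
  hadoop dfs -mkdir /user/UserB
  hadoop dfs -chown UserB:supergroup /user/UserB

  # UserB can then verify:
  hadoop dfs -ls /user/UserB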

Amar


What needs to be changed to allow UserB-UserZ to run their jobs?  Does
there need to be a local user that everyone logs in as and runs from?
Should Hadoop be run on an actual cluster instead of independent
computers?  Any idea what the correct configuration settings are to
allow this?


I followed Ravi Phulari's suggestions and worked through:

http://hadoop.apache.org/core/docs/current/quickstart.html
http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Multi-Node_Cluster)
http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)



These allowed me to get Hadoop running on the 14 computers when I log
in, and everything works fine; thank you, Ravi.  The problem occurs
when additional people attempt to run jobs simultaneously.


Thank you,

Brian





Hadoop setup questions

2009-02-11 Thread bjday

Good morning everyone,

I have a question about the correct setup for Hadoop.  I have 14 Dell
computers in a lab.  Each is connected to the internet and independent
of the others.  All run CentOS.  Logins are handled by NIS.  If UserA
logs into the master and starts the daemons, and UserB logs into the
master and wants to run a job while UserA's daemons are still running,
the following error occurs:


copyFromLocal: org.apache.hadoop.security.AccessControlException: 
Permission denied: user=UserB, access=WRITE, 
inode="user":UserA:supergroup:rwxr-xr-x


What needs to be changed to allow UserB-UserZ to run their jobs?  Does
there need to be a local user that everyone logs in as and runs from?
Should Hadoop be run on an actual cluster instead of independent
computers?  Any idea what the correct configuration settings are to
allow this?


I followed Ravi Phulari's suggestions and worked through:

http://hadoop.apache.org/core/docs/current/quickstart.html
http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Multi-Node_Cluster) 

http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster) 



These allowed me to get Hadoop running on the 14 computers when I log
in, and everything works fine; thank you, Ravi.  The problem occurs when
additional people attempt to run jobs simultaneously.


Thank you,

Brian