Re: Is it enable to use Multiple UGIs in One Spark Context?

2021-03-25 Thread Yuri Oleynikov (יורי אולייניקוב)
I think that submitting the Spark job on behalf of user01 will solve the problem. You may also try setting the sticky bit on the /data/user01/rdd folder if you want to allow multiple users to write to /data/user01/rdd at the same time, but I'd not recommend allowing multiple users to write to the same directory.
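The sticky-bit suggestion above can be sketched with the Hadoop FileSystem API. This is a minimal illustration, not code from the thread; the path /data/user01/rdd comes from the reply, and it assumes the default Hadoop configuration points at the target cluster.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.fs.permission.FsPermission;

public class StickyBitExample {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // rwxrwxrwx plus the sticky bit: any user may create files in the
        // directory, but only a file's owner may delete or rename it.
        FsPermission perm =
            new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL, true);
        fs.setPermission(new Path("/data/user01/rdd"), perm);
    }
}
```

The same effect can be had from the command line with `hdfs dfs -chmod 1777 /data/user01/rdd`.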

Re: Is it enable to use Multiple UGIs in One Spark Context?

2021-03-25 Thread Kwangsun Noh
Thank you for the first answer to my question. Unfortunately, I have to make totally different tables, and it is not possible to make only one table via UGI. --- Below is the sample code I wrote. org.apache.hadoop.security.UserGroupInformation.createRemoteUser("user01").doAs(new
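The quoted snippet is cut off in the archive. A plausible completion of the createRemoteUser/doAs pattern it starts looks like the sketch below; this is an assumption about the shape of the original code, not the author's actual sample. Note the caveat in the comment, which is the likely reason the approach struggles: doAs only changes the identity on the driver thread that calls it.

```java
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.security.UserGroupInformation;

public class MultiUgiWrite {
    public static void main(String[] args) throws Exception {
        UserGroupInformation ugi =
            UserGroupInformation.createRemoteUser("user01");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            // Code run here carries user01's identity, but only on this
            // thread of the driver JVM; tasks on Spark executors keep the
            // identity the application was originally submitted with.
            System.out.println("running as: "
                + UserGroupInformation.getCurrentUser().getShortUserName());
            return null;
        });
    }
}
```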

Re: Is it enable to use Multiple UGIs in One Spark Context?

2021-03-25 Thread Yuri Oleynikov (יורי אולייניקוב)
Assuming that all tables have the same schema, you can make one global table partitioned by some column, then apply specific user/group/other permissions or ACLs per partition subdirectory. > On 25 Mar 2021, at 15:13, Kwangsun Noh wrote: > >  > Hi, Spark users. > > Currently I have to make multiple tables
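Granting one user access to one partition subdirectory, as suggested above, can be sketched with HDFS ACLs. The table path and partition column here are hypothetical examples, and the cluster must have `dfs.namenode.acls.enabled=true` for this to work.

```java
import java.util.Collections;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclEntry;
import org.apache.hadoop.fs.permission.AclEntryScope;
import org.apache.hadoop.fs.permission.AclEntryType;
import org.apache.hadoop.fs.permission.FsAction;

public class PartitionAclExample {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Grant user01 rwx on its own partition subdirectory only;
        // other partitions keep their existing permissions.
        AclEntry entry = new AclEntry.Builder()
            .setScope(AclEntryScope.ACCESS)
            .setType(AclEntryType.USER)
            .setName("user01")
            .setPermission(FsAction.ALL)
            .build();
        fs.modifyAclEntries(
            new Path("/warehouse/global_table/owner=user01"),
            Collections.singletonList(entry));
    }
}
```

The command-line equivalent would be `hdfs dfs -setfacl -m user:user01:rwx /warehouse/global_table/owner=user01`.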

Is it enable to use Multiple UGIs in One Spark Context?

2021-03-25 Thread Kwangsun Noh
Hi, Spark users. Currently I have to make multiple tables in HDFS using the Spark API. The tables need to be made by different users. For example, table01 is owned by user01 and table02 is owned by user02, like below. path | owner:group | permission