Hi, Spark users.

Currently I need to create multiple tables in HDFS using the Spark API.

Each table needs to be owned by a different user.


For example, table01 is owned by user01, table02 by user02, and so on, like
below.


path                      | owner:group  | permission
/data/table01/            | user01:spark | 770
/data/table01/_SUCCESS    | user01:spark | 770
/data/table01/part_xxxxx  | user01:spark | 770
/data/table01/part_xxxxx  | user01:spark | 770
...
/data/table02/            | user02:spark | 770
...
/data/table03/            | user03:spark | 770
...
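The writes themselves are plain DataFrame saves. Roughly what each per-table job does (simplified; the schema and the table01 path are just placeholders):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-owner-tables")
      .getOrCreate()

    // Placeholder data; the real job writes the actual table contents.
    val df = spark.range(0, 1000).toDF("id")

    // Each table goes to its own directory under /data/.
    df.write.mode(SaveMode.Overwrite).parquet("/data/table01")

    spark.stop()
  }
}
```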




I tried using the UGI (UserGroupInformation) to create them. The directories
were created with the owners I expected, but the files (part_xxxxx) were
owned by the user that launched the Spark application.
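For reference, my UGI attempt looks roughly like this (simplified; user01 is an example proxy user, and this assumes the launching user is configured as a proxy user in core-site.xml via the hadoop.proxyuser.* settings):

```scala
import java.security.PrivilegedExceptionAction

import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.sql.{DataFrame, SaveMode}

object WriteAsProxyUser {
  // Run the write wrapped in doAs so HDFS calls made on the driver
  // are issued as the proxy user (user01 here is a placeholder).
  def writeAs(user: String, df: DataFrame, path: String): Unit = {
    val ugi = UserGroupInformation.createProxyUser(
      user, UserGroupInformation.getLoginUser)

    ugi.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit = {
        df.write.mode(SaveMode.Overwrite).parquet(path)
      }
    })
  }
}
```

My guess at what's happening: doAs on the driver covers the directory creation, but the part files are written by the executors, which run as the user that launched the application, which would explain the ownership I'm seeing.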


Is it possible to do what I want?
Kwangsun Noh
