danhuawang opened a new issue, #9180:
URL: https://github.com/apache/gravitino/issues/9180
### Version
main branch
### Describe what's wrong
A normal user has the following privileges:
```
{
  "code": 0,
  "role": {
    "name": "role1",
    "audit": {
      "creator": "anonymous",
      "createTime": "2025-11-19T10:07:37.471984Z",
      "lastModifier": "anonymous",
      "lastModifiedTime": "2025-11-19T10:10:07.063395Z"
    },
    "properties": {
      "k1": "v1"
    },
    "securableObjects": [
      {
        "type": "metalake",
        "privileges": [
          { "name": "use_schema", "condition": "allow" },
          { "name": "use_catalog", "condition": "allow" },
          { "name": "create_table", "condition": "allow" }
        ],
        "fullName": "test"
      }
    ]
  }
}
```
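For reference, the role above can be described by a request body like the following. This is a hypothetical sketch only: the REST route (e.g. `POST /api/metalakes/test/roles`) and the exact payload shape are assumptions to be checked against the Gravitino access-control documentation.

```python
import json

# Hypothetical sketch of the body used to create role1 with the three
# privileges shown above. The endpoint and field names are assumptions;
# consult the Gravitino REST API docs for the authoritative shape.
role_request = {
    "name": "role1",
    "properties": {"k1": "v1"},
    "securableObjects": [
        {
            "type": "metalake",
            "fullName": "test",
            "privileges": [
                {"name": "use_catalog", "condition": "allow"},
                {"name": "use_schema", "condition": "allow"},
                {"name": "create_table", "condition": "allow"},
            ],
        }
    ],
}

print(json.dumps(role_request, indent=2))
```

Note that no table-read privilege (such as a `select_table`-style privilege) is present, which matters for the failure below.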
but `CREATE TABLE` in Spark SQL fails because the user lacks the privilege for the underlying `loadTable` operation:
```
export SPARK_USER=normal
./spark-sql -v \
  --conf spark.plugins="org.apache.gravitino.spark.connector.plugin.GravitinoSparkPlugin" \
  --conf spark.sql.gravitino.uri=http://127.0.0.1:8090 \
  --conf spark.sql.gravitino.metalake=test \
  --conf spark.sql.gravitino.enableIcebergSupport=true \
  --conf spark.sql.gravitino.client.socketTimeoutMs=60000 \
  --conf spark.sql.gravitino.client.connectionTimeoutMs=60000 \
  --conf spark.sql.warehouse.dir=file:///tmp/
spark-sql ()> use dml;
use dml
Time taken: 0.049 seconds
spark-sql (dml)> CREATE TABLE test1 (id bigint COMMENT 'unique id');
CREATE TABLE test1 (id bigint COMMENT 'unique id')
25/11/19 18:10:31 ERROR SparkSQLDriver: Failed in [CREATE TABLE test1 (id bigint COMMENT 'unique id')]
org.apache.gravitino.exceptions.ForbiddenException: User 'normal' is not authorized to perform operation 'loadTable' on metadata 'test1'
    at org.apache.gravitino.client.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:394)
    at org.apache.gravitino.client.ErrorHandlers$TableErrorHandler.accept(ErrorHandlers.java:363)
    at org.apache.gravitino.client.HTTPClient.throwFailure(HTTPClient.java:236)
    at org.apache.gravitino.client.HTTPClient.execute(HTTPClient.java:388)
    at org.apache.gravitino.client.HTTPClient.execute(HTTPClient.java:296)
    at org.apache.gravitino.client.HTTPClient.get(HTTPClient.java:465)
    at org.apache.gravitino.client.RESTClient.get(RESTClient.java:165)
    at org.apache.gravitino.client.RelationalCatalog.loadTable(RelationalCatalog.java:120)
    at org.apache.gravitino.spark.connector.catalog.BaseCatalog.loadGravitinoTable(BaseCatalog.java:408)
    at org.apache.gravitino.spark.connector.catalog.BaseCatalog.loadTable(BaseCatalog.java:241)
    at org.apache.spark.sql.connector.catalog.TableCatalog.tableExists(TableCatalog.java:163)
    at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:42)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:98)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:118)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:195)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:103)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:65)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:94)
    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:94)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:81)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:79)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:640)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:630)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:671)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:651)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:67)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:415)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1(SparkSQLCLIDriver.scala:533)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1$adapted(SparkSQLCLIDriver.scala:527)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processLine(SparkSQLCLIDriver.scala:527)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:307)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
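The key frames are `TableCatalog.tableExists` calling `BaseCatalog.loadTable`: before executing `CREATE TABLE`, Spark probes for the table's existence via `loadTable`, and Spark's default `tableExists` only treats "table not found" as a normal outcome, so a `ForbiddenException` from the authorization layer propagates and aborts the command even though the user holds CREATE_TABLE. A minimal sketch of that control flow (class and function names here are stand-ins, not Gravitino or Spark code):

```python
# Stand-in exception types mirroring the names seen in the stack trace.
class NoSuchTableException(Exception):
    pass

class ForbiddenException(Exception):
    pass

def table_exists(load_table, ident):
    """Mirrors the shape of Spark's default TableCatalog.tableExists:
    probe via loadTable and treat only "table not found" as absence."""
    try:
        load_table(ident)
        return True
    except NoSuchTableException:
        return False
    # Anything else (e.g. ForbiddenException) propagates and aborts
    # CREATE TABLE, which is the failure reported in this issue.

def forbidden_loader(ident):
    # Simulates the Gravitino server denying loadTable for user 'normal'.
    raise ForbiddenException(
        f"User 'normal' is not authorized to perform operation "
        f"'loadTable' on metadata '{ident}'")

def absent_loader(ident):
    # Simulates a table that simply does not exist yet.
    raise NoSuchTableException(ident)
```

One possible direction for a fix is to let CREATE_TABLE cover the pre-create existence probe, or to map the denial to a not-found result for users who cannot see the table; which layer should change is a judgment for the maintainers.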
### Error message and/or stacktrace
See the stack trace above.
### How to reproduce
1. Add the `normal` user to the metalake `test`
2. Create `role1` with the privileges USE_CATALOG, USE_SCHEMA, and CREATE_TABLE
3. Grant `role1` to the `normal` user
4. `export SPARK_USER=normal`, then create a table through spark-sql:
```
./spark-sql -v \
  --conf spark.plugins="org.apache.gravitino.spark.connector.plugin.GravitinoSparkPlugin" \
  --conf spark.sql.gravitino.uri=http://127.0.0.1:8090 \
  --conf spark.sql.gravitino.metalake=test \
  --conf spark.sql.gravitino.enableIcebergSupport=true \
  --conf spark.sql.gravitino.client.socketTimeoutMs=60000 \
  --conf spark.sql.gravitino.client.connectionTimeoutMs=60000 \
  --conf spark.sql.warehouse.dir=file:///tmp/
spark-sql (default)> use catalog_pg;
use catalog_pg
25/11/19 18:09:40 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Time taken: 0.659 seconds
spark-sql ()> use dml;
use dml
Time taken: 0.049 seconds
spark-sql (dml)> CREATE TABLE test1 (id bigint COMMENT 'unique id');
CREATE TABLE test1 (id bigint COMMENT 'unique id')
25/11/19 18:10:31 ERROR SparkSQLDriver: Failed in [CREATE TABLE test1 (id bigint COMMENT 'unique id')]
org.apache.gravitino.exceptions.ForbiddenException: User 'normal' is not authorized to perform operation 'loadTable' on metadata 'test1'
```
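Until this is resolved, one possible workaround is to grant the role a table-read privilege in addition to `create_table`, so that the pre-create `loadTable` probe is authorized. The privilege name `select_table` below is an assumption based on Gravitino's naming pattern; verify it against the access-control documentation before relying on it. A sketch of the amended privilege list:

```python
# Privileges role1 currently has (from the JSON at the top of this issue).
current = ["use_catalog", "use_schema", "create_table"]

# Hypothetical workaround: also allow reading tables so Spark's existence
# probe succeeds. "select_table" is an assumed privilege name; confirm it
# in the Gravitino access-control docs.
workaround = current + ["select_table"]

privileges = [{"name": n, "condition": "allow"} for n in workaround]
print(privileges)
```

Granting broader read access than strictly needed is of course only a stopgap; the privilege model should ideally not require it for `CREATE TABLE` to work.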
### Additional context
_No response_