pilietis93 commented on PR #5468:
URL: https://github.com/apache/kyuubi/pull/5468#issuecomment-1906032156
@xza-m I found the reason why you're not able to reproduce this.
```
val df0 = spark.table("default.src").selectExpr("'col0' col0", "'col1' col1")
doAs(
  "bob",
  assert(
    df0.as("a").join(
      right = df0.as("b"),
      joinExprs = $"a.col0" === $"b.col0" && $"a.col1" === $"b.col1",
      joinType = "left_outer").collect() ===
      Seq(Row("col0", "col1", "col0", "col1"))))
```
Your test uses derived columns as join keys, which are not part of the source table. That produces a different plan, which doesn't trigger the bug. I can reproduce it by joining on the `key` column instead:
```
doAs(
  "bob", {
    val df0 = spark.table("default.src")
    val df1 = df0.as("a").join(
      right = df0.as("b"),
      joinExprs = $"a.key" === $"b.key",
      joinType = "left_outer")
    df1.explain(true) // crash
  }
)
```
fails with
```
java.lang.ClassCastException: org.apache.kyuubi.plugin.spark.authz.ranger.datamasking.DataMaskingStage1Marker cannot be cast to org.apache.spark.sql.catalyst.plans.logical.Join
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]