[ https://issues.apache.org/jira/browse/SPARK-46621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-46621:
---------------------------------
    Description: 
If the JVM throws an exception without a message, the message becomes null and
the Python side fails with:

{code}
  File "/.../pyspark/errors/exceptions/captured.py", line 88, in __str__
    desc = desc + "\n\nJVM stacktrace:\n%s" % self._stackTrace
TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
{code}
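The failure can be reproduced without a JVM: the string-building step concatenates the exception description with the stack trace, and a null JVM message arrives in Python as {{None}}. A minimal sketch of that step with a possible guard (the function and parameter names here are illustrative, not the actual captured.py code):

```python
def format_captured_error(desc, stack_trace):
    """Build the error string shown by a captured exception's __str__.

    `desc` mirrors the JVM exception message (None when getMessage
    returned null); `stack_trace` is the captured JVM stack trace.
    """
    # Guard: a null JVM message must not be concatenated as None.
    if desc is None:
        desc = ""
    if stack_trace is not None:
        # This is the concatenation that raised
        # "unsupported operand type(s) for +: 'NoneType' and 'str'".
        desc = desc + "\n\nJVM stacktrace:\n%s" % stack_trace
    return desc
```

With the guard, a message-less exception still renders its stack trace instead of raising a secondary TypeError.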

  was:
If the JVM throws an exception without a message, the message becomes null and
the Python side fails with:

{code}
pyspark.errors.exceptions.captured.UnsupportedOperationException: 
JVM stacktrace:
java.lang.UnsupportedOperationException
        at com.databricks.sql.acl.PlaceholderScimClient.getUserInfo(MockScimClient.scala:49)
        at com.databricks.sql.acl.InlineUserInfoExpressions.userInfo$lzycompute$1(InlineUserInfoExpressions.scala:73)
        at com.databricks.sql.acl.InlineUserInfoExpressions.com$databricks$sql$acl$InlineUserInfoExpressions$$userInfo$1(InlineUserInfoExpressions.scala:73)
        at com.databricks.sql.acl.InlineUserInfoExpressions$$anonfun$rewrite$2.$anonfun$applyOrElse$2(InlineUserInfoExpressions.scala:98)
        at scala.Option.getOrElse(Option.scala:189)
        at com.databricks.sql.acl.InlineUserInfoExpressions$$anonfun$rewrite$2.applyOrElse(InlineUserInfoExpressions.scala:98)
        at com.databricks.sql.acl.InlineUserInfoExpressions$$anonfun$rewrite$2.applyOrElse(InlineUserInfoExpressions.scala:84)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:473)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:473)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$3(TreeNode.scala:478)
        at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren(TreeNode.scala:1277)
        at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren$(TreeNode.scala:1276)
        at org.apache.spark.sql.catalyst.expressions.UnaryExpression.mapChildren(Expression.scala:656)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:478)
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsDownWithPruning$1(QueryPlan.scala:174)
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:215)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83)
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:215)
        at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:226)
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$3(QueryPlan.scala:231)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at scala.collection.TraversableLike.map(TraversableLike.scala:286)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
        at scala.collection.AbstractTraversable.map(Traversable.scala:108)
{code}


> Address null from Exception.getMessage in Py4J captured exception
> -----------------------------------------------------------------
>
>                 Key: SPARK-46621
>                 URL: https://issues.apache.org/jira/browse/SPARK-46621
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 4.0.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> If the JVM throws an exception without a message, the message becomes null
> and the Python side fails with:
> {code}
>   File "/.../pyspark/errors/exceptions/captured.py", line 88, in __str__
>     desc = desc + "\n\nJVM stacktrace:\n%s" % self._stackTrace
> TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
