Max Gekk created SPARK-41015:
--------------------------------

             Summary: Failure of ProtobufCatalystDataConversionSuite.scala
                 Key: SPARK-41015
                 URL: https://issues.apache.org/jira/browse/SPARK-41015
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.4.0
            Reporter: Max Gekk


To reproduce the issue, set the seed to 38:

{code:diff}
diff --git a/connector/protobuf/src/test/scala/org/apache/spark/sql/protobuf/ProtobufCatalystDataConversionSuite.scala b/connector/protobuf/src/test/scala/org/apache/spark/sql/protobuf/ProtobufCatalystDataConversionSuite.scala
index 271c5b0fec..080bf1eb1f 100644
--- a/connector/protobuf/src/test/scala/org/apache/spark/sql/protobuf/ProtobufCatalystDataConversionSuite.scala
+++ b/connector/protobuf/src/test/scala/org/apache/spark/sql/protobuf/ProtobufCatalystDataConversionSuite.scala
@@ -123,7 +123,7 @@ class ProtobufCatalystDataConversionSuite
     StringType -> ("StringMsg", ""))

   testingTypes.foreach { dt =>
-    val seed = 1 + scala.util.Random.nextInt((1024 - 1) + 1)
+    val seed = 38
     test(s"single $dt with seed $seed") {

       val (messageName, defaultValue) = catalystTypesToProtoMessages(dt.fields(0).dataType)
{code}
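For context on the patch above: the original line draws a fresh random seed on every run, so a failing input may never recur; pinning the seed makes the failure deterministic. A minimal standalone sketch (plain Scala, no Spark dependencies; `SeedDemo` and `generate` are hypothetical names, not from the suite) of why a fixed seed replays the same generated data:

```scala
import scala.util.Random

object SeedDemo {
  // A fixed seed makes the pseudo-random sequence fully deterministic,
  // so the data generated from it (and any failure it triggers) can be replayed.
  def generate(seed: Int, n: Int): Seq[Double] = {
    val rng = new Random(seed)
    Seq.fill(n)(rng.nextDouble())
  }

  def main(args: Array[String]): Unit = {
    // Two runs with the same seed produce identical data...
    assert(generate(38, 5) == generate(38, 5))
    // ...while a different seed generally does not.
    assert(generate(38, 5) != generate(39, 5))
    println("deterministic with fixed seed")
  }
}
```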

and run the test:

{code}
build/sbt "test:testOnly *ProtobufCatalystDataConversionSuite"
{code}
which fails with a NullPointerException:
{code}
[info] - single StructType(StructField(double_type,DoubleType,true)) with seed 38 *** FAILED *** (10 milliseconds)
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.sql.protobuf.ProtobufCatalystDataConversionSuite.$anonfun$new$2(ProtobufCatalystDataConversionSuite.scala:134)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
{code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
