LuciferYang commented on PR #42069:
URL: https://github.com/apache/spark/pull/42069#issuecomment-1649433054

   > Checked the Maven tests with this PR. There are `10 TESTS FAILED`; further investigation is needed to confirm whether all of them are related to this PR:
   > 
   > To reproduce, run:
   > 
   > ```
   > build/mvn clean install -DskipTests -Phive
   > build/mvn clean test -pl connector/connect/client/jvm
   > ```
   > 
   > ```
   > FlatMapGroupsWithStateStreamingSuite:
   > - flatMapGroupsWithState - streaming *** FAILED ***
   >   org.apache.spark.SparkException: RST_STREAM closed stream. HTTP/2 error code: PROTOCOL_ERROR
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
   >   at scala.collection.Iterator.toStream(Iterator.scala:1417)
   >   at scala.collection.Iterator.toStream$(Iterator.scala:1416)
   >   at scala.collection.AbstractIterator.toStream(Iterator.scala:1431)
   >   at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:354)
   >   at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:354)
   >   at scala.collection.AbstractIterator.toSeq(Iterator.scala:1431)
   >   ...
   > - flatMapGroupsWithState - streaming - with initial state *** FAILED ***
   >   org.apache.spark.SparkException: RST_STREAM closed stream. HTTP/2 error code: PROTOCOL_ERROR
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
   >   at scala.collection.Iterator.toStream(Iterator.scala:1417)
   >   at scala.collection.Iterator.toStream$(Iterator.scala:1416)
   >   at scala.collection.AbstractIterator.toStream(Iterator.scala:1431)
   >   at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:354)
   >   at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:354)
   >   at scala.collection.AbstractIterator.toSeq(Iterator.scala:1431)
   >   ...
   > - mapGroupsWithState - streaming *** FAILED ***
   >   org.apache.spark.SparkException: RST_STREAM closed stream. HTTP/2 error code: PROTOCOL_ERROR
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
   >   at scala.collection.Iterator.toStream(Iterator.scala:1417)
   >   at scala.collection.Iterator.toStream$(Iterator.scala:1416)
   >   at scala.collection.AbstractIterator.toStream(Iterator.scala:1431)
   >   at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:354)
   >   at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:354)
   >   at scala.collection.AbstractIterator.toSeq(Iterator.scala:1431)
   >   ...
   > - mapGroupsWithState - streaming - with initial state *** FAILED ***
   >   org.apache.spark.SparkException: RST_STREAM closed stream. HTTP/2 error code: PROTOCOL_ERROR
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
   >   at scala.collection.Iterator.toStream(Iterator.scala:1417)
   >   at scala.collection.Iterator.toStream$(Iterator.scala:1416)
   >   at scala.collection.AbstractIterator.toStream(Iterator.scala:1431)
   >   at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:354)
   >   at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:354)
   >   at scala.collection.AbstractIterator.toSeq(Iterator.scala:1431)
   >   ...
   > - flatMapGroupsWithState *** FAILED ***
   >   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 489.0 failed 1 times, most recent failure: Lost task 0.0 in stage 489.0 (TID 1997) (localhost executor driver): java.lang.ClassCastException: org.apache.spark.sql.ClickState cannot be cast to org.apache.spark.sql.ClickState
   >    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(generated.java:87)
   >    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   >    at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
   >    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchIterator.hasNext(ArrowConverters.scala:100)
   >    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
   >    at scala.collection.Iterator.foreach(Iterator.scala:943)
   >    at scala.collection.Iterator.foreach$(Iterator.scala:943)
   >    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   >    at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
   >    at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
   >    at scala.collection.TraversableOnce.to(TraversableOnce.scala:366)
   >    at scala.collection.TraversableOnce.to$(TraversableOnce.scala:364)
   >    at scala.collection.AbstractIterator.to(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:358)
   >    at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:358)
   >    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:345)
   >    at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:339)
   >    at scala.collection.AbstractIterator.toArray(Iterator.scala:1431)
   >    at org.apache.spark.sql.connect.execution.SparkConnectPlanExecution.$anonfun$processAsArrowBatches$4(Sp...
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:83)
   >   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:153)
   >   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:183)
   >   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   >   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   >   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   >   at org.apache.spark.sql.connect.client.util.QueryTest.checkDataset(QueryTest.scala:54)
   >   ...
   > - flatMapGroupsWithState - with initial state *** FAILED ***
   >   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 494.0 failed 1 times, most recent failure: Lost task 0.0 in stage 494.0 (TID 2006) (localhost executor driver): java.lang.ClassCastException: org.apache.spark.sql.ClickState cannot be cast to org.apache.spark.sql.ClickState
   >    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage3.processNext(generated.java:87)
   >    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   >    at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
   >    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchIterator.hasNext(ArrowConverters.scala:100)
   >    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
   >    at scala.collection.Iterator.foreach(Iterator.scala:943)
   >    at scala.collection.Iterator.foreach$(Iterator.scala:943)
   >    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   >    at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
   >    at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
   >    at scala.collection.TraversableOnce.to(TraversableOnce.scala:366)
   >    at scala.collection.TraversableOnce.to$(TraversableOnce.scala:364)
   >    at scala.collection.AbstractIterator.to(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:358)
   >    at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:358)
   >    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:345)
   >    at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:339)
   >    at scala.collection.AbstractIterator.toArray(Iterator.scala:1431)
   >    at org.apache.spark.sql.connect.execution.SparkConnectPlanExecution.$anonfun$processAsArrowBatches$4(Sp...
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:83)
   >   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:153)
   >   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:183)
   >   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   >   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   >   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   >   at org.apache.spark.sql.connect.client.util.QueryTest.checkDataset(QueryTest.scala:54)
   >   ...
   > - mapGroupsWithState *** FAILED ***
   >   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 497.0 failed 1 times, most recent failure: Lost task 0.0 in stage 497.0 (TID 2013) (localhost executor driver): java.lang.ClassCastException: org.apache.spark.sql.ClickState cannot be cast to org.apache.spark.sql.ClickState
   >    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(generated.java:87)
   >    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   >    at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
   >    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchIterator.hasNext(ArrowConverters.scala:100)
   >    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
   >    at scala.collection.Iterator.foreach(Iterator.scala:943)
   >    at scala.collection.Iterator.foreach$(Iterator.scala:943)
   >    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   >    at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
   >    at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
   >    at scala.collection.TraversableOnce.to(TraversableOnce.scala:366)
   >    at scala.collection.TraversableOnce.to$(TraversableOnce.scala:364)
   >    at scala.collection.AbstractIterator.to(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:358)
   >    at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:358)
   >    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:345)
   >    at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:339)
   >    at scala.collection.AbstractIterator.toArray(Iterator.scala:1431)
   >    at org.apache.spark.sql.connect.execution.SparkConnectPlanExecution.$anonfun$processAsArrowBatches$4(Sp...
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:83)
   >   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:153)
   >   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:183)
   >   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   >   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   >   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   >   at org.apache.spark.sql.connect.client.util.QueryTest.checkDataset(QueryTest.scala:54)
   >   ...
   > - mapGroupsWithState - with initial state *** FAILED ***
   >   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 502.0 failed 1 times, most recent failure: Lost task 0.0 in stage 502.0 (TID 2022) (localhost executor driver): java.lang.ClassCastException: org.apache.spark.sql.ClickState cannot be cast to org.apache.spark.sql.ClickState
   >    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage3.processNext(generated.java:87)
   >    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   >    at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
   >    at org.apache.spark.sql.execution.arrow.ArrowConverters$ArrowBatchIterator.hasNext(ArrowConverters.scala:100)
   >    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
   >    at scala.collection.Iterator.foreach(Iterator.scala:943)
   >    at scala.collection.Iterator.foreach$(Iterator.scala:943)
   >    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   >    at scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
   >    at scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
   >    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
   >    at scala.collection.TraversableOnce.to(TraversableOnce.scala:366)
   >    at scala.collection.TraversableOnce.to$(TraversableOnce.scala:364)
   >    at scala.collection.AbstractIterator.to(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:358)
   >    at scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:358)
   >    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1431)
   >    at scala.collection.TraversableOnce.toArray(TraversableOnce.scala:345)
   >    at scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:339)
   >    at scala.collection.AbstractIterator.toArray(Iterator.scala:1431)
   >    at org.apache.spark.sql.connect.execution.SparkConnectPlanExecution.$anonfun$processAsArrowBatches$4(Sp...
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   >   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   >   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:83)
   >   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:153)
   >   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:183)
   >   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   >   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   >   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   >   at org.apache.spark.sql.connect.client.util.QueryTest.checkDataset(QueryTest.scala:54)
   >   ...
   > - update class loader after stubbing: new session *** FAILED ***
   >   java.io.NotSerializableException: org.scalatest.Engine
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
   >   at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
   >   at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
   >   at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
   >   at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
   >   at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
   >   at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
   >   at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
   >   ...
   > - update class loader after stubbing: same session *** FAILED ***
   >   java.io.NotSerializableException: org.scalatest.Engine
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
   >   at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
   >   at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
   >   at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
   >   at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
   >   at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
   >   at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
   >   at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
   >   at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
   >   ...
   > *** 10 TESTS FAILED ***
   > ```
   
   All of these test failures occur only with this PR applied, but the PR does fix four test failures in `SparkSessionE2ESuite`. The repeated `org.apache.spark.sql.ClickState cannot be cast to org.apache.spark.sql.ClickState` errors suggest the same class is being loaded by two different classloaders, so several of the failures may share one root cause.
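
   For reference, a minimal sketch for re-running just one of the failing suites in isolation while investigating. It uses scalatest-maven-plugin's `-DwildcardSuites` with `-Dtest=none` to skip the Java tests; the suite's fully qualified name is an assumption here and may need adjusting:

   ```
   # Re-run only the failing streaming suite in the connect client module
   # (suite FQCN assumed from the log output above)
   build/mvn test -pl connector/connect/client/jvm \
     -Dtest=none \
     -DwildcardSuites=org.apache.spark.sql.FlatMapGroupsWithStateStreamingSuite
   ```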
   
   

