andygrove opened a new issue, #2713: URL: https://github.com/apache/datafusion-comet/issues/2713
### Describe the bug

Spark SQL test failure when I enable `spark.comet.sparkToColumnar.enabled`:

```
[info] - SPARK-7595: Window will cause resolve failed with self join *** FAILED *** (116 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 158.0 failed 1 times, most recent failure: Lost task 0.0 in stage 158.0 (TID 108) (10.0.0.118 executor driver): org.apache.comet.CometNativeException: External error: Arrow error: Invalid argument error: must either specify a row count or at least one column
[info]   at org.apache.comet.Native.executePlan(Native Method)
[info]   at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2(CometExecIterator.scala:151)
[info]   at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2$adapted(CometExecIterator.scala:150)
[info]   at org.apache.comet.vector.NativeUtil.getNextBatch(NativeUtil.scala:212)
[info]   at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1(CometExecIterator.scala:150)
[info]   at org.apache.comet.Tracing$.withTrace(Tracing.scala:31)
[info]   at org.apache.comet.CometExecIterator.getNextBatch(CometExecIterator.scala:148)
[info]   at org.apache.comet.CometExecIterator.hasNext(CometExecIterator.scala:204)
[info]   at org.apache.spark.sql.comet.execution.shuffle.CometNativeShuffleWriter.write(CometNativeShuffleWriter.scala:110)
[info]   at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
```

### Steps to reproduce

_No response_

### Expected behavior

_No response_

### Additional context

_No response_
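For context on the error itself: `must either specify a row count or at least one column` is the message arrow-rs raises when a `RecordBatch` is constructed with zero columns and no explicit row count, which typically happens when an operator produces a batch with an empty output schema (e.g. a projection that drops every column). Below is a minimal Rust sketch, assuming only the `arrow` crate and not Comet's actual code path, that reproduces the error and shows the `RecordBatchOptions` workaround:

```rust
use std::sync::Arc;

use arrow::datatypes::Schema;
use arrow::error::ArrowError;
use arrow::record_batch::{RecordBatch, RecordBatchOptions};

fn main() -> Result<(), ArrowError> {
    // A batch with no columns, e.g. from a plan whose output schema is empty.
    let schema = Arc::new(Schema::empty());

    // With zero columns Arrow cannot infer the row count, so this fails with:
    // "Invalid argument error: must either specify a row count or at least one column"
    let err = RecordBatch::try_new(Arc::clone(&schema), vec![]).unwrap_err();
    println!("{err}");

    // Supplying an explicit row count via RecordBatchOptions avoids the error.
    let options = RecordBatchOptions::new().with_row_count(Some(10));
    let batch = RecordBatch::try_new_with_options(schema, vec![], &options)?;
    assert_eq!(batch.num_rows(), 10);

    Ok(())
}
```

If that is the failure mode here, the fix would presumably be to pass the row count explicitly wherever the `sparkToColumnar` path builds its output batches, though I have not confirmed where the offending batch is created.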