jatin510 opened a new issue, #1347:
URL: https://github.com/apache/datafusion-comet/issues/1347
### Describe the bug
While testing `array_repeat`, the following statement fails:

> checkSparkAnswerAndOperator(sql("SELECT array_repeat(_2, _4) from t1 where _4 is null"))
```
- array_repeat *** FAILED *** (5 seconds, 334 milliseconds)
  org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 2) (192.168.1.20 executor driver): org.apache.comet.CometNativeException: Compute error: concat requires input of at least one array
        at org.apache.comet.Native.executePlan(Native Method)
        at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1(CometExecIterator.scala:127)
        at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1$adapted(CometExecIterator.scala:125)
        at org.apache.comet.vector.NativeUtil.getNextBatch(NativeUtil.scala:157)
        at org.apache.comet.CometExecIterator.getNextBatch(CometExecIterator.scala:125)
        at org.apache.comet.CometExecIterator.hasNext(CometExecIterator.scala:146)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.cometcolumnartorow_nextBatch_0$(Unknown Source)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:760)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
        at org.apache.spark.util.Iterators$.size(Iterators.scala:29)
        at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1953)
        at org.apache.spark.rdd.RDD.$anonfun$count$1(RDD.scala:1269)
        at org.apache.spark.rdd.RDD.$anonfun$count$1$adapted(RDD.scala:1269)
        at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
        at org.apache.spark.scheduler.Task.run(Task.scala:139)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:842)
```
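
For anyone who wants to exercise the same null-count path outside the Comet test harness, here is a minimal sketch using plain Spark SQL. The table and column names (`t`, `col`, `cnt`) are illustrative and are not the `t1` fixture from the test suite, and the sketch assumes a Spark session with Comet enabled per the Comet installation docs:

```scala
import org.apache.spark.sql.SparkSession

object ArrayRepeatNullCountRepro {
  def main(args: Array[String]): Unit = {
    // Assumes Comet is on the classpath and enabled via the usual
    // spark.plugins / spark.comet.* configuration (see the Comet docs).
    val spark = SparkSession.builder()
      .appName("array_repeat-null-count-repro")
      .master("local[1]")
      .getOrCreate()

    // Every row has a NULL count, mirroring the `where _4 is null` predicate
    // in the failing test; with Comet active this is the query shape that hit
    // "Compute error: concat requires input of at least one array".
    spark.sql(
      "SELECT array_repeat(col, cnt) FROM VALUES (1, CAST(NULL AS INT)) AS t(col, cnt)"
    ).show()

    spark.stop()
  }
}
```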
### Steps to reproduce
_No response_
### Expected behavior
The test should pass when `null` is passed as the count argument: Comet should return the same result as Spark instead of failing with the native error above.
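
For reference, my understanding is that vanilla Spark returns NULL (rather than erroring) when the count argument of `array_repeat` is NULL, and that is the answer `checkSparkAnswerAndOperator` compares against. A quick sanity check in a plain spark-shell with Comet disabled (a sketch; output to be confirmed against the Spark version under test):

```scala
// Plain Spark (Comet disabled): expected to return a single row containing
// NULL, not an exception, if the NULL-count semantics described above hold.
spark.sql("SELECT array_repeat(1, CAST(NULL AS INT)) AS r").show()
```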
### Additional context
_No response_