marin-ma commented on issue #11555:
URL: https://github.com/apache/incubator-gluten/issues/11555#issuecomment-3843391453

   The root cause is that `spark.gluten.sql.columnar.cudf` is not passed to the cudf plan validator. However, after applying that fix, the application runs into another exception, shown below; the fix for that one is to add `spark.gluten.sql.columnar.cudf` to the native session conf list.
   
   ```
   26/02/03 19:26:07 WARN TaskSetManager: Lost task 2.0 in stage 40.0 (TID 12316) (172.17.0.2 executor 2): org.apache.gluten.exception.GlutenException: org.apache.gluten.exception.GlutenException: Exception: VeloxRuntimeError
   Error Source: RUNTIME
   Error Code: INVALID_STATE
   Retriable: False
   Expression: cudfInput != nullptr
   Context: Operator: CudfHashJoinBuild[20] 1
   Function: addInput
   File: /velox/velox/experimental/cudf/exec/CudfHashJoin.cpp
   Line: 134
   Stack trace:
   # 0  _ZN8facebook5velox7process10StackTraceC1Ei
   # 1  _ZN8facebook5velox14VeloxExceptionC1EPKcmS3_St17basic_string_viewIcSt11char_traitsIcEES7_S7_S7_bNS1_4TypeES7_
   # 2  _ZN8facebook5velox6detail14veloxCheckFailINS0_17VeloxRuntimeErrorENS0_22CompileTimeEmptyStringEEEvRKNS1_18VeloxCheckFailArgsET0_
   # 3  _ZN8facebook5velox10cudf_velox17CudfHashJoinBuild8addInputESt10shared_ptrINS0_9RowVectorEE
   # 4  _ZZN8facebook5velox4exec6Driver11runInternalERSt10shared_ptrIS2_ERS3_INS1_13BlockingStateEERS3_INS0_9RowVectorEEENKUlvE3_clEv
   # 5  _ZN8facebook5velox4exec6Driver11runInternalERSt10shared_ptrIS2_ERS3_INS1_13BlockingStateEERS3_INS0_9RowVectorEE
   # 6  _ZN8facebook5velox4exec6Driver4nextEPN5folly10SemiFutureINS3_4UnitEEERPNS1_8OperatorERNS1_14BlockingReasonE
   # 7  _ZN8facebook5velox4exec4Task4nextEPN5folly10SemiFutureINS3_4UnitEEE
   # 8  _ZN6gluten24WholeStageResultIterator4nextEv
   # 9  Java_org_apache_gluten_vectorized_ColumnarBatchOutIterator_nativeHasNext
   # 10 0x00007f34a45845ba

   at org.apache.gluten.iterator.ClosableIterator.hasNext(ClosableIterator.java:38)
   at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
   at org.apache.gluten.iterator.IteratorsV1$InvocationFlowProtection.hasNext(IteratorsV1.scala:154)
   at org.apache.gluten.iterator.IteratorsV1$IteratorCompleter.hasNext(IteratorsV1.scala:66)
   at org.apache.gluten.iterator.IteratorsV1$PayloadCloser.hasNext(IteratorsV1.scala:38)
   at org.apache.gluten.iterator.IteratorsV1$LifeTimeAccumulator.hasNext(IteratorsV1.scala:95)
   at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
   at org.apache.spark.shuffle.ColumnarShuffleWriter.internalWrite(ColumnarShuffleWriter.scala:133)
   at org.apache.spark.shuffle.ColumnarShuffleWriter.write(ColumnarShuffleWriter.scala:299)
   at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
   at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
   at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
   at org.apache.spark.scheduler.Task.run(Task.scala:141)
   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
   at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
   at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   at java.base/java.lang.Thread.run(Thread.java:840)
   ```
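
   For context, here is a minimal sketch of how the toggle in question is set on the Spark side. The `CudfConfExample` object and its wiring are illustrative only, not Gluten code; the actual fix belongs inside Gluten, where the key must be included in the native session conf list so it is forwarded to the native (Velox/cuDF) side.

   ```scala
   // Sketch only: the config key comes from this issue; the object name,
   // app name, and master are illustrative assumptions.
   import org.apache.spark.sql.SparkSession

   object CudfConfExample {
     // The cuDF toggle discussed in this issue.
     val CudfEnabledKey = "spark.gluten.sql.columnar.cudf"

     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .appName("cudf-conf-example")
         .master("local[*]")
         // Setting the key on the session is not enough by itself: per the
         // fix above, it must also be in Gluten's native session conf list
         // to reach the native plan validator and operators.
         .config(CudfEnabledKey, "true")
         .getOrCreate()

       println(s"$CudfEnabledKey = ${spark.conf.get(CudfEnabledKey)}")
       spark.stop()
     }
   }
   ```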

