Xuefu Zhang created HIVE-9204:
---------------------------------

             Summary: Test windowing.q is failing [Spark Branch]
                 Key: HIVE-9204
                 URL: https://issues.apache.org/jira/browse/HIVE-9204
             Project: Hive
          Issue Type: Sub-task
          Components: Spark
            Reporter: Xuefu Zhang


Error seen in spark.log:
{code}
2014-12-23 13:56:09,654 INFO  [Executor task launch worker-1]: exec.PTFOperator (Operator.java:close(595)) - 153 finished. closing...
2014-12-23 13:56:09,654 ERROR [Executor task launch worker-1]: spark.SparkReduceRecordHandler (SparkReduceRecordHandler.java:close(446)) - Hit error while closing operators - failing tree
2014-12-23 13:56:09,654 ERROR [Executor task launch worker-1]: executor.Executor (Logging.scala:logError(96)) - Exception in task 1.3 in stage 27.0 (TID 64)
java.lang.RuntimeException: Hive Runtime Error while closing operators: null
        at org.apache.hadoop.hive.ql.exec.spark.SparkReduceRecordHandler.close(SparkReduceRecordHandler.java:447)
        at org.apache.hadoop.hive.ql.exec.spark.HiveReduceFunctionResultList.closeRecordProcessor(HiveReduceFunctionResultList.java:58)
        at org.apache.hadoop.hive.ql.exec.spark.HiveBaseFunctionResultList$ResultIterator.hasNext(HiveBaseFunctionResultList.java:108)
        at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$2.apply(AsyncRDDActions.scala:115)
        at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$2.apply(AsyncRDDActions.scala:115)
        at org.apache.spark.SparkContext$$anonfun$30.apply(SparkContext.scala:1390)
        at org.apache.spark.SparkContext$$anonfun$30.apply(SparkContext.scala:1390)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
Caused by: java.util.NoSuchElementException
        at java.util.ArrayDeque.getFirst(ArrayDeque.java:318)
        at org.apache.hadoop.hive.ql.udf.generic.GenericUDAFFirstValue$FirstValStreamingFixedWindow.terminate(GenericUDAFFirstValue.java:290)
        at org.apache.hadoop.hive.ql.udf.ptf.WindowingTableFunction.finishPartition(WindowingTableFunction.java:413)
        at org.apache.hadoop.hive.ql.exec.PTFOperator$PTFInvocation.finishPartition(PTFOperator.java:337)
        at org.apache.hadoop.hive.ql.exec.PTFOperator.closeOp(PTFOperator.java:95)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:598)
        at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
        at org.apache.hadoop.hive.ql.exec.spark.SparkReduceRecordHandler.close(SparkReduceRecordHandler.java:432)
        ... 15 more
{code}
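
For context on the root cause: the {{Caused by}} frame shows {{java.util.ArrayDeque.getFirst()}} throwing {{NoSuchElementException}} inside {{FirstValStreamingFixedWindow.terminate()}}, and {{getFirst()}} is documented to throw exactly that exception when the deque is empty. The sketch below only demonstrates that JDK behavior (it is not Hive code); {{peekFirst()}}, which returns null instead of throwing, is one plausible defensive pattern for an empty-deque state at terminate time, not necessarily the fix this issue will take.

{code:java}
import java.util.ArrayDeque;
import java.util.NoSuchElementException;

public class EmptyDequeDemo {
    public static void main(String[] args) {
        ArrayDeque<Integer> valueChain = new ArrayDeque<>();

        // getFirst() throws on an empty deque -- the same
        // NoSuchElementException seen in the stack trace above.
        try {
            valueChain.getFirst();
        } catch (NoSuchElementException e) {
            System.out.println("getFirst on empty deque threw: " + e);
        }

        // peekFirst() returns null instead, so callers can guard explicitly.
        Integer first = valueChain.peekFirst();
        System.out.println("peekFirst on empty deque returned: " + first);
    }
}
{code}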



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)