jackyjfhu created SPARK-47091:
---------------------------------

             Summary: An error occurs when executing the pyspark program
                 Key: SPARK-47091
                 URL: https://issues.apache.org/jira/browse/SPARK-47091
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.5.0, 3.1.3
            Reporter: jackyjfhu


When I execute this code via PySpark:

spark._sc.textFile("/tmp/spark_data1").repartition(50).toDF().show

I get an error:

ERROR spark.TaskContextImpl: Error in TaskCompletionListener
io.netty.util.IllegalReferenceCountException: refCnt: 0, decrement: 1
    at io.netty.util.internal.ReferenceCountUpdater.toLiveRealRefCnt(ReferenceCountUpdater.java:74) ~[iceberg-spark-runtime-3.1_2.12-0.14.3-5-tencent.jar:?]
    at io.netty.util.internal.ReferenceCountUpdater.release(ReferenceCountUpdater.java:138) ~[iceberg-spark-runtime-3.1_2.12-0.14.3-5-tencent.jar:?]
    at io.netty.buffer.AbstractReferenceCountedByteBuf.release(AbstractReferenceCountedByteBuf.java:100) ~[netty-all-4.1.51.Final.jar:4.1.51.Final]
    at io.netty.buffer.AbstractDerivedByteBuf.release0(AbstractDerivedByteBuf.java:94) ~[netty-all-4.1.51.Final.jar:4.1.51.Final]
    at io.netty.buffer.AbstractDerivedByteBuf.release(AbstractDerivedByteBuf.java:90) ~[netty-all-4.1.51.Final.jar:4.1.51.Final]
    at org.apache.spark.network.buffer.NettyManagedBuffer.release(NettyManagedBuffer.java:62) ~[spark-network-common_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.storage.ShuffleBlockFetcherIterator.cleanup(ShuffleBlockFetcherIterator.scala:226) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.storage.ShuffleFetchCompletionListener.onTaskCompletion(ShuffleBlockFetcherIterator.scala:862) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.$anonfun$markTaskCompleted$1(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.$anonfun$markTaskCompleted$1$adapted(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.$anonfun$invokeListeners$1(TaskContextImpl.scala:137) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.$anonfun$invokeListeners$1$adapted(TaskContextImpl.scala:135) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[scala-library-2.12.10.jar:?]
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[scala-library-2.12.10.jar:?]
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[scala-library-2.12.10.jar:?]
    at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:135) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.scheduler.Task.run(Task.scala:147) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:498) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439) [spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:501) [spark-core_2.12-3.1.3.jar:3.1.3]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_362]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_362]
    at java.lang.Thread.run(Thread.java:750) [?:1.8.0_362]

24/02/19 11:26:53 ERROR executor.Executor: Exception in task 0.1 in stage 1.0 (TID 4001)
org.apache.spark.util.TaskCompletionListenerException: refCnt: 0, decrement: 1
    at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:145) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:124) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.scheduler.Task.run(Task.scala:147) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:498) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439) ~[spark-core_2.12-3.1.3.jar:3.1.3]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:501) [spark-core_2.12-3.1.3.jar:3.1.3]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_362]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_362]
    at java.lang.Thread.run(Thread.java:750)

 
Note: the directory /tmp/spark_data1 contains 4000 small files.

If the directory contains relatively few small files, no error is reported.
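To make the setup easier to replicate, here is a minimal sketch of how such an input directory could be generated with the Python standard library. The helper name `make_small_files` and the file naming scheme are my own assumptions; the report only states that /tmp/spark_data1 holds 4000 small files. The PySpark step is reproduced verbatim from the report in a comment, since it needs a running Spark session.

```python
# Hypothetical helper to recreate the reported input layout:
# a directory of many tiny one-line text files.
import os


def make_small_files(path, n):
    """Write n small text files into path and return how many files exist there."""
    os.makedirs(path, exist_ok=True)
    for i in range(n):
        # part-00000.txt, part-00001.txt, ... (naming is an assumption)
        with open(os.path.join(path, f"part-{i:05d}.txt"), "w") as f:
            f.write(f"line-{i}\n")
    return len(os.listdir(path))


if __name__ == "__main__":
    make_small_files("/tmp/spark_data1", 4000)
    # Then, in a PySpark session (repro exactly as reported):
    #   spark._sc.textFile("/tmp/spark_data1").repartition(50).toDF().show
```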



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
