[ 
https://issues.apache.org/jira/browse/SPARK-25776?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

liuxian updated SPARK-25776:
----------------------------
    Description: 
In {color:#205081}{{UnsafeSorterSpillWriter.java}}{color}, when we write a 
record to a spill file with {color:#205081}{{void write(Object baseObject, 
long baseOffset, int recordLength, long keyPrefix)}}{color}, 
{color:#205081}{{recordLength}}{color} (an {{int}}, 4 bytes) and 
{color:#205081}{{keyPrefix}}{color} (a {{long}}, 8 bytes) are written to the 
disk write buffer first. Together they take 12 bytes, so the disk write 
buffer must be at least 12 bytes.

If {{diskWriteBufferSize}} is 10, the spill fails with:

_java.lang.ArrayIndexOutOfBoundsException: 10_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer
 (UnsafeSorterSpillWriter.java:91)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
 _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_
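The 12 bytes come from the 4-byte {{int}} length header plus the 8-byte {{long}} key prefix, which are staged in the buffer before any record data. A minimal standalone sketch of the overflow (hypothetical helper names mirroring {{writeIntToBuffer}}/{{writeLongToBuffer}}; not the actual Spark implementation):

```java
// Sketch: why a disk write buffer smaller than 12 bytes overflows when the
// 4-byte recordLength and 8-byte keyPrefix headers are staged before the record.
public class BufferSizeSketch {
    // Hypothetical stand-in for UnsafeSorterSpillWriter.writeIntToBuffer:
    // stores 4 bytes of 'v' into 'buffer' starting at 'pos'.
    static void writeIntToBuffer(byte[] buffer, int pos, int v) {
        for (int i = 0; i < 4; i++) {
            buffer[pos + i] = (byte) (v >>> (8 * i));
        }
    }

    // Hypothetical stand-in for UnsafeSorterSpillWriter.writeLongToBuffer:
    // stores 8 bytes of 'v'; throws ArrayIndexOutOfBoundsException if the
    // buffer cannot hold all 8 bytes.
    static void writeLongToBuffer(byte[] buffer, int pos, long v) {
        for (int i = 0; i < 8; i++) {
            buffer[pos + i] = (byte) (v >>> (8 * i));
        }
    }

    public static void main(String[] args) {
        byte[] tooSmall = new byte[10];          // diskWriteBufferSize = 10
        writeIntToBuffer(tooSmall, 0, 42);       // recordLength: bytes 0-3, fits
        try {
            writeLongToBuffer(tooSmall, 4, 7L);  // keyPrefix needs bytes 4-11
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught ArrayIndexOutOfBoundsException");
        }

        byte[] bigEnough = new byte[12];         // 4 + 8 = 12 bytes
        writeIntToBuffer(bigEnough, 0, 42);
        writeLongToBuffer(bigEnough, 4, 7L);     // both headers fit exactly
        System.out.println("12-byte buffer holds both headers");
    }
}
```

With a 10-byte buffer the long write fails at index 10, matching the stack trace above; a 12-byte buffer holds both headers.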

  was:
In {{UnsafeSorterSpillWriter.java}}, when we write a record to a spill file 
with {{void write(Object baseObject, long baseOffset, int recordLength, long 
keyPrefix)}}, {{recordLength}} and {{keyPrefix}} are written to the disk 
write buffer first, and these take 12 bytes, so the disk write buffer size 
must be at least 12.

If {{diskWriteBufferSize}} is 10, it will print this exception info:

_java.lang.ArrayIndexOutOfBoundsException: 10_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer
 (UnsafeSorterSpillWriter.java:91)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
 _at 
org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
 _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_


> The disk write buffer size must be greater than 12.
> ---------------------------------------------------
>
>                 Key: SPARK-25776
>                 URL: https://issues.apache.org/jira/browse/SPARK-25776
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: liuxian
>            Priority: Minor
>
> In {color:#205081}{{UnsafeSorterSpillWriter.java}}{color}, when we write a 
> record to a spill file with {color:#205081}{{void write(Object baseObject, 
> long baseOffset, int recordLength, long keyPrefix)}}{color}, 
> {color:#205081}{{recordLength}}{color} (an {{int}}, 4 bytes) and 
> {color:#205081}{{keyPrefix}}{color} (a {{long}}, 8 bytes) are written to the 
> disk write buffer first. Together they take 12 bytes, so the disk write 
> buffer must be at least 12 bytes.
> If {{diskWriteBufferSize}} is 10, the spill fails with:
> _java.lang.ArrayIndexOutOfBoundsException: 10_
>  _at 
> org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer
>  (UnsafeSorterSpillWriter.java:91)_
>  _at 
> org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
>  _at 
> org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
>  _at 
> org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
>  _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
