Error: Size exceeds Integer.MAX_VALUE.

> Should the valid range of length be changed to Long.MAX_VALUE?
>
> Error: org.apache.spark.SparkException: Job aborted due to stage failure:
> Task 7 in stage 2.0 failed 4 times, most recent failure: Lost task 7.3 in
> stage 2.0 (TID 11877, 10.9.*.*): java.lang.RuntimeException:
> java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
> at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:828)
> at org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:125)
> at org.
Yifan LI

>> On 29 Oct 2015, at 12:52, Yifan LI <iamyifa...@gmail.com> wrote:
>>
>> Hey,
>>
>> I was just trying to scan a large RDD sortedRdd, ~1 billion elements,
>> using the toLocalIterator API, but an exception returned as it was
>> almost finished:
>>
>> java.lang.RuntimeException: java.lang.IllegalArgumentException: Size
>> exceeds Integer.MAX_VALUE
>> at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:821)
>> at org.apache.spark.storage.DiskSto
Really? What should we make of this?

24 Sep 2014 10:03:36,772 ERROR [Executor task launch worker-52] Executor -
Exception in task ID 40599
java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:789)
at org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:108)
at org.apache.spark.storage.BlockManager.doGetLocal(BlockManager.scala:415)
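On the question of why the limit is Integer.MAX_VALUE rather than Long.MAX_VALUE: Spark's DiskStore reads a block back via FileChannel.map, which returns a MappedByteBuffer, and ByteBuffer positions are int-indexed, so the JDK rejects any single mapping over 2 GB before touching the file. A minimal, Spark-free sketch reproducing the same exception (the class and method names here are mine, not from the thread):

```java
import java.io.File;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;

public class MapLimitDemo {
    // Try to memory-map `size` bytes of a scratch file; return the
    // IllegalArgumentException message if the JDK rejects it, else "ok".
    static String tryMap(long size) throws Exception {
        File f = File.createTempFile("map-limit", ".bin");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw");
             FileChannel ch = raf.getChannel()) {
            // The size check happens up front, before any file growth or
            // I/O: FileChannel.map throws IllegalArgumentException when
            // size > Integer.MAX_VALUE, because MappedByteBuffer is
            // int-indexed.
            ch.map(FileChannel.MapMode.READ_WRITE, 0, size);
            return "ok";
        } catch (IllegalArgumentException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        // One byte past the limit trips the same check seen in the
        // stack traces above.
        System.out.println(tryMap((long) Integer.MAX_VALUE + 1));
    }
}
```

Since the cap lives in the JDK's buffer API rather than in Spark itself, the usual workaround in threads like this is not a bigger length type but more partitions (e.g. rdd.repartition(n)), so that no single cached or shuffled block approaches 2 GB.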