[ https://issues.apache.org/jira/browse/SPARK-1353?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14033625#comment-14033625 ]

Mridul Muralidharan commented on SPARK-1353:
--------------------------------------------

This is due to a limitation in Spark which is being addressed in 
https://issues.apache.org/jira/browse/SPARK-1476.

> IllegalArgumentException when writing to disk
> ---------------------------------------------
>
>                 Key: SPARK-1353
>                 URL: https://issues.apache.org/jira/browse/SPARK-1353
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager
>         Environment: AWS EMR 3.2.30-49.59.amzn1.x86_64 #1 SMP  x86_64 
> GNU/Linux
> Spark 1.0.0-SNAPSHOT built for Hadoop 1.0.4 built 2014-03-18
>            Reporter: Jim Blomo
>            Priority: Minor
>
> The Executor may fail when trying to mmap a file bigger than 
> Integer.MAX_VALUE due to the constraints of FileChannel.map 
> (http://docs.oracle.com/javase/7/docs/api/java/nio/channels/FileChannel.html#map(java.nio.channels.FileChannel.MapMode,
>  long, long)).  The signature takes longs, but the size value must be no 
> greater than Integer.MAX_VALUE.  This manifests with the following backtrace:
> java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
>         at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:828)
>         at org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:98)
>         at 
> org.apache.spark.storage.BlockManager.doGetLocal(BlockManager.scala:337)
>         at 
> org.apache.spark.storage.BlockManager.getLocal(BlockManager.scala:281)
>         at org.apache.spark.storage.BlockManager.get(BlockManager.scala:430)
>         at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:38)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:220)
>         at 
> org.apache.spark.api.python.PythonRDD$$anon$2.run(PythonRDD.scala:85)
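
The constraint quoted above can be worked around by mapping the file in chunks of at most Integer.MAX_VALUE bytes, each as its own MappedByteBuffer. A minimal sketch of that idea (ChunkedMap, chunkOffsets, and mapLargeFile are illustrative names, not the actual SPARK-1476 patch):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.List;

public class ChunkedMap {
    // Split [0, totalSize) into (offset, length) pairs with length no larger
    // than maxChunk -- FileChannel.map rejects any size above Integer.MAX_VALUE.
    static List<long[]> chunkOffsets(long totalSize, long maxChunk) {
        List<long[]> chunks = new ArrayList<>();
        for (long offset = 0; offset < totalSize; offset += maxChunk) {
            long length = Math.min(maxChunk, totalSize - offset);
            chunks.add(new long[] {offset, length});
        }
        return chunks;
    }

    // Map a file larger than 2 GB as several MappedByteBuffers, one per chunk,
    // instead of a single map() call that would throw IllegalArgumentException.
    static List<MappedByteBuffer> mapLargeFile(FileChannel channel) throws IOException {
        List<MappedByteBuffer> buffers = new ArrayList<>();
        for (long[] chunk : chunkOffsets(channel.size(), Integer.MAX_VALUE)) {
            buffers.add(channel.map(FileChannel.MapMode.READ_ONLY, chunk[0], chunk[1]));
        }
        return buffers;
    }

    public static void main(String[] args) {
        // A hypothetical 5 GB file splits into three mappable chunks.
        long fiveGb = 5L * 1024 * 1024 * 1024;
        System.out.println(chunkOffsets(fiveGb, Integer.MAX_VALUE).size()); // prints 3
    }
}
```

The caller then has to track which buffer covers which offset, which is the bookkeeping a real fix would need to hide behind the DiskStore.getBytes interface.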



--
This message was sent by Atlassian JIRA
(v6.2#6252)
