I've been trying to figure out how to increase the heap space for my Spark
environment in 1.0.0. Everything I've found tells me either to export
something in SPARK_JAVA_OPTS, which is deprecated in 1.0.0, or to increase
spark.executor.memory, which is already set to 6g. I'm only trying to process
about 400-500 MB of text.
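
For reference, this is roughly how the context is configured (the app name
is a stand-in; spark.executor.memory is the 6g setting mentioned above):

    import org.apache.spark.{SparkConf, SparkContext}

    // Spark 1.0.0-style programmatic configuration; the same setting can
    // also go in conf/spark-defaults.conf or be passed to spark-submit.
    val conf = new SparkConf()
      .setAppName("text-processing")          // placeholder name
      .set("spark.executor.memory", "6g")     // executor heap, as noted above
    val sc = new SparkContext(conf)

But whenever I try to collect, I get this error: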

14/06/17 11:00:21 INFO MapOutputTrackerMasterActor: Asked to send map output locations for shuffle 0 to sp...@salinger.ornl.gov:50251
14/06/17 11:00:21 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 165 bytes
14/06/17 11:00:35 INFO BlockManagerInfo: Added taskresult_14 in memory on salinger.ornl.gov:50253 (size: 123.7 MB, free: 465.1 MB)
14/06/17 11:00:35 INFO BlockManagerInfo: Added taskresult_13 in memory on salinger.ornl.gov:50253 (size: 127.7 MB, free: 337.4 MB)
14/06/17 11:00:36 ERROR Utils: Uncaught exception in thread Result resolver thread-2
java.lang.OutOfMemoryError: Java heap space
        at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:39)
        at java.nio.ByteBuffer.allocate(ByteBuffer.java:312)
        at org.apache.spark.storage.BlockMessage.set(BlockMessage.scala:94)
        at org.apache.spark.storage.BlockMessage$.fromByteBuffer(BlockMessage.scala:176)
        at org.apache.spark.storage.BlockMessageArray.set(BlockMessageArray.scala:63)
        at org.apache.spark.storage.BlockMessageArray$.fromBufferMessage(BlockMessageArray.scala:109)
        at org.apache.spark.storage.BlockManagerWorker$.syncGetBlock(BlockManagerWorker.scala:128)
        at org.apache.spark.storage.BlockManager$$anonfun$doGetRemote$2.apply(BlockManager.scala:489)
        at org.apache.spark.storage.BlockManager$$anonfun$doGetRemote$2.apply(BlockManager.scala:487)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.storage.BlockManager.doGetRemote(BlockManager.scala:487)
        at org.apache.spark.storage.BlockManager.getRemoteBytes(BlockManager.scala:481)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply$mcV$sp(TaskResultGetter.scala:53)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:47)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$2$$anonfun$run$1.apply(TaskResultGetter.scala:47)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$2.run(TaskResultGetter.scala:46)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:695)
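
In case it helps, the failing code is essentially this shape (the input path
and the map function are stand-ins for what I'm actually running):

    // Minimal form of the pattern that fails; the real job does more work.
    val lines = sc.textFile("/path/to/text")    // placeholder path, ~400-500 MB of input
    val lengths = lines.map(line => line.length)
    val results = lengths.collect()             // the OutOfMemoryError is thrown here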

Any idea how to fix heap space errors in 1.0.0?


