Have you looked at the Spark executor logs? They're usually located in the
$SPARK_HOME/work/ directory. If you're running in a cluster, they'll be on
the individual slave nodes. These should hopefully reveal more information.
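
For what it's worth, the stack trace below suggests the job has roughly this
shape: a cogroup whose output goes through mapValues/flatMapValues and then
saveAsTextFile, with the IOException thrown while the reduce side memory-maps
shuffle block files in DiskStore.getBytes. Here's a minimal sketch of that
path against the 0.7.x API (the master URL, paths, and names are all made up
for illustration, not taken from your job):

import spark.SparkContext
import spark.SparkContext._  // implicit conversions for pair-RDD operations

object CogroupSaveSketch {
  def main(args: Array[String]) {
    // Hypothetical master URL and input/output paths.
    val sc = new SparkContext("spark://master:7077", "cogroup-save-sketch")
    val left  = sc.textFile("hdfs:///data/left").map(l => (l.split("\t")(0), l))
    val right = sc.textFile("hdfs:///data/right").map(l => (l.split("\t")(0), l))
    left.cogroup(right)                                      // CoGroupedRDD: the shuffle fetch in your trace
        .mapValues { case (ls, rs) => (ls, rs.headOption) }  // MappedValuesRDD
        .flatMapValues { case (ls, r) => ls.map(l => (l, r)) } // FlatMappedValuesRDD
        .map { case (k, v) => k + "\t" + v }                 // MappedRDD (saveAsTextFile adds the second one)
        .saveAsTextFile("hdfs:///data/out")                  // the task that dies with "Map failed"
  }
}

If the executor logs on the slaves show the same "Map failed" coming out of
FileChannelImpl.map, that would point at reading the shuffle block files
during the cogroup rather than at anything in your own map functions.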


On Mon, Nov 18, 2013 at 3:42 PM, Chris Grier <gr...@icsi.berkeley.edu> wrote:

> Hi,
>
> I'm trying to figure out what the problem is with a job that we are
> running on Spark 0.7.3. When we write out via saveAsTextFile we get an
> exception that doesn't reveal much:
>
> 13/11/18 15:06:19 INFO cluster.TaskSetManager: Loss was due to java.io.IOException
> java.io.IOException: Map failed
>         at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:849)
>         at spark.storage.DiskStore.getBytes(DiskStore.scala:86)
>         at spark.storage.DiskStore.getValues(DiskStore.scala:92)
>         at spark.storage.BlockManager.getLocal(BlockManager.scala:284)
>         at spark.storage.BlockFetcherIterator$$anonfun$13.apply(BlockManager.scala:1027)
>         at spark.storage.BlockFetcherIterator$$anonfun$13.apply(BlockManager.scala:1026)
>         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>         at spark.storage.BlockFetcherIterator.<init>(BlockManager.scala:1026)
>         at spark.storage.BlockManager.getMultiple(BlockManager.scala:478)
>         at spark.BlockStoreShuffleFetcher.fetch(BlockStoreShuffleFetcher.scala:51)
>         at spark.BlockStoreShuffleFetcher.fetch(BlockStoreShuffleFetcher.scala:10)
>         at spark.rdd.CoGroupedRDD$$anonfun$compute$2.apply(CoGroupedRDD.scala:127)
>         at spark.rdd.CoGroupedRDD$$anonfun$compute$2.apply(CoGroupedRDD.scala:115)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
>         at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
>         at spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:115)
>         at spark.RDD.computeOrReadCheckpoint(RDD.scala:207)
>         at spark.RDD.iterator(RDD.scala:196)
>         at spark.MappedValuesRDD.compute(PairRDDFunctions.scala:704)
>         at spark.RDD.computeOrReadCheckpoint(RDD.scala:207)
>         at spark.RDD.iterator(RDD.scala:196)
>         at spark.FlatMappedValuesRDD.compute(PairRDDFunctions.scala:714)
>         at spark.RDD.computeOrReadCheckpoint(RDD.scala:207)
>         at spark.RDD.iterator(RDD.scala:196)
>         at spark.rdd.MappedRDD.compute(MappedRDD.scala:12)
>         at spark.RDD.computeOrReadCheckpoint(RDD.scala:207)
>         at spark.RDD.iterator(RDD.scala:196)
>         at spark.rdd.MappedRDD.compute(MappedRDD.scala:12)
>         at spark.RDD.computeOrReadCheckpoint(RDD.scala:207)
>         at spark.RDD.iterator(RDD.scala:196)
>         at spark.scheduler.ResultTask.run(ResultTask.scala:77)
>         at spark.executor.Executor$TaskRunner.run(Executor.scala:100)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:724)
>
> Any ideas?
>
> -Chris
>
