I'm running into a new issue with Snappy crashing my job (Spark 1.2.0). A cached block spills to disk, and the Python writer thread then dies with a java.lang.InternalError inside SnappyNative.uncompressedLength while deserializing. Has anyone seen this before?
-Sven

2015-01-28 16:09:35,448 WARN  [shuffle-server-1] storage.MemoryStore (Logging.scala:logWarning(71)) - Failed to reserve initial memory threshold of 1024.0 KB for computing block rdd_45_14 in memory.
2015-01-28 16:09:35,449 WARN  [shuffle-server-1] storage.MemoryStore (Logging.scala:logWarning(71)) - Not enough space to cache rdd_45_14 in memory! (computed 504.0 B so far)
2015-01-28 16:09:35,450 INFO  [shuffle-server-1] storage.MemoryStore (Logging.scala:logInfo(59)) - Memory use = 1238.4 MB (blocks) + 973.8 MB (scratch space shared across 4 thread(s)) = 2.2 GB. Storage limit = 2.2 GB.
2015-01-28 16:09:35,452 WARN  [shuffle-server-1] storage.MemoryStore (Logging.scala:logWarning(71)) - Persisting block rdd_45_14 to disk instead.
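
For context, the warnings above just mean rdd_45_14 spilled to disk after the storage limit was hit; the crash below is the real problem. With Spark 1.2 defaults, that limit is spark.executor.memory * spark.storage.memoryFraction (0.6) * spark.storage.safetyFraction (0.9), so a 4g executor comes out at roughly the 2.2 GB shown. A minimal PySpark sketch of those knobs, with placeholder values rather than my actual config:

    from pyspark import SparkConf, SparkContext

    # Placeholder sizing, not the job's actual config. With Spark 1.2
    # defaults, storage limit = executor heap * 0.6 * 0.9, so a 4g
    # executor gives ~2.2 GB, matching the MemoryStore line above.
    conf = (SparkConf()
            .setAppName("cache-spill-sketch")
            .set("spark.executor.memory", "4g")            # executor heap
            .set("spark.storage.memoryFraction", "0.6"))   # share for cached blocks

    sc = SparkContext(conf=conf)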
2015-01-28 16:09:35,472 ERROR [stdout writer for python] util.Utils (Logging.scala:logError(96)) - Uncaught exception in thread stdout writer for python
java.lang.InternalError: a fault occurred in a recent unsafe memory access operation in compiled Java code
        at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
        at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
        at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:351)
        at org.xerial.snappy.SnappyInputStream.rawRead(SnappyInputStream.java:159)
        at org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:142)
        at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2310)
        at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:2712)
        at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2742)
        at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1687)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1344)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:133)
        at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
        at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:350)
        at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
        at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:350)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:383)
        at org.apache.spark.api.python.PythonRDD$WriterThread$$anonfun$run$1.apply$mcV$sp(PythonRDD.scala:242)
        at org.apache.spark.api.python.PythonRDD$WriterThread$$anonfun$run$1.apply(PythonRDD.scala:204)
        at org.apache.spark.api.python.PythonRDD$WriterThread$$anonfun$run$1.apply(PythonRDD.scala:204)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
        at org.apache.spark.api.python.PythonRDD$WriterThread.run(PythonRDD.scala:203)
2015-01-28 16:09:35,478 ERROR [stdout writer for python] util.SparkUncaughtExceptionHandler (Logging.scala:logError(96)) - Uncaught exception in thread Thread[stdout writer for python,5,main]
java.lang.InternalError: a fault occurred in a recent unsafe memory access operation in compiled Java code
        (identical stack trace to the one above, elided)
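
The only workaround I can think of so far is to take the native Snappy path out of the picture by switching Spark's block/shuffle compression codec. This is untested on my side, so treat it as a sketch; in 1.2, spark.io.compression.codec accepts "lz4" and "lzf" besides the default "snappy":

    from pyspark import SparkConf, SparkContext

    # Untested workaround sketch: route block/shuffle compression away
    # from snappy-java. "snappy" is the Spark 1.2 default; "lz4" and
    # "lzf" are the other codecs Spark ships with.
    conf = (SparkConf()
            .setAppName("snappy-workaround")
            .set("spark.io.compression.codec", "lzf"))

    sc = SparkContext(conf=conf)

If the InternalError disappears with lzf, that would at least point at the snappy-java native code rather than Spark's serialization layer.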



-- 
http://sites.google.com/site/krasser/?utm_source=sig
