IOException running streaming job

2014-09-23 Thread Emil Gustafsson
I'm trying out some streaming with Spark and I'm getting an error that
puzzles me since I'm new to Spark. I get this error every time: sometimes
1-2 batches in the stream are processed before the job stops, but never
the complete job, and often no batch is processed at all. I use Spark 1.1.0.

The job is started with --master local[4].
The job is doing this:
val conf = new SparkConf()
  .setAppName("My Application")

val sc = new SparkContext(conf)
val ssc = new StreamingContext(conf, Seconds(2))
val queue = new SynchronizedQueue[RDD[(Int, String)]]()
val input = ssc.queueStream(queue)
//val mapped = input.map(_._2)

input.print()
ssc.start()

var last = 0
for (i <- 1 to 5) {
  Thread.sleep(1000)
  if (i != 2) {
    //val casRdd = cr.where("id = 42 and t > %d and t <= %d".format(last, i))
    var l = List[(Int, String)]()
    for (j <- last + 1 to i) {
      l = l :+ (j, "foo%d".format(i))
    }
    l.foreach(x => println("*** %s".format(x)))
    val casRdd = sc.parallelize(l)
    //casRdd.foreach(println)
    last = i
    queue += casRdd
  }
}

Thread.sleep(1000)
ssc.stop()
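
For reference, here is a minimal Spark-free sketch of what the loop above pushes onto the queue; the `QueueSketch` object and `batches` helper are my own illustrative names, not part of the original code or any Spark API.

```scala
// Spark-free sketch of the enqueue loop above: for i = 1..n (skipping
// i == 2) each batch holds the pairs (j, "foo<i>") for every id j since
// the previous batch. Each returned inner list corresponds to one RDD
// that the original snippet builds with sc.parallelize and enqueues.
object QueueSketch {
  def batches(n: Int): List[List[(Int, String)]] = {
    var last = 0
    var out = List.empty[List[(Int, String)]]
    for (i <- 1 to n if i != 2) {
      val batch = (last + 1 to i).toList.map(j => (j, "foo%d".format(i)))
      last = i
      out = out :+ batch
    }
    out
  }
}
```

One thing worth noting about the original snippet: in Spark 1.1, `new StreamingContext(conf, Seconds(2))` creates its own SparkContext, so the code above ends up with two contexts from the same conf. Constructing the StreamingContext from the existing context instead (`new StreamingContext(sc, Seconds(2))`) avoids that.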


The error stack I get is:
14/09/24 00:08:56 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: PARSING_ERROR(2)
at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:84)
at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:594)
at org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
at org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
at org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:216)
at org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
14/09/24 00:08:56 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0,
localhost): java.io.IOException: PARSING_ERROR(2)
(same stack trace as above, truncated)


AbstractMethodError when creating cassandraTable object

2014-09-18 Thread Emil Gustafsson
I'm pretty sure this is a result of me being new to Scala, Spark, and sbt,
but I'm getting the error above when I try to use the Cassandra driver for
Spark.
I posted more information here:
https://github.com/datastax/spark-cassandra-connector/issues/245
Ideas?
/E