Hi,

I'm trying to create a Spark Streaming actor stream, but I'm running into
several problems. First, the guide at
https://spark.apache.org/docs/latest/streaming-custom-receivers.html refers
to the example
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala,
which uses AkkaUtils and org.apache.spark.SecurityManager, both of which are
now private[spark]. So I tried the example from
http://www.typesafe.com/activator/template/spark-streaming-scala-akka
instead, but I get the following exception as soon as I store some data in
Spark Streaming:

org.apache.spark.SparkDriverExecutionException: Execution error
        at org.apache.spark.scheduler.DAGScheduler.handleTaskCompletion(DAGScheduler.scala:1025)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.ArrayStoreException: [Ljava.lang.Object;
        at scala.runtime.ScalaRunTime$.array_update(ScalaRunTime.scala:88)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1750)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1750)
        at org.apache.spark.scheduler.JobWaiter.taskSucceeded(JobWaiter.scala:56)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskCompletion(DAGScheduler.scala:1021)
        ... 3 more

My code is basically the same as in that example, and it is available at
https://gist.github.com/juanrh/139af20fd2060cb1a9d1 . If I comment out
`receiverActor ! msg` then there is no exception, but no data is received
in the stream either. Any thoughts on this?
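For context, the receiver side of that template boils down to roughly the
following (a simplified sketch assuming the Spark 1.x actor-receiver API,
i.e. `ActorHelper` and `StreamingContext.actorStream`; the class and message
names here are illustrative, not the exact ones from my gist):

```scala
import akka.actor.{Actor, Props}
import org.apache.spark.streaming.receiver.ActorHelper

// Receiver actor: each message it gets is pushed into the Spark stream.
class ReceiverActor extends Actor with ActorHelper {
  def receive = {
    // store() hands the element to Spark Streaming; this is the call that
    // runs when an external actor does `receiverActor ! msg`, and it is
    // after such a send that the ArrayStoreException shows up on the driver.
    case msg: String => store(msg)
  }
}

// On the driver, the input DStream is created from the actor, e.g.:
//   val lines = ssc.actorStream[String](Props[ReceiverActor], "receiver-actor")
```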

Thanks a lot for your help.

Greetings,

Juan
