[ https://issues.apache.org/jira/browse/SPARK-12786?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15101918#comment-15101918 ]
Brian London edited comment on SPARK-12786 at 1/15/16 3:29 PM:
---------------------------------------------------------------

Yeah, exactly. Because of the use of {{AkkaUtils}}, the settings needed to create a minimal actor system that can communicate with the actor stream are buried in the Spark code. I believe the change at https://github.com/apache/spark/pull/10744/files#diff-690ab3eacd0a42fe7bee1d29c5910ffdR111 will resolve this issue. On a side note, are there plans to include classes for using actors as a DStream output as well?

> Actor demo does not demonstrate usable code
> -------------------------------------------
>
>                 Key: SPARK-12786
>                 URL: https://issues.apache.org/jira/browse/SPARK-12786
>             Project: Spark
>          Issue Type: Documentation
>          Components: Streaming
>    Affects Versions: 1.5.2, 1.6.0
>            Reporter: Brian London
>            Priority: Minor
>
> The ActorWordCount demo doesn't show how to set up an actor-based DStream in
> a way that can be used.
> The demo relies on the {{AkkaUtils}} object, which is marked private[spark].
> Thus the code presented will not compile unless users declare their code to
> be in the org.apache.spark package.
> The demo is located at
> https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
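For reference, the kind of settings the comment says are buried in {{AkkaUtils}} are the Akka remoting options a standalone actor system would need to talk to the receiver's actor system. A minimal sketch of such a configuration is below; the hostname, port, and file name ({{application.conf}}) are illustrative assumptions for a 1.x-era Akka setup, not values taken from the Spark source:

{code:title=application.conf}
akka {
  actor {
    # Use the remote-capable provider instead of the default local one,
    # so this system can exchange messages with a remote actor system.
    provider = "akka.remote.RemoteActorRefProvider"
  }
  remote {
    enabled-transports = ["akka.remote.netty.tcp"]
    netty.tcp {
      # Address this actor system binds to; the receiver side would
      # address actors here via an akka.tcp://... path.
      hostname = "127.0.0.1"
      port = 2552
    }
  }
}
{code}

An {{ActorSystem}} created with this configuration loaded (e.g. via {{ConfigFactory.load()}}) should be remotely reachable without depending on anything in the {{org.apache.spark}} package, which is the gap the demo currently leaves open.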