[ https://issues.apache.org/jira/browse/SPARK-5124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14335204#comment-14335204 ]

Marcelo Vanzin commented on SPARK-5124:
---------------------------------------

Ah, also, another comment before I forget. Scala's syntax sometimes throws me 
off, but it seems that in the current implementation of your RPC interfaces, 
you're following Akka's semantics for the {{receive()}} method. That is, it 
sort of assumes a single-threaded message queue, since otherwise you can't 
really have {{ask()}} without running into concurrency issues.
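
To illustrate (a minimal sketch of the Akka pattern, not actual Spark code): 
{{sender()}} is only meaningful while {{receive()}} is synchronously handling 
the current message, so replying from another thread can misroute responses.

{code}
import akka.actor.Actor

class ExampleEndpoint extends Actor {
  def receive: Receive = {
    case "ping" =>
      // Safe: we're still inside the synchronous handling of this message,
      // so sender() points at the actor that sent "ping".
      sender() ! "pong"
    // If handling were handed off to another thread, sender() could by then
    // refer to a later message's sender, and the reply would be misrouted.
  }
}
{code}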

I'd suggest a different approach that allows more standard RPC semantics. For 
example, have the {{receive()}} method take an {{RpcMessage}} as the argument; 
this message contains information about the sender, what type of message it 
is, methods that allow you to reply to the message if it needs a reply, and 
the actual message payload.
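
Something like this (a rough sketch; all names here are hypothetical, just to 
make the idea concrete):

{code}
// Hypothetical interface sketch, not an actual implementation.
case class RpcAddress(host: String, port: Int)  // assumed sender identity

trait RpcMessage {
  def senderAddress: RpcAddress     // where the message came from
  def content: Any                  // the actual message payload
  def needsReply: Boolean           // whether the sender is blocked on ask()
  def reply(response: Any): Unit    // route a response back to this message
  def replyWithFailure(e: Throwable): Unit
}

trait RpcEndpoint {
  def receive(msg: RpcMessage): Unit
}
{code}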

That way, you can do the usual matching, and you can easily do asynchronous 
(or concurrent) processing and still be able to reply to the right message. 
This should map easily onto Akka's backend implementation, and allow a new 
RPC implementation to be more flexible.
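
Concretely, something like this (again with hypothetical names, building on 
the sketch above): the reply is tied to the message itself rather than to 
endpoint-wide state, so it's safe to hand work off to another thread.

{code}
import java.util.concurrent.Executors

case class Heartbeat(executorId: String)
case class HeartbeatAck(executorId: String)

class WorkerEndpoint extends RpcEndpoint {
  private val pool = Executors.newFixedThreadPool(4)

  override def receive(msg: RpcMessage): Unit = msg.content match {
    case Heartbeat(id) =>
      // Cheap to handle: reply inline.
      msg.reply(HeartbeatAck(id))
    case other =>
      // Expensive to handle: process concurrently; the reply still
      // reaches the caller that sent this particular message.
      pool.submit(new Runnable {
        override def run(): Unit = msg.reply(process(other))
      })
  }

  private def process(m: Any): Any = m  // placeholder
}
{code}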

> Standardize internal RPC interface
> ----------------------------------
>
>                 Key: SPARK-5124
>                 URL: https://issues.apache.org/jira/browse/SPARK-5124
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Shixiong Zhu
>         Attachments: Pluggable RPC - draft 1.pdf, Pluggable RPC - draft 2.pdf
>
>
> In Spark we use Akka as the RPC layer. It would be great if we can 
> standardize the internal RPC interface to facilitate testing. This will also 
> provide the foundation to try other RPC implementations in the future.


