[ https://issues.apache.org/jira/browse/SPARK-5124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14337028#comment-14337028 ]

Reynold Xin commented on SPARK-5124:
------------------------------------

I went and looked at the various use cases of Akka in the code base. Quite a 
few of them actually use askWithReply as well. There are really two categories 
of message delivery: one in which the sender expects a reply, and one in which 
it doesn't. As a result, I think the following interface would make more sense:

{code}
// receive side:
def receive(msg: RpcMessage): Unit
def receiveAndReply(msg: RpcMessage): Unit = sys.error("...")

// send side:
def send(message: Any): Unit
def sendWithReply[T](message: Any): Future[T]
{code}

There is an obvious pairing here. Messages sent via "send" go to "receive" 
without requiring an ack (although if we use the transport layer, an implicit 
ack can be sent). Messages sent via "sendWithReply" go to "receiveAndReply", 
and the caller needs to handle the reply explicitly.
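
To make the pairing concrete, here is a rough, self-contained sketch of how 
the two paths could be wired up in-process. Everything in it (the 
Promise-based RpcMessage envelope, LocalRpcRef, PingEndpoint) is an 
illustrative stand-in, not part of the proposal:

{code}
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.duration._

// Envelope: the payload plus an optional promise the endpoint fulfils
// when the sender asked for a reply.
case class RpcMessage(payload: Any, replyTo: Option[Promise[Any]] = None) {
  def reply(response: Any): Unit = replyTo.foreach(_.success(response))
}

trait RpcEndpoint {
  // One-way messages ("send") land here.
  def receive(msg: RpcMessage): Unit
  // Request-reply messages ("sendWithReply") land here; the default fails
  // loudly so endpoints that never reply can simply ignore it.
  def receiveAndReply(msg: RpcMessage): Unit =
    sys.error(s"unexpected request-reply message: ${msg.payload}")
}

// In-process reference that routes each send variant to its receive pair.
class LocalRpcRef(endpoint: RpcEndpoint) {
  def send(message: Any): Unit =
    endpoint.receive(RpcMessage(message))

  def sendWithReply[T](message: Any): Future[T] = {
    val p = Promise[Any]()
    endpoint.receiveAndReply(RpcMessage(message, Some(p)))
    p.future.asInstanceOf[Future[T]]
  }
}

// Illustrative endpoint that handles both paths.
class PingEndpoint extends RpcEndpoint {
  def receive(msg: RpcMessage): Unit =
    println(s"one-way: ${msg.payload}")
  override def receiveAndReply(msg: RpcMessage): Unit =
    msg.reply(s"pong(${msg.payload})")
}

object PairingDemo extends App {
  val ref = new LocalRpcRef(new PingEndpoint)
  ref.send("heartbeat")                     // routed to receive
  val f = ref.sendWithReply[String]("ping") // routed to receiveAndReply
  println(Await.result(f, 1.second))        // prints pong(ping)
}
{code}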

In most cases, an RPC endpoint only needs to receive messages without 
replying, and thus can override just receive.
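
Reusing the sketch above, the common case then looks like this: the endpoint 
implements only receive, and a stray sendWithReply fails through the 
inherited default (again, purely illustrative):

{code}
// Endpoint that only handles one-way messages; receiveAndReply is inherited.
class LogEndpoint extends RpcEndpoint {
  def receive(msg: RpcMessage): Unit =
    println(s"logged: ${msg.payload}")
}

object CommonCaseDemo extends App {
  val ref = new LocalRpcRef(new LogEndpoint)
  ref.send("executor lost")           // fine: handled by receive
  // ref.sendWithReply[String]("ask") // would throw via the default
}
{code}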




> Standardize internal RPC interface
> ----------------------------------
>
>                 Key: SPARK-5124
>                 URL: https://issues.apache.org/jira/browse/SPARK-5124
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Shixiong Zhu
>         Attachments: Pluggable RPC - draft 1.pdf, Pluggable RPC - draft 2.pdf
>
>
> In Spark we use Akka as the RPC layer. It would be great if we can 
> standardize the internal RPC interface to facilitate testing. This will also 
> provide the foundation to try other RPC implementations in the future.


