[ https://issues.apache.org/jira/browse/SPARK-5124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14349689#comment-14349689 ]
Marcelo Vanzin commented on SPARK-5124:
---------------------------------------

Re: a local Endpoint, if it's really necessary, it would probably be an implementation of {{RpcEnv}} (similar to how Akka implements it). A lot of {{RpcEnv}} probably doesn't make sense for a local Endpoint, but those differences can be abstracted away, e.g. by having a more generic "Env" class from which {{RpcEnv}} extends, or by providing dummy implementations for the things that don't make sense locally. Either way, it shouldn't be hard to do if it's really needed - but from your patch it seems like we can punt that to a different change.

On a different note, I'm not a big fan of the {{RpcResponse}} name. It's not the response per se, but something that allows you to respond to an RPC. Something like {{RpcContext}} or {{RpcCallContext}} sounds more appropriate.

I'm also a little puzzled by {{replyWithSender}}. Why is it needed? In my view of things, when the caller receives a reply to an RPC, it already knows to whom the RPC was sent, so this method doesn't seem necessary. Unless I misunderstood what it is.

> Standardize internal RPC interface
> ----------------------------------
>
>                 Key: SPARK-5124
>                 URL: https://issues.apache.org/jira/browse/SPARK-5124
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Shixiong Zhu
>         Attachments: Pluggable RPC - draft 1.pdf, Pluggable RPC - draft 2.pdf
>
> In Spark we use Akka as the RPC layer. It would be great if we could
> standardize the internal RPC interface to facilitate testing. This will also
> provide the foundation to try other RPC implementations in the future.
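To make the "generic Env" split concrete, here is a minimal sketch in Scala. Everything below ({{Env}}, {{LocalEnv}}, {{EndpointRef}}, and their method names) is hypothetical and only illustrates the shape of the abstraction; it is not Spark's actual API, and a real implementation would need threading, lifecycle, and error handling.

{code:scala}
import scala.concurrent.Future

// Hypothetical stand-ins for Spark's endpoint types.
trait Endpoint {
  def receive(message: Any): Unit
}

case class EndpointRef(name: String, env: Env) {
  def send(message: Any): Unit = env.deliver(name, message)
}

// The generic base: only what both local and remote delivery need.
trait Env {
  def register(name: String, endpoint: Endpoint): EndpointRef
  def deliver(name: String, message: Any): Unit
}

// The RPC layer adds the remote-only surface on top of Env.
trait RpcEnv extends Env {
  def lookupRemote(address: String, name: String): Future[EndpointRef]
  def shutdown(): Unit
}

// A purely local Env: in-process dispatch, no networking involved.
class LocalEnv extends Env {
  private val endpoints =
    scala.collection.concurrent.TrieMap.empty[String, Endpoint]

  override def register(name: String, endpoint: Endpoint): EndpointRef = {
    endpoints.put(name, endpoint)
    EndpointRef(name, this)
  }

  override def deliver(name: String, message: Any): Unit =
    endpoints.get(name).foreach(_.receive(message))
}
{code}

The alternative mentioned in the comment - a single {{RpcEnv}} with dummy implementations for remote-only operations - would instead have {{LocalEnv}} extend {{RpcEnv}} and, say, throw {{UnsupportedOperationException}} from {{lookupRemote}}.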
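Similarly, a hypothetical {{RpcCallContext}} along the lines suggested above: not the response itself, but the handle an endpoint uses to respond to one in-flight RPC. The names and signatures here are illustrative only, not the interface under review.

{code:scala}
trait RpcCallContext {
  // Where the RPC came from. The callee can inspect this, but the
  // caller already knows whom it called - which is why a
  // replyWithSender-style method arguably adds nothing.
  def senderAddress: String

  // Complete the RPC successfully with a response value.
  def reply(response: Any): Unit

  // Complete the RPC with a failure that propagates to the caller.
  def sendFailure(cause: Throwable): Unit
}

// Example: an endpoint handler answering an RPC through the context.
def receiveAndReply(message: Any, context: RpcCallContext): Unit =
  message match {
    case "ping" => context.reply("pong")
    case other  =>
      context.sendFailure(new IllegalArgumentException(s"unexpected: $other"))
  }
{code}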