I need to plug in my own scheduler so Spark can target a proprietary remote execution framework.
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L2152 I'm looking at where it decides on the backend, and it doesn't look like there is a hook for supplying a custom one. Of course I could extend SparkContext and add my own, but that seems sort of lame. Does anyone know of a better way? (Or maybe I'm just missing something obvious.)