Either ahead of time or just in time. Just like a normal Spark closure.
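
To make that concrete, here is a rough sketch of the "just in time" flavor
as it can be approximated today with an ordinary closure (no new API
involved). It only reaches executors that happen to pick up a task, which
is exactly the gap a first-class executor hook would close; the names
below are just for illustration:

    import org.apache.spark.sql.SparkSession

    object ClosureSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("closure-sketch").getOrCreate()
        val sc = spark.sparkContext

        // Over-provision tasks so every executor is likely to run at least
        // one; this is an approximation, not a guarantee.
        val slots = sc.defaultParallelism * 4
        sc.parallelize(1 to slots, slots).foreachPartition { _ =>
          // Arbitrary per-executor setup, e.g. attach instrumentation.
          println(s"setup on ${java.net.InetAddress.getLocalHost.getHostName}")
        }
      }
    }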

On Fri, Aug 31, 2018 at 10:18 AM Nihar Sheth <nsh...@cloudera.com> wrote:

> Hi @rxin,
>
> Just to make sure I understand your desired use case, are you suggesting a
> way (for the driver) to call, at any given time, a general method that can
> be defined ahead of time on the executors?
>
> On Thu, Aug 30, 2018 at 11:11 PM, Reynold Xin <r...@databricks.com> wrote:
>
>> I actually had a similar use case a while ago, though not exactly the
>> same. In my use case, Spark is already up, but I want to make sure all
>> existing (and new) executors run some specific code. Can we update the
>> API to support that? I think that's doable if we split the design into
>> two parts: the first is the ability to do what I just mentioned, and the
>> second is the ability to register, via config, a class whose code runs
>> when Spark starts.
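
To sketch that second, "register ahead of time" flavor: assuming the
interface proposed in the SPIP ends up with simple lifecycle hooks, a
plugin might look roughly like the following. The trait, method names, and
config key are placeholders, not a finalized API:

    // Placeholder shape for the proposed executor plugin; the real
    // interface is whatever SPARK-24918 settles on.
    trait ExecutorPlugin {
      def init(): Unit
      def shutdown(): Unit
    }

    class MemoryMonitorPlugin extends ExecutorPlugin {
      // Called once when the executor starts.
      override def init(): Unit = {
        // e.g. start a daemon thread that polls memory and reports metrics
      }
      // Called when the executor exits.
      override def shutdown(): Unit = {
        // stop the monitoring thread, flush metrics, etc.
      }
    }

    // Registered before the application starts, e.g.:
    //   spark-submit --conf spark.executor.plugins=com.example.MemoryMonitorPlugin ...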
>>
>>
>> On Thu, Aug 30, 2018 at 11:01 PM Felix Cheung <felixcheun...@hotmail.com>
>> wrote:
>>
>>> +1
>>> ------------------------------
>>> From: Mridul Muralidharan <mri...@gmail.com>
>>> Sent: Wednesday, August 29, 2018 1:27:27 PM
>>> To: dev@spark.apache.org
>>> Subject: Re: SPIP: Executor Plugin (SPARK-24918)
>>>
>>> +1
>>> I left a couple of comments in NiharS's PR, but this is very useful to
>>> have in Spark!
>>>
>>> Regards,
>>> Mridul
>>> On Fri, Aug 3, 2018 at 10:00 AM Imran Rashid
>>> <iras...@cloudera.com.invalid> wrote:
>>> >
>>> > I'd like to propose adding a plugin API for executors, primarily for
>>> > instrumentation and debugging
>>> > (https://issues.apache.org/jira/browse/SPARK-24918). The changes are
>>> > small, but since it's adding a new API, it might be SPIP-worthy. I
>>> > mentioned it as well in a recent email I sent about memory monitoring.
>>> >
>>> > The SPIP proposal is here (and attached to the JIRA as well):
>>> > https://docs.google.com/document/d/1a20gHGMyRbCM8aicvq4LhWfQmoA5cbHBQtyqIA2hgtc/edit?usp=sharing
>>> >
>>> > There are already some comments on the JIRA and PR, and I hope to get
>>> > more thoughts and opinions on it.
>>> >
>>> > thanks,
>>> > Imran
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>>
>
