I think failover isn't enabled on the regular Spark job framework, since we assume 
jobs are more ephemeral.

A setting to enable failover could be a good addition to the Spark framework.
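To make that concrete, here is a minimal sketch of what enabling failover looks like at the Mesos level, using the Mesos Java bindings (org.apache.mesos.Protos). The FailoverExample object, the framework name, and the one-week timeout are illustrative assumptions, not Spark's actual scheduler code. The idea is that a non-zero failover_timeout tells the master to wait for the scheduler to re-register instead of tearing the framework down, and re-registering as the same framework means passing back the FrameworkID from the original registration.

    import org.apache.mesos.Protos.{FrameworkID, FrameworkInfo}

    object FailoverExample {
      // Build a FrameworkInfo with failover enabled. With the default
      // failover_timeout of 0, the master removes the framework as soon as
      // the scheduler disconnects; a non-zero timeout gives the scheduler a
      // window to reconnect and reclaim its running tasks.
      def frameworkInfo(previousId: Option[String]): FrameworkInfo = {
        val builder = FrameworkInfo.newBuilder()
          .setUser("")                          // empty user: Mesos fills in the current user
          .setName("spark-example")             // illustrative framework name
          .setFailoverTimeout(7 * 24 * 3600.0)  // seconds the master waits for re-registration
          .setCheckpoint(true)                  // let agents recover tasks across restarts

        // Re-registering as the *same* framework requires supplying the
        // FrameworkID assigned on the first registration.
        previousId.foreach(id => builder.setId(FrameworkID.newBuilder().setValue(id)))
        builder.build()
      }
    }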

Tim

> On Mar 30, 2017, at 10:18 AM, Yu Wei <yu20...@hotmail.com> wrote:
> 
> Hi guys,
> 
> I encountered a problem with Spark on Mesos.
> 
> I set up a Mesos cluster and launched a Spark framework on Mesos successfully.
> 
> Then the Mesos master was killed and restarted.
> 
> However, the Spark framework didn't re-register with the restarted master the way the Mesos agents did. 
> I also couldn't find any error logs.
> 
> And MesosClusterDispatcher is still running there.
> 
> I suspect this is a Spark framework issue. 
> What's your opinion?
> 
> 
> Thanks,
> 
> Jared, (韦煜)
> Software developer
> Interested in open source software, big data, Linux
