Thanks!

If I have a cluster with more than one node (standalone or YARN), can I stop and 
start any single node among them and keep the job running?

Best regards/祝好,

Chang Liu 刘畅


> On 7 Nov 2018, at 16:17, 秦超峰 <18637156...@163.com> wrote:
> 
> the second
> 
> 
> 
>       
> 秦超峰
> Email: windyqinchaof...@163.com
> 
> On 11/07/2018 17:14, Chang Liu <fluency...@gmail.com> wrote:
> Hi,
> 
> I have a question about whether the currently running jobs will restart if I 
> stop and start the Flink cluster.
> 
> 1. Let’s say I have a standalone, single-node cluster.
> 2. I have several Flink jobs already running on the cluster.
> 3. If I run bin/stop-cluster.sh and then bin/start-cluster.sh, will the 
> previously running jobs restart again?
> 
> OR
> 
> Before I run bin/stop-cluster.sh, do I have to trigger a savepoint for each job? 
> And after bin/start-cluster.sh has finished, do I have to restart each job I want 
> to resume from the savepoint triggered earlier?
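> 
> For concreteness, I imagine the savepoint-based flow would look roughly like 
> the following (the job ID, savepoint directory, and job jar path below are 
> just placeholders):
> 
>     # Trigger a savepoint for a running job before stopping the cluster
>     bin/flink savepoint <jobId> hdfs:///flink/savepoints
> 
>     # After the cluster is back up, resume the job from the savepoint path
>     # that the savepoint command printed
>     bin/flink run -s <savepointPath> <path-to-job-jar>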
> 
> Many thanks in advance :)
> 
> Best regards/祝好,
> 
> Chang Liu 刘畅
> 
> 
