Adding dev list

On Jul 13, 2016 5:38 PM, "Noorul Islam K M" <noo...@noorul.com> wrote:
> Spark version: 1.6.1
> Cluster Manager: Standalone
>
> I am experimenting with cluster mode deployment along with supervise for
> high availability of streaming applications.
>
> 1. Submit a streaming job in cluster mode with supervise.
> 2. Say the driver is scheduled on worker1. The app starts successfully.
> 3. Kill the worker1 java process. This does not kill the driver process,
> and hence the application (context) is still alive.
> 4. Because of the supervise flag, the driver gets rescheduled to a new
> worker, worker2, and a new context is created, making it a duplicate.
>
> I think this is a bug.
>
> Regards,
> Noorul
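For reference, the scenario above corresponds to a submission like the following sketch; the master hostname, application class, jar path, and PID are placeholders, not from the original report:

```shell
# Submit a streaming app to a standalone cluster in cluster mode with
# driver supervision (hostname, class, and jar path are placeholders).
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.StreamingApp \
  /path/to/streaming-app.jar

# To reproduce: find the worker hosting the driver (worker1) and kill its
# Worker JVM. The driver JVM it launched keeps running, while the Master,
# seeing the worker gone, relaunches the supervised driver on worker2 --
# leaving two live streaming contexts.
kill -9 "$WORKER1_JAVA_PID"  # PID of worker1's Worker JVM (placeholder)
```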