Hi all,
We are currently exploring Zeppelin's features, and we use YARN to manage our Spark jobs. From our experiments we conclude that each interpreter corresponds to one application in the YARN cluster; that is, all Zeppelin notebooks that share the same interpreter run through a single YARN application. We also found that if our code shuts down that application, then no notebook can run from that point on, failing with an error like "can't call a stop spark context …". The only solution we have found is to restart the interpreter. How can we get around this without restarting the interpreter? Thanks, AL
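To make the failure mode concrete, here is a minimal sketch in plain Python (no Spark involved; `SharedContext` and its methods are hypothetical stand-ins) of why stopping the context from one notebook breaks every other notebook bound to the same interpreter:

```python
class SharedContext:
    """Stand-in for the single SparkContext owned by one interpreter."""
    def __init__(self):
        self.stopped = False

    def run_job(self, name):
        # Every notebook on this interpreter goes through this one object.
        if self.stopped:
            raise RuntimeError("cannot run '%s': context is stopped" % name)
        return "%s: ok" % name

    def stop(self):
        self.stopped = True


# One interpreter -> one shared context -> one YARN application.
ctx = SharedContext()
print(ctx.run_job("notebook-A"))   # succeeds while the context is alive

ctx.stop()                         # some notebook's code calls stop()

try:
    ctx.run_job("notebook-B")      # every later notebook now fails
except RuntimeError as err:
    print("error:", err)

# Recreating the context object is the only remedy in this sketch,
# which is effectively what restarting the interpreter does.
```

This is only an illustration of the shared-state problem, not a workaround; the question of avoiding the interpreter restart itself remains open.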