I think it's really a mortal blow to Livy for the REPL use case. What I can do, I think, is monitor Spark metrics: when the driver's memory usage reaches a high level, I will isolate (kill) the session.
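A minimal sketch of that workaround, assuming the standard Spark monitoring REST endpoint (`/api/v1/applications/{appId}/executors`) and the Livy sessions endpoint (`DELETE /sessions/{sessionId}`); the host names, the session ID, and the 90% threshold below are assumptions, not values from this thread:

```python
# Sketch: poll the Spark driver's memory metrics and delete the Livy
# session once heap usage crosses a threshold. Endpoint paths follow the
# Spark monitoring API and Livy REST API; hosts/IDs/threshold are assumed.
import urllib.request

LIVY_URL = "http://livy-host:8998"    # assumed Livy server address
THRESHOLD = 0.90                      # isolate session above 90% heap use

def driver_memory_ratio(executors):
    """Given the JSON list from /api/v1/applications/{appId}/executors,
    return memoryUsed/maxMemory for the driver entry."""
    for e in executors:
        if e["id"] == "driver":
            return e["memoryUsed"] / e["maxMemory"]
    raise ValueError("no driver entry in executor list")

def should_isolate(executors, threshold=THRESHOLD):
    # Decision logic kept separate from I/O so it can be tested offline.
    return driver_memory_ratio(executors) >= threshold

def kill_session(session_id):
    # Livy REST API: DELETE /sessions/{sessionId} stops the session.
    req = urllib.request.Request(
        f"{LIVY_URL}/sessions/{session_id}", method="DELETE")
    return urllib.request.urlopen(req).status

# Decision against a sample executors payload (driver heap 95% full):
sample = [{"id": "driver",
           "memoryUsed": 950_000_000,
           "maxMemory": 1_000_000_000}]
print(should_isolate(sample))  # 0.95 >= 0.90, so the session is isolated
```

In a real deployment the executors payload would come from the driver's UI port (4040 by default) or the history server, fetched on a timer; the separation of `should_isolate` from `kill_session` keeps the threshold logic testable without a cluster.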
2018-11-14
lk_hadoop

From: "Harsch, Tim" <tim.har...@teradata.com>
Sent: 2018-11-14 05:52
Subject: Re: about LIVY-424
To: "user" <user@livy.incubator.apache.org>
Cc:

While it's true that LIVY-424 creates a session leak due to a REPL leak in Scala, it's not the only thing that can. I've run hundreds of simple Scala commands and the leak is only mild to moderate. However, some Scala commands can be really problematic. For instance:

import org.apache.spark.sql._

Run this import repeatedly, and within only tens of executions your session's performance will degrade and it will eventually run out of memory.

From: lk_hadoop <lk_had...@163.com>
Sent: Sunday, November 11, 2018 5:37:34 PM
To: user
Subject: about LIVY-424

[External Email]

hi, all:
I've hit this issue: https://issues.apache.org/jira/browse/LIVY-424 . Does anybody know how to resolve it?

2018-11-12
lk_hadoop