[ https://issues.apache.org/jira/browse/SPARK-14261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15270290#comment-15270290 ]
Oleg Danilov commented on SPARK-14261:
--------------------------------------

I've reproduced this issue as follows:

test.sql
{code}
add jar nexr-hive-udf-0.2-SNAPSHOT.jar;
CREATE TEMPORARY FUNCTION decoder AS 'com.nexr.platform.hive.udf.GenericUDFDecode';
{code}

test.sh
{code}
for i in {1..600}; do
  beeline -u jdbc:hive2://u1:10000 -f test.sql
done
{code}

After running it, I found that the following HashMap in CommandProcessorFactory contained 600 HiveConf instances, preventing them from being garbage collected:

{code}
private static final Map<HiveConf, Driver> mapDrivers =
    Collections.synchronizedMap(new HashMap<HiveConf, Driver>());
{code}

It seems ClientWrapper should remove these instances, something like:

{code}
  // Throw an exception if there is an error in query processing.
  if (response.getResponseCode != 0) {
+   CommandProcessorFactory.clean(conf)
    driver.close()
    throw new QueryExecutionException(response.getErrorMessage)
  }
  driver.setMaxRows(maxRows)
  val results = shim.getDriverResults(driver)
+ CommandProcessorFactory.clean(conf)
  driver.close()
  results
{code}

> Memory leak in Spark Thrift Server
> ----------------------------------
>
>                 Key: SPARK-14261
>                 URL: https://issues.apache.org/jira/browse/SPARK-14261
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Xiaochun Liang
>        Attachments: 16716_heapdump_64g.PNG, 16716_heapdump_80g.PNG, MemorySnapshot.PNG
>
> I am running the Spark Thrift server on Windows Server 2012, launched in YARN client mode. Its memory usage grows gradually as queries come in, so I suspect a memory leak in the Spark Thrift server.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
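To make the retention pattern concrete, here is a minimal, self-contained Java sketch of the leak and the proposed fix. `FakeConf`, `FakeDriver`, `get`, `clean`, and `runQuery` are hypothetical stand-ins for `HiveConf`, `Driver`, and the `ClientWrapper` code path, not the actual Hive classes; only the `mapDrivers` declaration mirrors the real code quoted above. The point is that a static map keyed by per-session config objects pins every entry until it is removed explicitly.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class DriverMapLeakDemo {
    // Hypothetical stand-ins for HiveConf and Driver, purely for illustration.
    static class FakeConf { final int id; FakeConf(int id) { this.id = id; } }
    static class FakeDriver { void close() {} }

    // Mirrors CommandProcessorFactory.mapDrivers: each new conf adds an entry
    // that is retained forever unless clean(conf) removes it.
    static final Map<FakeConf, FakeDriver> mapDrivers =
        Collections.synchronizedMap(new HashMap<FakeConf, FakeDriver>());

    static FakeDriver get(FakeConf conf) {
        return mapDrivers.computeIfAbsent(conf, c -> new FakeDriver());
    }

    static void clean(FakeConf conf) {
        mapDrivers.remove(conf);
    }

    static void runQuery(FakeConf conf, boolean clean) {
        FakeDriver driver = get(conf);
        try {
            // ... execute the statement ...
        } finally {
            if (clean) clean(conf);   // the proposed fix: drop the map entry
            driver.close();           // closing the driver alone does NOT shrink the map
        }
    }

    public static void main(String[] args) {
        // Without clean(): 600 sessions leave 600 entries pinning their confs,
        // matching the heap observation in the comment above.
        for (int i = 0; i < 600; i++) runQuery(new FakeConf(i), false);
        System.out.println(mapDrivers.size());  // 600

        mapDrivers.clear();

        // With clean(): the map stays empty, so each conf is collectable.
        for (int i = 0; i < 600; i++) runQuery(new FakeConf(i), true);
        System.out.println(mapDrivers.size());  // 0
    }
}
```

Because `FakeConf` uses identity equality (like a fresh `HiveConf` per session), every run inserts a distinct key, which is exactly why the real map grew to 600 entries in the reproduction.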