[ https://issues.apache.org/jira/browse/SPARK-22575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16348548#comment-16348548 ]
Marco Gaido edited comment on SPARK-22575 at 2/1/18 1:06 PM:
-------------------------------------------------------------

I am not able to reproduce the issue. May I ask you to provide an easy way to reproduce it? I have run thousands of queries without any issue. If not, can you try and add

{code}
log4j.logger.org.apache.spark.storage=DEBUG
{code}

to your log4j.properties and share the logs? Thanks.

was (Author: mgaido):
I am not able to reproduce the issue. May I ask you to provide an easy way to reproduce it? I have run thousands of queries without any issue. If not, can you try and add

```
log4j.logger.org.apache.spark.storage=DEBUG
```

to your log4j.properties and share the logs? Thanks.

> Making Spark Thrift Server clean up its cache
> ---------------------------------------------
>
>                 Key: SPARK-22575
>                 URL: https://issues.apache.org/jira/browse/SPARK-22575
>             Project: Spark
>          Issue Type: Improvement
>          Components: Block Manager, SQL
>    Affects Versions: 2.2.0
>            Reporter: Oz Ben-Ami
>            Priority: Minor
>              Labels: cache, dataproc, thrift, yarn
>
> Currently, Spark Thrift Server accumulates data in its appcache, even for old
> queries. This fills up the disk (using over 100GB per worker node) within
> days, and the only way to clear it is to restart the Thrift Server
> application. Even deleting the files directly isn't a solution, as Spark then
> complains about FileNotFound.
> I asked about this on [Stack
> Overflow|https://stackoverflow.com/questions/46893123/how-can-i-make-spark-thrift-server-clean-up-its-cache]
> a few weeks ago, but it does not seem to be currently doable by
> configuration.
> Am I missing some configuration option, or some other factor here?
> Otherwise, can anyone point me to the code that handles this, so maybe I can
> try my hand at a fix?
> Thanks!
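For context, the property suggested in the comment above slots into a standard log4j 1.x properties file (the version bundled with Spark 2.2). A minimal sketch of what the resulting log4j.properties might look like follows; the root logger level and the console appender here are illustrative defaults, not taken from the reporter's actual configuration:

```
# Illustrative root logger and console appender (log4j 1.x syntax)
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# The line suggested in the comment: verbose logging for the block/storage layer,
# which manages the appcache blocks the reporter sees accumulating
log4j.logger.org.apache.spark.storage=DEBUG
```

With this in place, block-manager activity (block puts, evictions, and removals) is logged at DEBUG, which is the information the commenter is asking the reporter to share.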
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org