[ https://issues.apache.org/jira/browse/SPARK-3875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17044467#comment-17044467 ]

Shyam commented on SPARK-3875:
------------------------------

[~srowen], I am still getting this error. It is suggested that 
https://issues.apache.org/jira/browse/SPARK-26825 is the fix, but how do I implement it 
in my Spark Streaming application?
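
Not the SPARK-26825 patch itself, but until that lands, a common workaround is to point 
Spark's scratch space away from /tmp inside the application. A minimal sketch, assuming 
spark.local.dir and java.io.tmpdir are the directories filling up (paths are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("MyStreamingApp")
  // Scratch space for shuffle spills, block-manager files, etc.
  // (on YARN/standalone the cluster manager's LOCAL_DIRS / SPARK_LOCAL_DIRS
  // take precedence over this setting)
  .set("spark.local.dir", "/data/spark-local")
  // Redirect java.io.tmpdir for the executor JVMs; for the driver JVM this
  // has to be passed at launch time, e.g.
  //   spark-submit --conf "spark.driver.extraJavaOptions=-Djava.io.tmpdir=/data/spark-tmp" ...
  .set("spark.executor.extraJavaOptions", "-Djava.io.tmpdir=/data/spark-tmp")

val ssc = new StreamingContext(conf, Seconds(10))
// ... define the streaming job, then ssc.start() and ssc.awaitTermination()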

> Add TEMP DIRECTORY configuration
> --------------------------------
>
>                 Key: SPARK-3875
>                 URL: https://issues.apache.org/jira/browse/SPARK-3875
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Patrick Liu
>            Priority: Major
>
> Currently, Spark uses "java.io.tmpdir" to locate the /tmp/ directory.
> The /tmp/ directory is then used to:
> 1. Set up the HTTP file server
> 2. Hold the broadcast directory
> 3. Let executors fetch dependency files and jars
> The size of the /tmp/ directory keeps growing, so the free space on the 
> system disk shrinks.
> I think we could add a configuration "spark.tmp.dir" in conf/spark-env.sh or 
> conf/spark-defaults.conf to set this directory explicitly, for example pointing it 
> to a data disk.
> If "spark.tmp.dir" is not set, fall back to the default "java.io.tmpdir".


