[ 
https://issues.apache.org/jira/browse/SPARK-8850?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Herman van Hovell updated SPARK-8850:
-------------------------------------
    Attachment: open

Dump of all open files after a {{Too many open files}} error.

The command used to make the dump:
{noformat}
lsof -c java > open
{noformat}

The job starts crashing as soon as I start sorting 10000000 rows for the 
9th time (I am benchmarking). I guess files are left open after every 
benchmark run? Is there a way to trigger the closing of these files?
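For reference, a simple way to watch the descriptor count grow between benchmark runs (plain {{ulimit}}/{{lsof}} usage, nothing Spark-specific; filtering on the {{java}} process name is an assumption about the setup):
{noformat}
# Current per-process soft limit on open file descriptors
ulimit -n

# Approximate count of descriptors held by the JVM processes after each run
lsof -c java | wc -l
{noformat}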

> Turn unsafe mode on by default
> ------------------------------
>
>                 Key: SPARK-8850
>                 URL: https://issues.apache.org/jira/browse/SPARK-8850
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>            Reporter: Reynold Xin
>            Assignee: Josh Rosen
>             Fix For: 1.5.0
>
>         Attachments: open
>
>
> Let's turn unsafe on and see what bugs we find in preparation for 1.5.


