Github user yaooqinn commented on the issue:

    https://github.com/apache/spark/pull/18666
  
    @gatorsmile would you please take a look at this?
    
    This PR mainly wants to close `HiveSessionState` explicitly in order to delete `hive.downloaded.resources.dir`, which defaults to `"${system:java.io.tmpdir}" + File.separator + "${hive.session.id}_resources"`, and `hive.exec.local.scratchdir`, which defaults to `"${system:java.io.tmpdir}" + File.separator + "${system:user.name}"`, along with some other directories that are used only for Hive but have no deletion hook registered on shutdown.
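
    As a side note, here is a minimal sketch (not part of this PR) of how one could check what these two properties resolve to at runtime, assuming a Hive-enabled classpath:

    ```scala
    import org.apache.hadoop.hive.conf.HiveConf
    import org.apache.hadoop.hive.conf.HiveConf.ConfVars

    // Sketch: resolve the two directories a Hive session creates.
    // With the defaults, both land under java.io.tmpdir, e.g.
    //   /tmp/${hive.session.id}_resources and /tmp/${user.name}
    val conf = new HiveConf()
    println(HiveConf.getVar(conf, ConfVars.DOWNLOADED_RESOURCES_DIR))
    println(HiveConf.getVar(conf, ConfVars.LOCALSCRATCHDIR))
    ```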
    
    The code below shows how `HiveSessionState` creates `hive.downloaded.resources.dir`; note that the `isCleanUp` argument is set to `false`:
    
    ```java
    // 3. Download resources dir
    path = new Path(HiveConf.getVar(conf, HiveConf.ConfVars.DOWNLOADED_RESOURCES_DIR));
    createPath(conf, path, scratchDirPermission, true, false /* isCleanUp */);
    ```
    Plenty of unused directories are left behind after submitting many Hive-enabled Spark applications:
    ![popo_2018-03-20 
10-28-34](https://user-images.githubusercontent.com/8326978/37632505-7eacbec2-2c29-11e8-94b5-229ba193339f.jpg)
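
    For illustration only, and not necessarily the exact approach taken in this PR, a cleanup along these lines, hooked to JVM shutdown, would remove those session directories instead of leaking them:

    ```scala
    import org.apache.hadoop.hive.ql.session.SessionState

    // Illustrative sketch: close the Hive SessionState on JVM shutdown so
    // its session-scoped temp dirs (downloaded resources dir, local scratch
    // dir, ...) are cleaned up instead of piling up under /tmp.
    sys.addShutdownHook {
      val ss = SessionState.get()
      if (ss != null) {
        ss.close()
      }
    }
    ```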


