GitHub user tgravescs opened a pull request:

    https://github.com/apache/spark/pull/14999

    [SPARK-17433] YarnShuffleService doesn't handle moving credentials levelDb 

    The secrets leveldb isn't moved if you run the Spark shuffle service 
without YARN NM recovery enabled and then turn recovery on.  This fixes that.  I 
unfortunately missed this when I ported the patch from our internal branch 2 to 
the master branch, due to the changes for the recovery path.  Note this only 
applies to master, since it is the only branch where the YARN NM recovery dir is used.
    
    Unit tests ran, and the change was tested on an 8-node cluster: fresh 
startup with NM recovery, fresh startup without NM recovery, and switching 
between the two.  Also ran applications to make sure they weren't affected by a 
rolling upgrade.
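    The fix amounts to a one-time migration step: when the NM recovery 
directory is configured, check whether the secrets leveldb still sits at the 
old (non-recovery) location and, if so, move it into the recovery dir before 
opening it.  A minimal sketch of that idea (the helper name, file names, and 
paths below are illustrative, not the actual YarnShuffleService code):

    ```java
    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;

    public class DbMigrationSketch {
        /**
         * Hypothetical helper: if the DB file exists at its old location but
         * not yet under the recovery dir, move it there; either way, return
         * the path the service should open going forward.
         */
        static File initRecoveryDb(File oldDbFile, File recoveryDir, String dbName)
                throws IOException {
            File recoveryFile = new File(recoveryDir, dbName);
            if (oldDbFile.exists() && !recoveryFile.exists()) {
                // One-time move when NM recovery is first switched on.
                Files.move(oldDbFile.toPath(), recoveryFile.toPath());
            }
            return recoveryFile;
        }

        public static void main(String[] args) throws IOException {
            // Simulate the old (non-recovery) location holding the secrets DB.
            File oldDir = Files.createTempDirectory("nm-local").toFile();
            File recoveryDir = Files.createTempDirectory("nm-recovery").toFile();
            File oldDb = new File(oldDir, "sparkShuffleRecovery.ldb");
            Files.write(oldDb.toPath(), "secrets".getBytes());

            File db = initRecoveryDb(oldDb, recoveryDir, "sparkShuffleRecovery.ldb");
            System.out.println("moved=" + db.exists() + " oldGone=" + !oldDb.exists());
        }
    }
    ```

    On a fresh startup with recovery already on, the old file simply doesn't 
exist and the helper is a no-op, which matches the switching scenarios tested 
above.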

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tgravescs/spark SPARK-17433

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14999.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #14999
    
----
commit 65de8531ccb91287f5a8a749c7819e99533b9440
Author: Thomas Graves <[email protected]>
Date:   2016-09-07T17:38:37Z

    [SPARK-17433] YarnShuffleService doesn't handle moving credentials levelDb

commit ea0ed0eb6419bbffc4ac21d1834fcb24e14ab743
Author: Thomas Graves <[email protected]>
Date:   2016-09-07T18:55:49Z

    Add file exist test to make sure moved

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
