GitHub user lgrcyanny opened a pull request:

    https://github.com/apache/spark/pull/19079

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode

    ## What changes were proposed in this pull request?
    When SparkFiles.get is used to look up a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception. The exception happens only on the driver; SparkFiles.get on executors works fine.
    The bug can be reproduced as follows:
    ```scala
    import java.io.File
    import scala.io.Source

    import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

    val conf = new SparkConf().setAppName("SparkFilesTest")
    val sc = new SparkContext(conf)

    def testOnDriver(fileName: String): Unit = {
        val file = new File(SparkFiles.get(fileName))
        if (!file.exists()) {
            println(s"$file does not exist")
        } else {
            // print the file content on the driver
            val content = Source.fromFile(file).getLines().mkString("\n")
            println(s"File content: $content")
        }
    }

    // fileName matches the --files argument used at submit time
    testOnDriver("README.md")
    // in yarn-client or yarn-cluster mode this prints "... does not exist"
    ```
    
    ```python
    import os

    from pyspark import SparkConf, SparkContext, SparkFiles

    conf = SparkConf().setAppName("SparkFilesTest")
    sc = SparkContext(conf=conf)

    def test_on_driver(filename):
        path = SparkFiles.get(filename)
        print("file path: {}".format(path))
        if os.path.exists(path):
            with open(path) as f:
                print(f.readlines())
        else:
            print("file doesn't exist")
            os.system("ls .")  # list the working directory for debugging

    # filename matches the --files argument used at submit time
    test_on_driver("README.md")
    ```
    Again, the output shows that the file does not exist.
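
    The failure comes down to how SparkFiles.get resolves paths on the driver. Below is a paraphrased sketch of that resolution logic (based on Spark's SparkFiles.scala; the object name SparkFilesSketch is ours, and the exact fallback should be treated as an assumption rather than a quote of the source):

    ```scala
    import java.io.File
    import org.apache.spark.SparkEnv

    // Paraphrased sketch of org.apache.spark.SparkFiles, not the exact source;
    // driverTmpDir is package-private in Spark, so this only compiles inside
    // the org.apache.spark package and is shown purely for illustration.
    object SparkFilesSketch {
      // Resolve the absolute path of a file distributed with the application.
      def get(filename: String): String =
        new File(getRootDirectory(), filename).getAbsolutePath()

      // On executors this is the directory files are fetched into; on the
      // driver it falls back to a driver temp dir (or "."). On YARN, the
      // --files payload is localized by the distributed cache and never
      // lands in that driver-side root.
      def getRootDirectory(): String =
        SparkEnv.get.driverTmpDir.getOrElse(".")
    }
    ```

    So on a YARN driver the resolved path is rooted in a directory that never receives the --files payload, and the lookup fails even though the files were shipped with the application.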
    
    ## How was this patch tested?
    
    Tested with integration tests and manual tests: the demo cases were submitted in yarn-cluster and yarn-client mode and the test results verified. The test commands are:
    ```
    ./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
    ./bin/spark-submit --master yarn-client --files README.md test_get_files.py
    ```
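
    As a side note, the yarn-cluster and yarn-client master URLs have been deprecated since Spark 2.0; the equivalent invocations in the current syntax (otherwise the same flags) would be:
    ```
    ./bin/spark-submit --master yarn --deploy-mode cluster --files README.md --class "testing.SparkFilesTest" testing.jar
    ./bin/spark-submit --master yarn --deploy-mode client --files README.md --class "testing.SparkFilesTest" testing.jar
    ```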


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/lgrcyanny/spark fix-yarn-files-problem

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19079.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19079
    
----
commit 3f0e4a88bdb7156b5db7cfb56cd079d4b0de3a5b
Author: lgrcyanny <lgrcya...@gmail.com>
Date:   2017-05-07T12:51:55Z

    [SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode
    
    Change-Id: I22034f99f571a451b862c1806b7f9350c6133c95

----

