GitHub user LantaoJin opened a pull request:

    https://github.com/apache/spark/pull/19933

    [SPARK-22744][CORE] Add a configuration to show the application submit hostname
    
    ## What changes were proposed in this pull request?
    
    In MapReduce, the submit hostname can be obtained by checking the value of the 
configuration **mapreduce.job.submithostname**. It helps the infra team and other 
people with support work and debugging. But in Spark, this information is not 
included.
    
    So I suggest introducing a new configuration parameter, 
**spark.submit.hostname**, which is set automatically by default but can also be 
set to a user-defined hostname if needed (e.g., using a user node instead of the 
Livy server node).
    
    ## How was this patch tested?
    
    Unit tests.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/LantaoJin/spark SPARK-22744

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19933.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19933
    
----
commit 616a02d3cb90d92712ee6de9647a0ccc81200d2a
Author: LantaoJin <jinlan...@gmail.com>
Date:   2017-12-09T09:05:10Z

    [SPARK-22744][CORE] Add a configuration to show the application submit hostname

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
