Github user YanTangZhai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3810#discussion_r22776305
  
    --- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -55,13 +57,9 @@ private[spark] class Client(
        * ------------------------------------------------------------------------------------- */
     
       /**
    -   * Submit an application running our ApplicationMaster to the ResourceManager.
    -   *
    -   * The stable Yarn API provides a convenience method (YarnClient#createApplication) for
    -   * creating applications and setting up the application submission context. This was not
    -   * available in the alpha API.
    +   * Create an application running our ApplicationMaster to the ResourceManager.
        */
    -  override def submitApplication(): ApplicationId = {
    +  override def createApplication(): ApplicationId = {
    --- End diff ---
    
    SparkContext first gets the applicationId from the taskScheduler and uses it to
    initialize the blockManager and eventLogger. Only after that does the dagScheduler
    run jobs and submit resource requests to the cluster master. That is why getting the
    applicationId and submitting resource requests to the cluster master are split into
    two methods.
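
    For illustration, a minimal sketch of that split (the trait and the second method
    name below are hypothetical, not from this PR; only createApplication() appears in
    the diff):

        import org.apache.hadoop.yarn.api.records.ApplicationId

        trait YarnClientLike {
          // Phase 1: create the application early so the ApplicationId is available
          // for initializing the blockManager and eventLogger.
          def createApplication(): ApplicationId

          // Phase 2 (hypothetical name): submit the actual resource requests to the
          // cluster master once the dagScheduler starts running jobs.
          def submitResourceRequests(appId: ApplicationId): Unit
        }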

