Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19374#discussion_r144682434
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -373,10 +374,16 @@ class SparkContext(config: SparkConf) extends Logging {
         // log out spark.app.name in the Spark driver logs
         logInfo(s"Submitted application: $appName")
     
    -    // System property spark.yarn.app.id must be set if user code ran by AM on a YARN cluster
    -    if (master == "yarn" && deployMode == "cluster" && !_conf.contains("spark.yarn.app.id")) {
    -      throw new SparkException("Detected yarn cluster mode, but isn't running on a cluster. " +
    -        "Deployment to YARN is not supported directly by SparkContext. Please use spark-submit.")
    +    // System property spark.yarn.app.id must be set if user code ran by AM on a YARN cluster or
    +    // System property spark.mesos.driver.frameworkId must be set if user code ran by
    +    // Mesos Dispatcher on a MESOS cluster
    +    if (deployMode == "cluster") {
    --- End diff --
    
    FWIW, I _believe_ that when we submit a job with the dispatcher, `deployMode` is actually set to `client`, so this logic may not be invoked as expected.
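    
    A minimal sketch of the concern (hypothetical `validate` helper, not Spark's actual code): if the Mesos dispatcher relaunches the driver with `deployMode = "client"`, a guard keyed on `deployMode == "cluster"` never fires, even when `spark.mesos.driver.frameworkId` is missing.
    
    ```scala
    // Sketch of the guard in the proposed diff, reduced to a pure function.
    // The master URL and config keys mirror the PR; the helper itself is illustrative.
    object DeployModeCheck {
      // Returns a validation error when cluster mode is detected but the
      // expected cluster-manager property is absent; None otherwise.
      def validate(master: String, deployMode: String, conf: Map[String, String]): Option[String] = {
        if (deployMode == "cluster") {
          if (master == "yarn" && !conf.contains("spark.yarn.app.id")) {
            Some("Detected yarn cluster mode, but isn't running on a cluster.")
          } else if (master.startsWith("mesos") && !conf.contains("spark.mesos.driver.frameworkId")) {
            Some("Detected Mesos cluster mode, but isn't running on a cluster.")
          } else {
            None
          }
        } else {
          None
        }
      }
    
      def main(args: Array[String]): Unit = {
        // Driver launched by the Mesos dispatcher: deployMode arrives as "client",
        // so the cluster-mode guard is skipped even without the framework id.
        assert(validate("mesos://host:5050", "client", Map.empty).isEmpty)
        // The guard only triggers when deployMode is literally "cluster".
        assert(validate("mesos://host:5050", "cluster", Map.empty).isDefined)
        println("ok")
      }
    }
    ```
    
    If that's right, the new branch would only ever run for Mesos drivers submitted some other way with an explicit `cluster` deploy mode.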


---
