vanzin commented on a change in pull request #19616: [SPARK-22404][YARN] Provide an option to use unmanaged AM in yarn-client mode
URL: https://github.com/apache/spark/pull/19616#discussion_r240422378
 
 

 ##########
 File path: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
 ##########
 @@ -481,20 +478,29 @@ private[spark] class ApplicationMaster(args: ApplicationMasterArguments) extends
   }
 
   private def runExecutorLauncher(): Unit = {
-    val hostname = Utils.localHostName
-    val amCores = sparkConf.get(AM_CORES)
 -    rpcEnv = RpcEnv.create("sparkYarnAM", hostname, hostname, -1, sparkConf, securityMgr,
-      amCores, true)
-
 -    // The client-mode AM doesn't listen for incoming connections, so report an invalid port.
 -    registerAM(hostname, -1, sparkConf, sparkConf.getOption("spark.driver.appUIAddress"))
-
 -    // The driver should be up and listening, so unlike cluster mode, just try to connect to it
-    // with no waiting or retrying.
-    val (driverHost, driverPort) = Utils.parseHostPort(args.userArgs(0))
-    val driverRef = rpcEnv.setupEndpointRef(
-      RpcAddress(driverHost, driverPort),
-      YarnSchedulerBackend.ENDPOINT_NAME)
+    var driverRef : RpcEndpointRef = null
+    if (sparkConf.get(YARN_UNMANAGED_AM)) {
 
 Review comment:
   I'm not a big fan of this change. It feels like you should have a separate method here, called `runUnmanaged`, that is called instead of `run()` and takes an `RpcEnv`.
   
   That way you don't need to keep `clientRpcEnv` at all, since it would be local to that method and nothing else here needs it. In fact, even `rpcEnv` could go away and become a parameter to `createAllocator`...
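   
   For illustration only, here is a rough sketch of the shape that refactoring could take. The body below is not code from this PR: the `runUnmanaged` signature, the extended `createAllocator(driverRef, sparkConf, clientRpcEnv)` call, and the reuse of `args.userArgs(0)` for the driver address are all assumptions.
   
   ```scala
   // Hypothetical sketch: a dedicated entry point for the unmanaged, client-mode AM.
   // The client's RpcEnv is passed in instead of being stored on the class, so no
   // `clientRpcEnv` field is needed.
   private def runUnmanaged(clientRpcEnv: RpcEnv): Unit = {
     val hostname = Utils.localHostName
   
     // The client-mode AM doesn't listen for incoming connections, so report an invalid port.
     registerAM(hostname, -1, sparkConf, sparkConf.getOption("spark.driver.appUIAddress"))
   
     // Reuse the existing client-mode driver lookup: the driver should already be up,
     // so connect with no waiting or retrying.
     val (driverHost, driverPort) = Utils.parseHostPort(args.userArgs(0))
     val driverRef = clientRpcEnv.setupEndpointRef(
       RpcAddress(driverHost, driverPort),
       YarnSchedulerBackend.ENDPOINT_NAME)
   
     // If createAllocator also accepted the RpcEnv (assumed signature), the rpcEnv
     // field could be dropped from the class entirely.
     createAllocator(driverRef, sparkConf, clientRpcEnv)
   }
   ```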
