[ https://issues.apache.org/jira/browse/LIVY-436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Maziyar PANAHI updated LIVY-436:
--------------------------------
    Comment: was deleted

(was: Hi Jeff, The logs from the driver, or all the logs generated by each executor?)

> Client RPC channel closed unexpectedly
> --------------------------------------
>
>                 Key: LIVY-436
>                 URL: https://issues.apache.org/jira/browse/LIVY-436
>             Project: Livy
>          Issue Type: Bug
>    Affects Versions: 0.4.0
>         Environment: Cloudera CDH 5.14
>                      Spark version 2.2.0.cloudera2
>                      Scala version 2.11.8
>                      Apache Livy 0.4.0
>                      Java version "1.8.0_161"
>                      Apache Zeppelin 0.7.3
>                      Ubuntu 16.04 LTS
>            Reporter: Maziyar PANAHI
>            Priority: Major
>
> Hi,
> We are getting this error from time to time, and we then have to re-create the session and restart our code from the beginning. It is hard to find out why the error happens, since it is not always the same code or the same Spark commands: the session is suddenly closed, without any WARN or ERROR, at different times and with different code.
> I checked the YARN and Spark logs, and there is nothing indicating that the session was closed because of an error or warning beforehand.
> The Livy server and Apache Zeppelin run on the same machine, so the IP address is the same for both.
> I am wondering what could cause this error in a multi-user environment (number of connections, open files, heap size, etc.). Is the connection unstable, or are there configs that could help prevent this sudden death of sessions?
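(Editorial aside, not part of the original report: since the failure surfaces on the client side as `Session is in state dead`, a Zeppelin-independent client can at least detect it early by polling Livy's REST API, where `GET /sessions/{id}` returns a JSON object with a `state` field. A minimal sketch; the set of terminal states is taken from Livy's documented session lifecycle, and the helper name is illustrative, not a Livy API.)

```python
# Hypothetical diagnostic helper: classify a Livy session as gone or still
# usable from the `state` field returned by GET /sessions/{id}.
import json

# Session states after which no further statements can be executed
# (per the Livy REST API session lifecycle; "dead" is what the logs show).
TERMINAL_STATES = {"shutting_down", "error", "dead", "killed", "success"}

def session_is_gone(session_json: str) -> bool:
    """Return True if the session JSON describes a terminal session."""
    return json.loads(session_json).get("state") in TERMINAL_STATES

# Example response shape from GET /sessions/{id}:
resp = '{"id": 42, "state": "dead", "kind": "spark"}'
print(session_is_gone(resp))  # True -> session must be re-created (POST /sessions)
```

A caller would fetch the JSON with any HTTP client against the Livy server (port 8998 by default) and re-create the session when this returns True.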
>
> Livy conf on Zeppelin:
>
> {noformat}
> livy.spark.driver.cores 4
> livy.spark.driver.memory 4g
> livy.spark.dynamicAllocation.enabled true
> livy.spark.executor.cores 5
> livy.spark.executor.memory 5g
> zeppelin.interpreter.output.limit 102400
> zeppelin.livy.concurrentSQL false
> zeppelin.livy.displayAppInfo false
> zeppelin.livy.pull_status.interval.millis 1000
> zeppelin.livy.session.create_timeout 120
> zeppelin.livy.spark.sql.maxResult 10000
> zeppelin.spark.printREPLOutput true
> {noformat}
>
> {code:java}
> 18/01/26 10:08:59 WARN RSCClient: Client RPC channel closed unexpectedly.
> 18/01/26 10:08:59 WARN Rpc: [id: 0x94c07360, /Livy-IP-Address:57846 :> executor-hostname/executor-IP-Address:10000] CLOSE()
> 18/01/26 10:08:59 WARN RSCClient: Error stopping RPC.
> io.netty.util.concurrent.BlockingOperationException: DefaultChannelPromise@1062b60(uncancellable)
> 	at io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:390)
> 	at io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
> 	at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:251)
> 	at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
> 	at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
> 	at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:218)
> 	at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
> 	at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
> 	at org.apache.livy.rsc.rpc.Rpc.close(Rpc.java:307)
> 	at org.apache.livy.rsc.RSCClient.stop(RSCClient.java:232)
> 	at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:129)
> 	at org.apache.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:123)
> 	at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
> 	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
> 	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
> 	at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:406)
> 	at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
> 	at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:956)
> 	at io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:608)
> 	at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:586)
> 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
> 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
> 	at java.lang.Thread.run(Thread.java:748)
> 18/01/26 10:08:59 WARN Rpc: [id: 0x94c07360, /Livy-IP-Address:57846 :> executor-hostname/executor-IP-Address:10000] INACTIVE
> 18/01/26 10:08:59 WARN Rpc: [id: 0x94c07360, /Livy-IP-Address:57846 :> executor-hostname/executor-IP-Address:10000] UNREGISTERED
> 18/01/26 10:09:09 ERROR SessionServlet$: internal error
> java.lang.IllegalStateException: Session is in state dead
> 	at org.apache.livy.server.interactive.InteractiveSession.ensureRunning(InteractiveSession.scala:568)
> 	at org.apache.livy.server.interactive.InteractiveSession.executeStatement(InteractiveSession.scala:490)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11$$anonfun$apply$6.apply(InteractiveSessionServlet.scala:124)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11$$anonfun$apply$6.apply(InteractiveSessionServlet.scala:123)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$$anonfun$withModifyAccessSession$1.apply(SessionHeartbeat.scala:76)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$$anonfun$withModifyAccessSession$1.apply(SessionHeartbeat.scala:74)
> 	at org.apache.livy.server.SessionServlet.doWithSession(SessionServlet.scala:221)
> 	at org.apache.livy.server.SessionServlet.withModifyAccessSession(SessionServlet.scala:212)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.org$apache$livy$server$interactive$SessionHeartbeatNotifier$$super$withModifyAccessSession(InteractiveSessionServlet.scala:40)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$class.withModifyAccessSession(SessionHeartbeat.scala:74)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.withModifyAccessSession(InteractiveSessionServlet.scala:40)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11.apply(InteractiveSessionServlet.scala:123)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11.apply(InteractiveSessionServlet.scala:122)
> 	at org.apache.livy.server.JsonServlet.org$apache$livy$server$JsonServlet$$doAction(JsonServlet.scala:113)
> 	at org.apache.livy.server.JsonServlet$$anonfun$jpost$1.apply(JsonServlet.scala:75)
> 	at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$liftAction(ScalatraBase.scala:270)
> 	at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
> 	at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
> 	at org.scalatra.ApiFormats$class.withRouteMultiParams(ApiFormats.scala:178)
> 	at org.apache.livy.server.JsonServlet.withRouteMultiParams(JsonServlet.scala:39)
> 	at org.scalatra.ScalatraBase$class.invoke(ScalatraBase.scala:264)
> 	at org.scalatra.ScalatraServlet.invoke(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:240)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:238)
> 	at scala.Option.flatMap(Option.scala:170)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:238)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:237)
> 	at scala.collection.immutable.Stream.flatMap(Stream.scala:446)
> 	at org.scalatra.ScalatraBase$class.runRoutes(ScalatraBase.scala:237)
> 	at org.scalatra.ScalatraServlet.runRoutes(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$class.runActions$1(ScalatraBase.scala:163)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply$mcV$sp(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$cradleHalt(ScalatraBase.scala:193)
> 	at org.scalatra.ScalatraBase$class.executeRoutes(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraServlet.executeRoutes(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply$mcV$sp(ScalatraBase.scala:113)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
> 	at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
> 	at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
> 	at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.GZipSupport$class.handle(GZipSupport.scala:18)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.org$scalatra$servlet$FileUploadSupport$$super$handle(InteractiveSessionServlet.scala:40)
> 	at org.scalatra.servlet.FileUploadSupport$class.handle(FileUploadSupport.scala:93)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.handle(InteractiveSessionServlet.scala:40)
> 	at org.scalatra.ScalatraServlet.service(ScalatraServlet.scala:54)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
> 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
> 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
> 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
> 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
> 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
> 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
> 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
> 	at org.eclipse.jetty.server.Server.handle(Server.java:499)
> 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
> 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
> 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
> 	at java.lang.Thread.run(Thread.java:748)
> 18/01/26 10:09:11 ERROR SessionServlet$: internal error
> java.lang.IllegalStateException: Session is in state dead
> 	at org.apache.livy.server.interactive.InteractiveSession.ensureRunning(InteractiveSession.scala:568)
> 	at org.apache.livy.server.interactive.InteractiveSession.executeStatement(InteractiveSession.scala:490)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11$$anonfun$apply$6.apply(InteractiveSessionServlet.scala:124)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11$$anonfun$apply$6.apply(InteractiveSessionServlet.scala:123)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$$anonfun$withModifyAccessSession$1.apply(SessionHeartbeat.scala:76)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$$anonfun$withModifyAccessSession$1.apply(SessionHeartbeat.scala:74)
> 	at org.apache.livy.server.SessionServlet.doWithSession(SessionServlet.scala:221)
> 	at org.apache.livy.server.SessionServlet.withModifyAccessSession(SessionServlet.scala:212)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.org$apache$livy$server$interactive$SessionHeartbeatNotifier$$super$withModifyAccessSession(InteractiveSessionServlet.scala:40)
> 	at org.apache.livy.server.interactive.SessionHeartbeatNotifier$class.withModifyAccessSession(SessionHeartbeat.scala:74)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.withModifyAccessSession(InteractiveSessionServlet.scala:40)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11.apply(InteractiveSessionServlet.scala:123)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet$$anonfun$11.apply(InteractiveSessionServlet.scala:122)
> 	at org.apache.livy.server.JsonServlet.org$apache$livy$server$JsonServlet$$doAction(JsonServlet.scala:113)
> 	at org.apache.livy.server.JsonServlet$$anonfun$jpost$1.apply(JsonServlet.scala:75)
> 	at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$liftAction(ScalatraBase.scala:270)
> 	at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
> 	at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
> 	at org.scalatra.ApiFormats$class.withRouteMultiParams(ApiFormats.scala:178)
> 	at org.apache.livy.server.JsonServlet.withRouteMultiParams(JsonServlet.scala:39)
> 	at org.scalatra.ScalatraBase$class.invoke(ScalatraBase.scala:264)
> 	at org.scalatra.ScalatraServlet.invoke(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:240)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:238)
> 	at scala.Option.flatMap(Option.scala:170)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:238)
> 	at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:237)
> 	at scala.collection.immutable.Stream.flatMap(Stream.scala:446)
> 	at org.scalatra.ScalatraBase$class.runRoutes(ScalatraBase.scala:237)
> 	at org.scalatra.ScalatraServlet.runRoutes(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$class.runActions$1(ScalatraBase.scala:163)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply$mcV$sp(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$cradleHalt(ScalatraBase.scala:193)
> 	at org.scalatra.ScalatraBase$class.executeRoutes(ScalatraBase.scala:175)
> 	at org.scalatra.ScalatraServlet.executeRoutes(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply$mcV$sp(ScalatraBase.scala:113)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
> 	at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
> 	at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
> 	at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
> 	at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.ScalatraBase$class.handle(ScalatraBase.scala:111)
> 	at org.scalatra.ScalatraServlet.org$scalatra$servlet$ServletBase$$super$handle(ScalatraServlet.scala:49)
> 	at org.scalatra.servlet.ServletBase$class.handle(ServletBase.scala:43)
> 	at org.apache.livy.server.SessionServlet.org$scalatra$MethodOverride$$super$handle(SessionServlet.scala:39)
> 	at org.scalatra.MethodOverride$class.handle(MethodOverride.scala:28)
> 	at org.apache.livy.server.SessionServlet.org$scalatra$GZipSupport$$super$handle(SessionServlet.scala:39)
> 	at org.scalatra.GZipSupport$$anonfun$handle$1.apply$mcV$sp(GZipSupport.scala:36)
> 	at org.scalatra.GZipSupport$$anonfun$handle$1.apply(GZipSupport.scala:19)
> 	at org.scalatra.GZipSupport$$anonfun$handle$1.apply(GZipSupport.scala:19)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
> 	at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> 	at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
> 	at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
> 	at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
> 	at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
> 	at org.scalatra.GZipSupport$class.handle(GZipSupport.scala:18)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.org$scalatra$servlet$FileUploadSupport$$super$handle(InteractiveSessionServlet.scala:40)
> 	at org.scalatra.servlet.FileUploadSupport$class.handle(FileUploadSupport.scala:93)
> 	at org.apache.livy.server.interactive.InteractiveSessionServlet.handle(InteractiveSessionServlet.scala:40)
> 	at org.scalatra.ScalatraServlet.service(ScalatraServlet.scala:54)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
> 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
> 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
> 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
> 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
> 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
> 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
> 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
> 	at org.eclipse.jetty.server.Server.handle(Server.java:499)
> 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
> 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
> 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
> 	at java.lang.Thread.run(Thread.java:748)
> {code}
>
> UPDATE:
> New logs from the driver of the application whose session was closed:
>
> {code:java}
> 18/01/31 11:11:39 INFO rpc.Rpc: [id: 0xe08e0a06, L:/IP:10000 - R:/IP:49670] FLUSH
> 18/01/31 11:11:41 ERROR yarn.ApplicationMaster: RECEIVED SIGNAL TERM
> 18/01/31 11:11:41 INFO spark.SparkContext: Invoking stop() from shutdown hook
> 18/01/31 11:11:41 INFO server.AbstractConnector: Stopped Spark@b34ec2e{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
> 18/01/31 11:11:41 INFO ui.SparkUI: Stopped Spark web UI at http://IP:45393
> 18/01/31 11:11:41 INFO yarn.YarnAllocator: Driver requested a total number of 0 executor(s).
> 18/01/31 11:11:41 INFO cluster.YarnClusterSchedulerBackend: Shutting down all executors
> 18/01/31 11:11:41 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
> 18/01/31 11:11:41 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
> 18/01/31 11:11:41 ERROR yarn.ApplicationMaster: Exception from Reporter thread.
> org.apache.hadoop.yarn.exceptions.ApplicationAttemptNotFoundException: Application attempt appattempt_1517336845756_0005_000001 doesn't exist in ApplicationMasterService cache.
> 	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:442)
> 	at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
> 	at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53)
> 	at org.apache.hadoop.yarn.ipc.RPCUtil.unwrapAndThrowException(RPCUtil.java:101)
> 	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.allocate(ApplicationMasterProtocolPBClientImpl.java:79)
> 	at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
> 	at com.sun.proxy.$Proxy20.allocate(Unknown Source)
> 	at org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl.allocate(AMRMClientImpl.java:277)
> 	at org.apache.spark.deploy.yarn.YarnAllocator.allocateResources(YarnAllocator.scala:265)
> 	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:498)
> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.yarn.exceptions.ApplicationAttemptNotFoundException): Application attempt appattempt_1517336845756_0005_000001 doesn't exist in ApplicationMasterService cache.
> 	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:442)
> 	at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
> 	at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
> 	at com.sun.proxy.$Proxy19.allocate(Unknown Source)
> 	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.allocate(ApplicationMasterProtocolPBClientImpl.java:77)
> 	... 9 more
> 18/01/31 11:11:41 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 18/01/31 11:11:41 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 12, (reason: Application attempt appattempt_1517336845756_0005_000001 doesn't exist in ApplicationMasterService cache.
> 	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:442)
> 	at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
> 	at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
> )
> 18/01/31 11:11:41 INFO memory.MemoryStore: MemoryStore cleared
> 18/01/31 11:11:41 INFO storage.BlockManager: BlockManager stopped
> 18/01/31 11:11:41 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
> 18/01/31 11:11:41 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 18/01/31 11:11:41 INFO spark.SparkContext: Successfully stopped SparkContext
> 18/01/31 11:11:41 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Application attempt appattempt_1517336845756_0005_000001 doesn't exist in ApplicationMasterService cache.
> 	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.allocate(ApplicationMasterService.java:442)
> 	at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationMasterProtocolPBServiceImpl.allocate(ApplicationMasterProtocolPBServiceImpl.java:60)
> 	at org.apache.hadoop.yarn.proto.ApplicationMasterProtocol$ApplicationMasterProtocolService$2.callBlockingMethod(ApplicationMasterProtocol.java:99)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
> )
> 18/01/31 11:11:41 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://hadoop-master-1:8020/user/maziyar/.sparkStaging/application_1517336845756_0005
> 18/01/31 11:11:41 INFO util.ShutdownHookManager: Shutdown hook called
> 18/01/31 11:11:41 INFO util.ShutdownHookManager: Deleting directory /yarn/nm/usercache/maziyar/appcache/application_1517336845756_0005/spark-76631a15-316e-4101-8a40-9972e729b686
> stdout0
> {code}
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)