[ 
https://issues.apache.org/jira/browse/LIVY-405?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tim Cederquist updated LIVY-405:
--------------------------------
    Comment: was deleted

(was: Fixed the Livy user-impersonation problem, but I'm stuck on the next one, 
still potentially related to this client IP issue. Livy starts up but then gets 
stuck repeating the message below and never gets past it. Any ideas?

yarn app logs:
17/09/24 21:37:49 INFO utils.LineBufferedStream: stdout: 2017-09-24 
21:37:49,114 INFO  [pool-2-thread-1] spark.SparkContext 
(Logging.scala:logInfo(58)) - Added JAR 
file:/opt/livy/repl_2.10-jars/livy-core_2.10-0.4.0-incubating.jar at 
spark://10.208.2.186:33908/jars/livy-core_2.10-0.4.0-incubating.jar with 
timestamp 1506289069114
17/09/24 21:37:49 INFO utils.LineBufferedStream: stdout: 2017-09-24 
21:37:49,114 INFO  [pool-2-thread-1] spark.SparkContext 
(Logging.scala:logInfo(58)) - Added JAR 
file:/opt/livy/repl_2.10-jars/livy-repl_2.10-0.4.0-incubating.jar at 
spark://10.208.2.186:33908/jars/livy-repl_2.10-0.4.0-incubating.jar with 
timestamp 1506289069114
17/09/24 21:37:49 INFO utils.LineBufferedStream: stdout: 2017-09-24 
21:37:49,288 INFO  [pool-2-thread-1] client.RMProxy 
(RMProxy.java:createRMProxy(123)) - Connecting to ResourceManager at 
/0.0.0.0:8032
17/09/24 21:37:50 INFO utils.LineBufferedStream: stdout: 2017-09-24 
21:37:50,480 INFO  [pool-2-thread-1] ipc.Client 
(Client.java:handleConnectionFailure(898)) - Retrying connect to server: 
0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is 
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
17/09/24 21:37:51 INFO utils.LineBufferedStream: stdout: 2017-09-24 
21:37:51,483 INFO  [pool-2-thread-1] ipc.Client 
(Client.java:handleConnectionFailure(898)) - Retrying connect to server: 
0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is 
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)

This series repeats until the end of the log. Calls from the client Python 
program return a 'starting' status, with this output:
...
         '2017-09-24 21:43:24,752 INFO  [pool-2-thread-1] ipc.Client '
         '(Client.java:handleConnectionFailure(898)) - Retrying connect to '
         'server: 0.0.0.0/0.0.0.0:8032. Already tried 9 time(s); retry policy '
         'is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 '
         'MILLISECONDS)',
         '\nstderr: ',
         '\nYARN Diagnostics: '],
 'owner': 'tcederquist',
 'proxyUser': 'tcederquist',
 'state': 'starting'}
)
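The retry messages above show the Spark driver trying to reach the YARN 
ResourceManager at 0.0.0.0:8032, which is the fallback address used when 
yarn.resourcemanager.hostname/address is not visible to the process launching 
the driver. A hedged sketch of the yarn-site.xml entries that would point the 
driver at a real ResourceManager ("rm-host" is a placeholder hostname, not a 
value from this issue):

```xml
<!-- yarn-site.xml, visible (via HADOOP_CONF_DIR/YARN_CONF_DIR) to every   -->
<!-- node that launches Spark/Livy drivers.                                -->
<!-- "rm-host" is a placeholder; substitute your ResourceManager hostname. -->
<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>rm-host</value>
  </property>
  <property>
    <!-- Explicit RPC address; defaults to ${yarn.resourcemanager.hostname}:8032 -->
    <name>yarn.resourcemanager.address</name>
    <value>rm-host:8032</value>
  </property>
</configuration>
```

If this file is missing from the driver's configuration directory, the YARN 
client falls back to 0.0.0.0:8032 and produces exactly the retry loop shown 
above.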

> ERROR RSCClient: Failed to connect to context
> ---------------------------------------------
>
>                 Key: LIVY-405
>                 URL: https://issues.apache.org/jira/browse/LIVY-405
>             Project: Livy
>          Issue Type: Question
>          Components: RSC
>    Affects Versions: 0.4
>         Environment: Ubuntu 16.04LTS + Spark 2.1 + Hadoop 2.8 + Zeppelin 0.8
> I'm using the clustered Spark based on Hadoop 2.8.
>            Reporter: Inhwan Jung
>            Priority: Minor
>
> Hello,
> I've been trying to resolve this, but I couldn't.
> Please help me.
> spark@alpha001:/usr/local/livy$ pg test.py
> import json, pprint, requests, textwrap
> host = 'http://192.168.0.69:8998'
> data = {'kind': 'spark'}
> headers = {'Content-Type': 'application/json'}
> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
> pprint.pprint(r.json())
> #session_url = 'http://localhost:8998/sessions/0'
> #requests.delete(session_url, headers=headers)
> spark@alpha001:/usr/local/livy$ python test.py
> 17/09/20 07:15:00 WARN InteractiveSession$: Enable HiveContext but no 
> hive-site.xml found under classpath or user request.
> 17/09/20 07:15:00 INFO InteractiveSession$: Creating Interactive session 0: 
> [owner: null, request: [kind: spark, proxyUser: None, 
> heartbeatTimeoutInSecond: 0]]
> 17/09/20 07:15:01 INFO RpcServer: Connected to the port 10000
> 17/09/20 07:15:01 WARN RSCConf: Your hostname, alpha001, resolves to a 
> loopback address, but we couldn't find any external IP address!
> 17/09/20 07:15:01 WARN RSCConf: Set livy.rsc.rpc.server.address if you need 
> to bind to another address.
> 17/09/20 07:15:01 INFO InteractiveSessionManager: Registering new session 0
> {'appId': None,
>  'appInfo': {'driverLogUrl': None, 'sparkUiUrl': None},
>  'id': 0,
>  'kind': 'spark',
>  'log': ['stdout: ', '\nstderr: '],
>  'owner': None,
>  'proxyUser': None,
>  'state': 'starting'}
> spark@alpha001:/usr/local/livy$ 17/09/20 07:15:01 INFO LineBufferedStream: 
> stdout: Running Spark using the REST application submission protocol.
> 17/09/20 07:15:01 INFO LineBufferedStream: stdout: 17/09/20 07:15:01 WARN 
> SparkConf: The configuration key 'spark.yarn.jar' has been deprecated as of 
> Spark 2.0 and may be removed in the future. Please use the new key 
> 'spark.yarn.jars' instead.
> 17/09/20 07:15:01 INFO LineBufferedStream: stdout: 17/09/20 07:15:01 INFO 
> RestSubmissionClient: Submitting a request to launch an application in 
> spark://192.168.0.69:6066.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO 
> RestSubmissionClient: Submission successfully created as 
> driver-20170920071502-0000. Polling submission state...
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO 
> RestSubmissionClient: Submitting a request for the status of submission 
> driver-20170920071502-0000 in spark://192.168.0.69:6066.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO 
> RestSubmissionClient: State of driver driver-20170920071502-0000 is now 
> RUNNING.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO 
> RestSubmissionClient: Driver is running on worker 
> worker-20170920071400-192.168.0.47-43404 at 192.168.0.47:43404.
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: 17/09/20 07:15:02 INFO 
> RestSubmissionClient: Server responded with CreateSubmissionResponse:
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: {
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "action" : 
> "CreateSubmissionResponse",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "message" : "Driver 
> successfully submitted as driver-20170920071502-0000",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "serverSparkVersion" : 
> "2.1.1",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "submissionId" : 
> "driver-20170920071502-0000",
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout:   "success" : true
> 17/09/20 07:15:02 INFO LineBufferedStream: stdout: }
> 17/09/20 07:16:31 ERROR RSCClient: Failed to connect to context.
> java.util.concurrent.TimeoutException: Timed out waiting for context to start.
>         at 
> org.apache.livy.rsc.ContextLauncher.connectTimeout(ContextLauncher.java:134)
>         at 
> org.apache.livy.rsc.ContextLauncher.access$300(ContextLauncher.java:63)
>         at org.apache.livy.rsc.ContextLauncher$2.run(ContextLauncher.java:122)
>         at 
> io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
>         at 
> io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
>         at 
> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>         at 
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>         at java.lang.Thread.run(Thread.java:748)
> 17/09/20 07:16:31 INFO RSCClient: Failing pending job 
> 38e22e17-444f-4712-a587-a77c89b214c3 due to shutdown.
> 17/09/20 07:16:31 INFO InteractiveSession: Failed to ping RSC driver for 
> session 0. Killing application.
> 17/09/20 07:16:31 INFO InteractiveSession: Stopping InteractiveSession 0...
> 17/09/20 07:16:31 INFO InteractiveSession: Stopped InteractiveSession 0.
> 17/09/20 07:16:31 WARN InteractiveSession: (Fail to get rsc 
> uri,java.util.concurrent.ExecutionException: 
> java.util.concurrent.TimeoutException: Timed out waiting for context to 
> start.)
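The test.py in the quoted report only issues the initial POST and prints the 
immediate 'starting' payload. A minimal sketch of polling GET /sessions/{id} 
until the session leaves 'starting' (the host, session id, timeout, and 
10-second interval are assumptions carried over from the script, not values 
confirmed by the issue; only the standard library is used):

```python
import json
import time
import urllib.request

HOST = 'http://192.168.0.69:8998'   # same host as in the quoted test.py


def session_state(payload):
    """Extract the 'state' field from a Livy session JSON payload."""
    return payload.get('state')


def wait_for_session(session_id, timeout=120):
    """Poll GET /sessions/{id} until the session is no longer 'starting'."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        with urllib.request.urlopen('%s/sessions/%d' % (HOST, session_id)) as r:
            payload = json.load(r)
        if session_state(payload) != 'starting':
            return payload
        time.sleep(10)
    raise TimeoutError('session %d still starting after %ds'
                       % (session_id, timeout))


# The payload logged in this issue still reports 'starting':
sample = {'id': 0, 'kind': 'spark', 'state': 'starting'}
print(session_state(sample))   # starting
```

Note that a poll loop cannot fix the failure reported here: the logs point at 
the driver's environment (the 0.0.0.0:8032 ResourceManager retries, and the 
RSCConf warning suggesting livy.rsc.rpc.server.address when the hostname 
resolves to a loopback address), so the session would eventually move to an 
error state rather than 'idle'.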



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
