Hi Matt,

Thanks for your answer. We tried connecting to Hive via the ZooKeeper namespace 
in DBeaver with the JDBC URL 
jdbc:hive2://master01:2181,master02:2181,master03:2181/;serviceDiscoveryMode=zookeeper;zooKeeperNamespace=hiveserver2
 and got a similar error:

"Could not open client transport for any of the Server URI's in ZooKeeper: Invalid status 21"
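
(Just as a sketch, the same discovery URL with the SSL parameters spelled out 
explicitly would be the following; we are not sure whether they are even picked 
up in discovery mode, and the truststore path is the one from our Beeline tests 
further down the thread:)

jdbc:hive2://master01:2181,master02:2181,master03:2181/;serviceDiscoveryMode=zookeeper;zooKeeperNamespace=hiveserver2;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit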

When looking into the namespace via the CLI, we get the correct address of our 
Hive instance returned:

ls /hiveserver2
[serverUri=master01:10000;version=3.1.0.3.0.1.0-187;sequence=0000000123]
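
(For reference, the listing above is from the ZooKeeper CLI, invoked roughly 
like this on an HDP node; the client path is an assumption:)

/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server master01:2181
ls /hiveserver2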

The funny thing is that most of the clients that use the Thriftserver 
connection from outside the cluster, e.g. in Excel, are able to connect to it 
without a problem. We compared the truststores those clients use with the ones 
inside the cluster and they are the same.
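
(One way to do that comparison is to list the certificate fingerprints with 
keytool; the path and password below are just the ones from the Beeline test 
further down the thread:)

keytool -list -keystore /etc/ssl/jks/certs/truststore.jks -storepass changeit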

Best regards, Jan

From: Matt Andruff <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Wednesday, December 18, 2019 at 3:49 AM
To: "[email protected]" <[email protected]>
Subject: Re: Problem with Spark2 Thriftserver Alert in Ambari

So if you're not able to connect with Beeline with that truststore info, you 
have a typo. Either a typo in your connection string (unlikely) or the 
truststore isn't valid (a different type of typo).

Can you rule either of them out as being a typo?

Does using the ZooKeeper namespace connection string for Hive work? You know, 
the one that's in Ambari, on the Hive summary page?

Hive does actually write the connection string it wants you to use to the 
ZooKeeper namespace. Can you connect to ZooKeeper and check what string it 
wants you to use? (Send me a reply and I'll dig up how to do this if you don't 
know how. If you have more than one Hive instance there will be more than one 
entry... I'm not going to respond until tomorrow, just so you know.)



On Tue., Dec. 17, 2019, 12:00 Jan Hentschel, 
<[email protected]<mailto:[email protected]>> wrote:
We already tried the SSL configuration in the Beeline connection string with a 
similar result. Hive is in binary mode with TLS. Below is the output when 
trying to use the truststore parameters in the connection string (resulting in 
the same output in the Thriftserver logs):

spark@master01:/$ /usr/hdp/current/spark2-client/bin/beeline -u 
'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
 ' -e 'show databases'
Connecting to 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:20 INFO Utils: Supplied authorities: master01:10016
19/11/01 10:35:20 INFO Utils: Resolved authority: master01:10016
19/11/01 10:35:20 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
19/11/01 10:35:20 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit:
 Invalid status 21 (state=08S01,code=0)
19/11/01 10:35:21 INFO Utils: Supplied authorities: master01:10016
19/11/01 10:35:21 INFO Utils: Resolved authority: master01:10016
19/11/01 10:35:21 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Transport Used for JDBC connection: 
binary
No current connection
19/11/01 10:35:21 INFO Utils: Supplied authorities: master01:10016
19/11/01 10:35:21 INFO Utils: Resolved authority: master01:10016
19/11/01 10:35:21 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit
19/11/01 10:35:21 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary;ssl=true;sslTrustStore=/etc/ssl/jks/certs/truststore.jks;trustStorePassword=changeit:
 Invalid status 21 (state=08S01,code=0)

From: Matt Andruff <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Friday, December 13, 2019 at 8:17 PM
To: "[email protected]" <[email protected]>
Subject: Re: Problem with Spark2 Thriftserver Alert in Ambari

It's clear from the Ambari alert that the Thrift server connection is trying to 
use a binary connection (and is missing the truststore). Are you using Hive in 
binary mode with SSL, or HTTPS?

On Fri., Dec. 13, 2019, 14:13 Matt Andruff, 
<[email protected]<mailto:[email protected]>> wrote:
What happens when you try to connect via beeline with:

jdbc:hive2://<host>:<port>/<db>;ssl=true;sslTrustStore=<trust_store_path>;trustStorePassword=<trust_store_password>

?

Or are you just using the zookeeper namespace to connect?

On Thu., Dec. 12, 2019, 14:03 Jan Hentschel, 
<[email protected]<mailto:[email protected]>> wrote:
Hello everybody,

We're having problems with the Ambari alert for the Spark Thriftserver, which 
complains about the Beeline connection to the Thriftserver (see the alert 
below):

Connection failed on host master01:10016 (Traceback (most recent call last):
  File 
"/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/alerts/alert_spark2_thrift_port.py",
 line 147, in execute
    Execute(cmd, user=sparkuser, path=[beeline_cmd], 
timeout=CHECK_COMMAND_TIMEOUT_DEFAULT)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, 
in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", 
line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", 
line 124, in run_action
    provider_action()
  File 
"/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 
263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, 
in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, 
in checked_call
    tries=tries, try_sleep=try_sleep, 
timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, 
in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, 
in _call
    raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! /usr/hdp/current/spark2-client/bin/beeline -u 
'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary'
  -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL' ' 
returned 1. Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 java.net.ConnectException: Connection refused (Connection refused) 
(state=08S01,code=0)
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 java.net.ConnectException: Connection refused (Connection refused) 
(state=08S01,code=0)
)

Running it manually as a user, the same problem comes up:

dr_who@master01:/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/alerts$
 ! /usr/hdp/current/spark2-client/bin/beeline -u 
'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary'
  -e ''
Connecting to 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:26 INFO Utils: Supplied authorities: master01:10016
19/09/30 10:40:26 INFO Utils: Resolved authority: master01:10016
19/09/30 10:40:26 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
19/09/30 10:40:26 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 java.net.SocketException: Connection reset (state=08S01,code=0)
Beeline version 1.21.2.3.0.1.0-187 by Apache Hive
19/09/30 10:40:27 INFO Utils: Supplied authorities: master01:10016
19/09/30 10:40:27 INFO Utils: Resolved authority: master01:10016
19/09/30 10:40:27 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/09/30 10:40:27 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 java.net.SocketException: Broken pipe (Write failed) (state=08S01,code=0)

When trying to run the same with the Spark user from the command line, a 
similar issue arises:

spark@master01:~$ /usr/hdp/current/spark2-client/bin/beeline -u 
'jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary'
  -e 'show databases'
Connecting to 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 Invalid status 21 (state=08S01,code=0)
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: 
binary
No current connection
19/11/13 14:42:27 INFO Utils: Supplied authorities: master01:10016
19/11/13 14:42:27 INFO Utils: Resolved authority: master01:10016
19/11/13 14:42:27 INFO HiveConnection: Will try to open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Could not open client transport with 
JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary
19/11/13 14:42:27 INFO HiveConnection: Transport Used for JDBC connection: 
binary
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://master01:10016/default;principal=spark/[email protected];transportMode=binary:
 Invalid status 21 (state=08S01,code=0)

Looking at the Thriftserver logs reveals the following:

19/11/13 14:43:48 ERROR TThreadPoolServer: Error occurred during processing of 
message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: 
javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at 
org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1710)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
        at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: 
javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
        at 
org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at 
org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at 
org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at 
org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at 
org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
Caused by: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext 
connection?
        at 
sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
        at sun.security.ssl.InputRecord.read(InputRecord.java:527)
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:973)
        at 
sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
        at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:928)
        at sun.security.ssl.AppInputStream.read(AppInputStream.java:105)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 16 more
19/11/13 14:43:48 ERROR TThreadPoolServer: Error occurred during processing of 
message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: 
javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at 
org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)
       at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1710)
        at 
org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:736)
        at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: 
javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
        at 
org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at 
org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at 
org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at 
org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at 
org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
Caused by: javax.net.ssl.SSLException: Unrecognized SSL message, plaintext 
connection?
        at 
sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
        at sun.security.ssl.InputRecord.read(InputRecord.java:527)
        at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:973)
        at 
sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
        at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:928)
        at sun.security.ssl.AppInputStream.read(AppInputStream.java:105)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 16 more

TLS is configured on the Hive side, but it still seems that Beeline or Spark 
tries to establish a plaintext connection instead of a TLS connection. Does 
anyone know how to resolve this problem? Apart from the points mentioned above, 
the Thriftserver works fine, e.g. when connecting to it via ODBC.
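
(If it helps, one quick way to double-check that the port really answers with 
TLS is openssl; host and port below are the ones from the alert:)

openssl s_client -connect master01:10016 </dev/null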
