[ https://issues.apache.org/jira/browse/FLINK-29839?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17627525#comment-17627525 ]

yuzelin commented on FLINK-29839:
---------------------------------

Hi, I have replied on the mailing list. Please check it out.

> HiveServer2 endpoint doesn't support TGetInfoType value 'CLI_ODBC_KEYWORDS'
> ---------------------------------------------------------------------------
>
>                 Key: FLINK-29839
>                 URL: https://issues.apache.org/jira/browse/FLINK-29839
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive, Table SQL / Gateway
>    Affects Versions: 1.16.0
>         Environment: Flink version: 1.16.0
> Hive version: 3.1.2
>            Reporter: Qizhu Chan
>            Priority: Critical
>
>  I started the SQL Gateway with the HiveServer2 Endpoint and then submitted SQL with Hive Beeline (a bare JDBC sketch of that connection follows the trace below), but I got the following exception:
> {code:java}
> java.lang.UnsupportedOperationException: Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.
>     at org.apache.flink.table.endpoint.hive.HiveServer2Endpoint.GetInfo(HiveServer2Endpoint.java:371) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1537) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1522) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> 2022-11-01 13:55:33,885 ERROR org.apache.thrift.server.TThreadPoolServer [] - Thrift error occurred during processing of message.
> org.apache.thrift.protocol.TProtocolException: Required field 'infoValue' is unset! Struct:TGetInfoResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*java.lang.UnsupportedOperationException:Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.:9:8, org.apache.flink.table.endpoint.hive.HiveServer2Endpoint:GetInfo:HiveServer2Endpoint.java:371, org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo:getResult:TCLIService.java:1537, org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo:getResult:TCLIService.java:1522, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1128, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:628, java.lang.Thread:run:Thread.java:834], errorMessage:Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.), infoValue:null)
>     at org.apache.hive.service.rpc.thrift.TGetInfoResp.validate(TGetInfoResp.java:379) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result.validate(TCLIService.java:5228) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result$GetInfo_resultStandardScheme.write(TCLIService.java:5285) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result$GetInfo_resultStandardScheme.write(TCLIService.java:5254) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result.write(TCLIService.java:5205) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:53) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> 2022-11-01 13:55:33,886 WARN  org.apache.thrift.transport.TIOStreamTransport [] - Error closing output stream.
> java.net.SocketException: Socket closed
>     at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[?:?]
>     at java.net.SocketOutputStream.write(SocketOutputStream.java:150) ~[?:?]
>     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:81) ~[?:?]
>     at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:142) ~[?:?]
>     at java.io.FilterOutputStream.close(FilterOutputStream.java:182) ~[?:?]
>     at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.transport.TSocket.close(TSocket.java:235) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:303) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> {code}
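> For reference, Beeline submits SQL to the gateway through the Hive JDBC driver, so a bare-bones JDBC equivalent of my Beeline session looks roughly like the sketch below. The class name is just illustrative, the URL, port, and auth mode are assumptions from my setup, and the hive-jdbc driver must be on the classpath.
> {code:java}
> import java.sql.Connection;
> import java.sql.DriverManager;
> import java.sql.Statement;
> 
> public class GatewaySmokeTest {
>     public static void main(String[] args) throws Exception {
>         // Assumed endpoint address; 10000 is the HiveServer2 endpoint's default port.
>         String url = "jdbc:hive2://localhost:10000/default;auth=noSasl";
>         try (Connection conn = DriverManager.getConnection(url);
>                 Statement stmt = conn.createStatement()) {
>             // Plain statement submission against the endpoint.
>             stmt.execute("SHOW TABLES");
>             // Beeline additionally fetches the SQL keyword list for its completer;
>             // in drivers that back this call with GetInfo(CLI_ODBC_KEYWORDS), this
>             // is the request the endpoint rejects (assumption about the driver).
>             System.out.println(conn.getMetaData().getSQLKeywords());
>         }
>     }
> }
> {code}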
>  I found that the following code is where the exception is thrown, but I don’t know how to solve it:
> {code:java}
> public TGetInfoResp GetInfo(TGetInfoReq tGetInfoReq) throws TException {
>     TGetInfoResp resp = new TGetInfoResp();
>     try {
>         GatewayInfo info = service.getGatewayInfo();
>         TGetInfoValue tInfoValue;
>         switch (tGetInfoReq.getInfoType()) {
>             case CLI_SERVER_NAME:
>             case CLI_DBMS_NAME:
>                 tInfoValue = TGetInfoValue.stringValue(info.getProductName());
>                 break;
>             case CLI_DBMS_VER:
>                 tInfoValue = TGetInfoValue.stringValue(info.getVersion().toString());
>                 break;
>             default:
>                 throw new UnsupportedOperationException(
>                         String.format(
>                                 "Unrecognized TGetInfoType value: %s.",
>                                 tGetInfoReq.getInfoType()));
>         }
>         resp.setStatus(OK_STATUS);
>         resp.setInfoValue(tInfoValue);
>     } catch (Throwable t) {
>         LOG.error("Failed to GetInfo.", t);
>         resp.setStatus(toTStatus(t));
>     }
>     return resp;
> }
> {code}
> CLI_ODBC_KEYWORDS is a newer value of the TGetInfoType enumeration in the Hive dependency. It appears to have been added in a higher Hive version, yet both my Hive environment and the Hive connector referenced by Flink are 3.1.2, and the enumeration definition there does contain this value. Judging from the Flink 1.16.0 source code, however, the HiveServer2 endpoint only handles three TGetInfoType values: CLI_SERVER_NAME, CLI_DBMS_NAME, and CLI_DBMS_VER.
> Is this something Flink 1.16.0 simply does not support yet, or is something wrong with my environment or configuration? How can I fix it?
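> In case it helps the discussion, below is a minimal sketch of what an extended switch could look like. The CLI_ODBC_KEYWORDS branch and the empty keyword string are my own assumptions, not an actual Flink patch.
> {code:java}
> // Sketch only: one way the endpoint's switch could answer CLI_ODBC_KEYWORDS
> // instead of falling through to the default branch.
> switch (tGetInfoReq.getInfoType()) {
>     case CLI_SERVER_NAME:
>     case CLI_DBMS_NAME:
>         tInfoValue = TGetInfoValue.stringValue(info.getProductName());
>         break;
>     case CLI_DBMS_VER:
>         tInfoValue = TGetInfoValue.stringValue(info.getVersion().toString());
>         break;
>     case CLI_ODBC_KEYWORDS:
>         // Beeline seems to request this only for its keyword completer, so an
>         // empty list would keep the connection usable (assumption on my side).
>         tInfoValue = TGetInfoValue.stringValue("");
>         break;
>     default:
>         throw new UnsupportedOperationException(
>                 String.format(
>                         "Unrecognized TGetInfoType value: %s.",
>                         tGetInfoReq.getInfoType()));
> }
> {code}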



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
