[ https://issues.apache.org/jira/browse/DRILL-5932?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

tooptoop4 updated DRILL-5932:
-----------------------------
    Description: 
Basic Hive queries all throw errors.

{code:sql}

Copied
https://repo1.maven.org/maven2/org/apache/hive/hive-jdbc/2.3.0/hive-jdbc-2.3.0-standalone.jar
to /usr/lib/apache-drill-1.11.0/jars/3rdparty/hive-jdbc-2.3.0-standalone.jar.

Added this storage plugin (registered as hive2):

{
  "type": "jdbc",
  "driver": "org.apache.hive.jdbc.HiveDriver",
  "url": "jdbc:hive2://host:10000/default",
  "username": "hive",
  "password": "hive1234",
  "enabled": true
}

[ec2-user@host ~]$ cd /usr/lib/apache-drill-1.11.0
[ec2-user@host apache-drill-1.11.0]$ ./bin/drill-embedded
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was 
removed in 8.0
Nov 01, 2017 7:53:53 AM org.glassfish.jersey.server.ApplicationHandler 
initialize
INFO: Initiating Jersey application, version Jersey: 2.8 2014-04-29 01:25:26...
apache drill 1.11.0
"this isn't your grandfather's sql"
0: jdbc:drill:zk=local> SELECT count(*) FROM hive2.`contact`;
Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
SQL query.

sql SELECT COUNT(*) AS EXPR$0
FROM (SELECT 0 AS $f0
FROM.default.contact) AS t
plugin hive2
Fragment 0:0

[Error Id: 4b293e97-7547-49c5-91da-b9ee2f2184fc on 
ip-myip.mydomain.orghere.com:31010] (state=,code=0)




0: jdbc:drill:zk=local> ALTER SESSION SET `exec.errors.verbose` = true;
+-------+-------------------------------+
|  ok   |            summary            |
+-------+-------------------------------+
| true  | exec.errors.verbose updated.  |
+-------+-------------------------------+
1 row selected (0.351 seconds)

0: jdbc:drill:zk=local> select * from hive2.contact;
Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
SQL query.

sql SELECT *
FROM.default.contact
plugin hive2
Fragment 0:0

[Error Id: fe36b026-e8ff-4354-af6c-6073130680c9 on ip-ip.domain.org.com:31010]

  (org.apache.hive.service.cli.HiveSQLException) Error while compiling 
statement: FAILED: ParseException line 2:4 cannot recognize input near '.' 
'default' '.' in join source
    org.apache.hive.jdbc.Utils.verifySuccess():267
    org.apache.hive.jdbc.Utils.verifySuccessWithInfo():253
    org.apache.hive.jdbc.HiveStatement.runAsyncOnServer():313
    org.apache.hive.jdbc.HiveStatement.execute():253
    org.apache.hive.jdbc.HiveStatement.executeQuery():476
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.drill.exec.store.jdbc.JdbcRecordReader.setup():177
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():104
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():126
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():40
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():33
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():156
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():136
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRootExec():109
    org.apache.drill.exec.physical.impl.ImplCreator.getExec():87
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():207
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (org.apache.hive.service.cli.HiveSQLException) Error while 
compiling statement: FAILED: ParseException line 2:4 cannot recognize input 
near '.' 'default' '.' in join source
    org.apache.hive.service.cli.operation.Operation.toSQLException():380
    org.apache.hive.service.cli.operation.SQLOperation.prepare():206
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.RuntimeException) 
org.apache.hadoop.hive.ql.parse.ParseException:line 2:4 cannot recognize input 
near '.' 'default' '.' in join source
    org.apache.hadoop.hive.ql.parse.ParseDriver.parse():207
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
    org.apache.hadoop.hive.ql.Driver.compile():468
    org.apache.hadoop.hive.ql.Driver.compileInternal():1316
    org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
    org.apache.hive.service.cli.operation.SQLOperation.prepare():204
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748 (state=,code=0)
0: jdbc:drill:zk=local> select * from hive2.`default`.contact;
Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
SQL query.

sql SELECT *
FROM.default.contact
plugin hive2
Fragment 0:0

[Error Id: 7b7aae92-fabd-49dd-ab84-15ca00189882 on ip-ip.domain.org.com:31010]

  (org.apache.hive.service.cli.HiveSQLException) Error while compiling 
statement: FAILED: ParseException line 2:4 cannot recognize input near '.' 
'default' '.' in join source
    org.apache.hive.jdbc.Utils.verifySuccess():267
    org.apache.hive.jdbc.Utils.verifySuccessWithInfo():253
    org.apache.hive.jdbc.HiveStatement.runAsyncOnServer():313
    org.apache.hive.jdbc.HiveStatement.execute():253
    org.apache.hive.jdbc.HiveStatement.executeQuery():476
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.drill.exec.store.jdbc.JdbcRecordReader.setup():177
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():104
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():126
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():40
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():33
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():156
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():136
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRootExec():109
    org.apache.drill.exec.physical.impl.ImplCreator.getExec():87
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():207
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (org.apache.hive.service.cli.HiveSQLException) Error while 
compiling statement: FAILED: ParseException line 2:4 cannot recognize input 
near '.' 'default' '.' in join source
    org.apache.hive.service.cli.operation.Operation.toSQLException():380
    org.apache.hive.service.cli.operation.SQLOperation.prepare():206
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.RuntimeException) 
org.apache.hadoop.hive.ql.parse.ParseException:line 2:4 cannot recognize input 
near '.' 'default' '.' in join source
    org.apache.hadoop.hive.ql.parse.ParseDriver.parse():207
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
    org.apache.hadoop.hive.ql.Driver.compile():468
    org.apache.hadoop.hive.ql.Driver.compileInternal():1316
    org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
    org.apache.hive.service.cli.operation.SQLOperation.prepare():204
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748 (state=,code=0)
0: jdbc:drill:zk=local>


0: jdbc:drill:zk=local> SELECT count(*) FROM hive2.`contact`;
Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
SQL query.

sql SELECT COUNT(*) AS EXPR$0
FROM (SELECT 0 AS $f0
FROM.default.contact) AS t
plugin hive2
Fragment 0:0

[Error Id: ac5cc8f2-69a4-455e-b1a6-5d22cde99729 on 
ip-myip.mydomain.orghere.com:31010]

  (org.apache.hive.service.cli.HiveSQLException) Error while compiling 
statement: FAILED: ParseException line 1:23 missing EOF at '$' near 'EXPR'
    org.apache.hive.jdbc.Utils.verifySuccess():267
    org.apache.hive.jdbc.Utils.verifySuccessWithInfo():253
    org.apache.hive.jdbc.HiveStatement.runAsyncOnServer():313
    org.apache.hive.jdbc.HiveStatement.execute():253
    org.apache.hive.jdbc.HiveStatement.executeQuery():476
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
    org.apache.drill.exec.store.jdbc.JdbcRecordReader.setup():177
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():104
    org.apache.drill.exec.physical.impl.ScanBatch.<init>():126
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():40
    org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():33
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():156
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():136
    org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
    org.apache.drill.exec.physical.impl.ImplCreator.getRootExec():109
    org.apache.drill.exec.physical.impl.ImplCreator.getExec():87
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():207
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (org.apache.hive.service.cli.HiveSQLException) Error while 
compiling statement: FAILED: ParseException line 1:23 missing EOF at '$' near 
'EXPR'
    org.apache.hive.service.cli.operation.Operation.toSQLException():380
    org.apache.hive.service.cli.operation.SQLOperation.prepare():206
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.RuntimeException) 
org.apache.hadoop.hive.ql.parse.ParseException:line 1:23 missing EOF at '$' 
near 'EXPR'
    org.apache.hadoop.hive.ql.parse.ParseDriver.parse():215
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
    org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
    org.apache.hadoop.hive.ql.Driver.compile():468
    org.apache.hadoop.hive.ql.Driver.compileInternal():1316
    org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
    org.apache.hive.service.cli.operation.SQLOperation.prepare():204
    org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
    org.apache.hive.service.cli.operation.Operation.run():320
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
    org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
    sun.reflect.GeneratedMethodAccessor86.invoke():-1
    sun.reflect.DelegatingMethodAccessorImpl.invoke():43
    java.lang.reflect.Method.invoke():498
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
    org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
    org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1807
    org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
    com.sun.proxy.$Proxy42.executeStatementAsync():-1
    org.apache.hive.service.cli.CLIService.executeStatementAsync():310
    org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
    org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
    org.apache.thrift.ProcessFunction.process():39
    org.apache.thrift.TBaseProcessor.process():39
    org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
    org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748 (state=,code=0)
0: jdbc:drill:zk=local>


{
    "id": {
        "part1": 2738948259170077000,
        "part2": -61360705522985576
    },
    "type": 1,
    "start": 1509772561834,
    "end": 1509772562065,
    "query": "SELECT count(*) FROM hive2.`contact`",
    "plan": "00-00    Screen : rowType = RecordType(BIGINT EXPR$0): rowcount = 
1.0, cumulative cost = {1.1 rows, 1.1 cpu, 0.0 io, 0.0 network, 0.0 memory}, id 
= 882\n00-01      Project(EXPR$0=[$0]) : rowType = RecordType(BIGINT EXPR$0): 
rowcount = 1.0, cumulative cost = {1.0 rows, 1.0 cpu, 0.0 io, 0.0 network, 0.0 
memory}, id = 881\n00-02        Jdbc(sql=[SELECT COUNT(*) AS EXPR$0\nFROM 
(SELECT 0 AS $f0\nFROM.default.contact) AS t]) : rowType = RecordType(BIGINT 
EXPR$0): rowcount = 1.0, cumulative cost = {1.0 rows, 1.0 cpu, 0.0 io, 0.0 
network, 0.0 memory}, id = 846\n",
    "foreman": {
        "address": "ip-myip.mydomain.orghere.com",
        "userPort": 31010,
        "controlPort": 31011,
        "dataPort": 31012,
        "version": "1.11.0"
    },
    "state": 4,
    "totalFragments": 1,
    "finishedFragments": 0,
    "fragmentProfile": [
        {
            "majorFragmentId": 0,
            "minorFragmentProfile": [
                {
                    "state": 0,
                    "minorFragmentId": 0,
                    "endpoint": {
                        "address": "ip-myip.mydomain.orghere.com",
                        "userPort": 31010,
                        "controlPort": 31011,
                        "dataPort": 31012,
                        "version": "1.11.0"
                    },
                    "lastUpdate": 1509772562054,
                    "lastProgress": 1509772562054
                }
            ]
        }
    ],
    "user": "anonymous",
    "error": "DATA_READ ERROR: Drill Remote Exception\n\n",
    "verboseError": "DATA_READ ERROR: Drill Remote Exception\n\n\n\n",
    "errorId": "fd38ddaa-53d3-46cb-b8e1-4bc8fef458d4",
    "errorNode": "ip-myip.mydomain.orghere.com:31010",
    "optionsJson": "[ {\n  \"kind\" : \"BOOLEAN\",\n  \"type\" : \"SESSION\",\n 
 \"name\" : \"exec.errors.verbose\",\n  \"bool_val\" : true\n} ]",
    "planEnd": 1509772562051,
    "queueWaitEnd": 0
}



0: jdbc:drill:zk=local> SELECT gender FROM hive2.`contact` limit 1;
Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
SQL query.

sql SELECT *
FROM.default.contact
plugin hive2
Fragment 0:0


To show that Drill can connect and read the table's metadata, here is the query plan for that statement, which lists all columns of the table:


00-00    Screen : rowType = RecordType(VARCHAR(1) gender): rowcount = 1.0, 
cumulative cost = {103.1 rows, 109.1 cpu, 0.0 io, 0.0 network, 0.0 memory}, id 
= 354
00-01      Project(gender=[$0]) : rowType = RecordType(VARCHAR(1) gender): 
rowcount = 1.0, cumulative cost = {103.0 rows, 109.0 cpu, 0.0 io, 0.0 network, 
0.0 memory}, id = 353
00-02        SelectionVectorRemover : rowType = RecordType(VARCHAR(1) gender): 
rowcount = 1.0, cumulative cost = {103.0 rows, 109.0 cpu, 0.0 io, 0.0 network, 
0.0 memory}, id = 352
00-03          Limit(fetch=[1]) : rowType = RecordType(VARCHAR(1) gender): 
rowcount = 1.0, cumulative cost = {102.0 rows, 108.0 cpu, 0.0 io, 0.0 network, 
0.0 memory}, id = 351
00-04            Limit(fetch=[1]) : rowType = RecordType(VARCHAR(1) gender): 
rowcount = 1.0, cumulative cost = {101.0 rows, 104.0 cpu, 0.0 io, 0.0 network, 
0.0 memory}, id = 350
00-05              Project(gender=[$9]) : rowType = RecordType(VARCHAR(1) 
gender): rowcount = 100.0, cumulative cost = {100.0 rows, 100.0 cpu, 0.0 io, 
0.0 network, 0.0 memory}, id = 349
00-06                Jdbc(sql=[SELECT *
FROM.default.contact]) : rowType = RecordType(VARCHAR(50) contact_id, 
VARCHAR(50) first_name, VARCHAR(50) last_name, VARCHAR(100) address_line_1, 
VARCHAR(100) address_line_2, VARCHAR(50) city, VARCHAR(10) postcode, 
VARCHAR(100) email, DATE dob, VARCHAR(1) gender, VARCHAR(1) marital_status, 
VARCHAR(20) tfn): rowcount = 100.0, cumulative cost = {100.0 rows, 100.0 cpu, 
0.0 io, 0.0 network, 0.0 memory}, id = 282


The Beeline connection below shows that the table exists and contains data:

Beeline version 2.3.0 by Apache Hive
0: jdbc:hive2://localhost:10000/default> show create table contact;
+----------------------------------------------------+
|                   createtab_stmt                   |
+----------------------------------------------------+
| CREATE TABLE `contact`(                            |
|   `contact_id` varchar(50),                        |
|   `first_name` varchar(50),                        |
|   `last_name` varchar(50),                         |
|   `address_line_1` varchar(100),                   |
|   `address_line_2` varchar(100),                   |
|   `city` varchar(50),                              |
|   `postcode` varchar(10),                          |
|   `email` varchar(100),                            |
|   `dob` date,                                      |
|   `gender` varchar(1),                             |
|   `marital_status` varchar(1),                     |
|   `tfn` varchar(20))                               |
| ROW FORMAT SERDE                                   |
|   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'  |
| STORED AS INPUTFORMAT                              |
|   'org.apache.hadoop.mapred.TextInputFormat'       |
| OUTPUTFORMAT                                       |
|   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' |
| LOCATION                                           |
|   'file:/home/ec2-user/warehouse/contact'          |
| TBLPROPERTIES (                                    |
|   'transient_lastDdlTime'='1509487563')            |
+----------------------------------------------------+
23 rows selected (0.361 seconds)
0: jdbc:hive2://localhost:10000/default> select * from contact;
+---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
| contact.contact_id  | contact.first_name  | contact.last_name  | contact.address_line_1  | contact.address_line_2  | contact.city  | contact.postcode  |   contact.email    | contact.dob  | contact.gender  | contact.marital_status  | contact.tfn  |
+---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
| C1                  | Bob                 | B                  | St                      |                         | Ta            | 2100              | b...@b.com          | 2020-03-01   | M               | M                       | 12           |
+---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
1 rows selected (0.156 seconds)
0: jdbc:hive2://localhost:10000/default>

{code}
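
Since Beeline against the same HiveServer2 works (see the end of the log above), the statements Drill pushes down can be replayed there directly to confirm that the failure is a HiveQL parse error rather than a connectivity or permissions problem. This is an assumed reproduction step rather than something taken from the log; the statements and the expected error messages are copied verbatim from the Drill error output.

{code:sql}
-- Assumed reproduction steps: run in Beeline against jdbc:hive2://host:10000/default.
-- Both statements are the exact SQL Drill pushed down, per the error output above.

-- Expected failure: ParseException ... missing EOF at '$' near 'EXPR'
SELECT COUNT(*) AS EXPR$0
FROM (SELECT 0 AS $f0
FROM.default.contact) AS t;

-- Expected failure: ParseException ... cannot recognize input near '.' 'default' '.' in join source
SELECT *
FROM.default.contact;
{code}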
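
A hedged reading of the two ParseExceptions: the SQL generated by Drill's JDBC storage plugin starts the table reference with a bare dot (an empty leading qualifier in FROM.default.contact) and uses Calcite-generated aliases containing '$' (EXPR$0, $f0), and HiveServer2's parser rejects both. For contrast, the sketch below rewrites the same aggregate as HiveQL that the parser would accept; it is illustrative only and is not the SQL Drill emits.

{code:sql}
-- Illustrative only: the same aggregate written as HiveQL that Hive 2.3 can parse.
-- Differences from the generated SQL: the table reference has no empty leading
-- qualifier, and the generated aliases contain no '$' characters.
SELECT COUNT(*) AS EXPR0
FROM (SELECT 0 AS f0
      FROM `default`.contact) t;
{code}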


> HiveServer2 queries throw error with jdbc connection
> ----------------------------------------------------
>
>                 Key: DRILL-5932
>                 URL: https://issues.apache.org/jira/browse/DRILL-5932
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Client - JDBC, Storage - Hive
>    Affects Versions: 1.11.0
>         Environment: Linux, Hive 2.3
>            Reporter: tooptoop4
>            Priority: Blocker
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (java.lang.RuntimeException) 
> org.apache.hadoop.hive.ql.parse.ParseException:line 2:4 cannot recognize 
> input near '.' 'default' '.' in join source
>     org.apache.hadoop.hive.ql.parse.ParseDriver.parse():207
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
>     org.apache.hadoop.hive.ql.Driver.compile():468
>     org.apache.hadoop.hive.ql.Driver.compileInternal():1316
>     org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
>     org.apache.hive.service.cli.operation.SQLOperation.prepare():204
>     org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
>     org.apache.hive.service.cli.operation.Operation.run():320
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
>     sun.reflect.GeneratedMethodAccessor86.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
>     org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
>     org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
>     java.security.AccessController.doPrivileged():-2
>     javax.security.auth.Subject.doAs():422
>     org.apache.hadoop.security.UserGroupInformation.doAs():1807
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
>     com.sun.proxy.$Proxy42.executeStatementAsync():-1
>     org.apache.hive.service.cli.CLIService.executeStatementAsync():310
>     org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748 (state=,code=0)
> 0: jdbc:drill:zk=local> select * from hive2.`default`.contact;
> Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
> SQL query.
> sql SELECT *
> FROM.default.contact
> plugin hive2
> Fragment 0:0
> [Error Id: 7b7aae92-fabd-49dd-ab84-15ca00189882 on ip-ip.domain.org.com:31010]
>   (org.apache.hive.service.cli.HiveSQLException) Error while compiling 
> statement: FAILED: ParseException line 2:4 cannot recognize input near '.' 
> 'default' '.' in join source
>     org.apache.hive.jdbc.Utils.verifySuccess():267
>     org.apache.hive.jdbc.Utils.verifySuccessWithInfo():253
>     org.apache.hive.jdbc.HiveStatement.runAsyncOnServer():313
>     org.apache.hive.jdbc.HiveStatement.execute():253
>     org.apache.hive.jdbc.HiveStatement.executeQuery():476
>     org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
>     org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
>     org.apache.drill.exec.store.jdbc.JdbcRecordReader.setup():177
>     org.apache.drill.exec.physical.impl.ScanBatch.<init>():104
>     org.apache.drill.exec.physical.impl.ScanBatch.<init>():126
>     org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():40
>     org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():33
>     org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():156
>     org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
>     org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():136
>     org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
>     org.apache.drill.exec.physical.impl.ImplCreator.getRootExec():109
>     org.apache.drill.exec.physical.impl.ImplCreator.getExec():87
>     org.apache.drill.exec.work.fragment.FragmentExecutor.run():207
>     org.apache.drill.common.SelfCleaningRunnable.run():38
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.hive.service.cli.HiveSQLException) Error while 
> compiling statement: FAILED: ParseException line 2:4 cannot recognize input 
> near '.' 'default' '.' in join source
>     org.apache.hive.service.cli.operation.Operation.toSQLException():380
>     org.apache.hive.service.cli.operation.SQLOperation.prepare():206
>     org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
>     org.apache.hive.service.cli.operation.Operation.run():320
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
>     sun.reflect.GeneratedMethodAccessor86.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
>     org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
>     org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
>     java.security.AccessController.doPrivileged():-2
>     javax.security.auth.Subject.doAs():422
>     org.apache.hadoop.security.UserGroupInformation.doAs():1807
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
>     com.sun.proxy.$Proxy42.executeStatementAsync():-1
>     org.apache.hive.service.cli.CLIService.executeStatementAsync():310
>     org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (java.lang.RuntimeException) 
> org.apache.hadoop.hive.ql.parse.ParseException:line 2:4 cannot recognize 
> input near '.' 'default' '.' in join source
>     org.apache.hadoop.hive.ql.parse.ParseDriver.parse():207
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
>     org.apache.hadoop.hive.ql.Driver.compile():468
>     org.apache.hadoop.hive.ql.Driver.compileInternal():1316
>     org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
>     org.apache.hive.service.cli.operation.SQLOperation.prepare():204
>     org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
>     org.apache.hive.service.cli.operation.Operation.run():320
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
>     sun.reflect.GeneratedMethodAccessor86.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
>     org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
>     org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
>     java.security.AccessController.doPrivileged():-2
>     javax.security.auth.Subject.doAs():422
>     org.apache.hadoop.security.UserGroupInformation.doAs():1807
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
>     com.sun.proxy.$Proxy42.executeStatementAsync():-1
>     org.apache.hive.service.cli.CLIService.executeStatementAsync():310
>     org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748 (state=,code=0)
> 0: jdbc:drill:zk=local>
> 0: jdbc:drill:zk=local> SELECT count(*) FROM hive2.`contact`;
> Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
> SQL query.
> sql SELECT COUNT(*) AS EXPR$0
> FROM (SELECT 0 AS $f0
> FROM.default.contact) AS t
> plugin hive2
> Fragment 0:0
> [Error Id: ac5cc8f2-69a4-455e-b1a6-5d22cde99729 on 
> ip-myip.mydomain.orghere.com:31010]
>   (org.apache.hive.service.cli.HiveSQLException) Error while compiling 
> statement: FAILED: ParseException line 1:23 missing EOF at '$' near 'EXPR'
>     org.apache.hive.jdbc.Utils.verifySuccess():267
>     org.apache.hive.jdbc.Utils.verifySuccessWithInfo():253
>     org.apache.hive.jdbc.HiveStatement.runAsyncOnServer():313
>     org.apache.hive.jdbc.HiveStatement.execute():253
>     org.apache.hive.jdbc.HiveStatement.executeQuery():476
>     org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
>     org.apache.commons.dbcp.DelegatingStatement.executeQuery():208
>     org.apache.drill.exec.store.jdbc.JdbcRecordReader.setup():177
>     org.apache.drill.exec.physical.impl.ScanBatch.<init>():104
>     org.apache.drill.exec.physical.impl.ScanBatch.<init>():126
>     org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():40
>     org.apache.drill.exec.store.jdbc.JdbcBatchCreator.getBatch():33
>     org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():156
>     org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
>     org.apache.drill.exec.physical.impl.ImplCreator.getRecordBatch():136
>     org.apache.drill.exec.physical.impl.ImplCreator.getChildren():179
>     org.apache.drill.exec.physical.impl.ImplCreator.getRootExec():109
>     org.apache.drill.exec.physical.impl.ImplCreator.getExec():87
>     org.apache.drill.exec.work.fragment.FragmentExecutor.run():207
>     org.apache.drill.common.SelfCleaningRunnable.run():38
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (org.apache.hive.service.cli.HiveSQLException) Error while 
> compiling statement: FAILED: ParseException line 1:23 missing EOF at '$' near 
> 'EXPR'
>     org.apache.hive.service.cli.operation.Operation.toSQLException():380
>     org.apache.hive.service.cli.operation.SQLOperation.prepare():206
>     org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
>     org.apache.hive.service.cli.operation.Operation.run():320
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
>     sun.reflect.GeneratedMethodAccessor86.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
>     org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
>     org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
>     java.security.AccessController.doPrivileged():-2
>     javax.security.auth.Subject.doAs():422
>     org.apache.hadoop.security.UserGroupInformation.doAs():1807
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
>     com.sun.proxy.$Proxy42.executeStatementAsync():-1
>     org.apache.hive.service.cli.CLIService.executeStatementAsync():310
>     org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748
>   Caused By (java.lang.RuntimeException) 
> org.apache.hadoop.hive.ql.parse.ParseException:line 1:23 missing EOF at '$' 
> near 'EXPR'
>     org.apache.hadoop.hive.ql.parse.ParseDriver.parse():215
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():77
>     org.apache.hadoop.hive.ql.parse.ParseUtils.parse():70
>     org.apache.hadoop.hive.ql.Driver.compile():468
>     org.apache.hadoop.hive.ql.Driver.compileInternal():1316
>     org.apache.hadoop.hive.ql.Driver.compileAndRespond():1294
>     org.apache.hive.service.cli.operation.SQLOperation.prepare():204
>     org.apache.hive.service.cli.operation.SQLOperation.runInternal():290
>     org.apache.hive.service.cli.operation.Operation.run():320
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal():530
>     
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync():517
>     sun.reflect.GeneratedMethodAccessor86.invoke():-1
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():498
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():78
>     org.apache.hive.service.cli.session.HiveSessionProxy.access$000():36
>     org.apache.hive.service.cli.session.HiveSessionProxy$1.run():63
>     java.security.AccessController.doPrivileged():-2
>     javax.security.auth.Subject.doAs():422
>     org.apache.hadoop.security.UserGroupInformation.doAs():1807
>     org.apache.hive.service.cli.session.HiveSessionProxy.invoke():59
>     com.sun.proxy.$Proxy42.executeStatementAsync():-1
>     org.apache.hive.service.cli.CLIService.executeStatementAsync():310
>     org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement():530
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1437
>     
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult():1422
>     org.apache.thrift.ProcessFunction.process():39
>     org.apache.thrift.TBaseProcessor.process():39
>     org.apache.hive.service.auth.TSetIpAddressProcessor.process():56
>     org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run():286
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1149
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():624
>     java.lang.Thread.run():748 (state=,code=0)
> 0: jdbc:drill:zk=local>
> {
>     "id": {
>         "part1": 2738948259170077000,
>         "part2": -61360705522985576
>     },
>     "type": 1,
>     "start": 1509772561834,
>     "end": 1509772562065,
>     "query": "SELECT count(*) FROM hive2.`contact`",
>     "plan": "00-00    Screen : rowType = RecordType(BIGINT EXPR$0): rowcount 
> = 1.0, cumulative cost = {1.1 rows, 1.1 cpu, 0.0 io, 0.0 network, 0.0 
> memory}, id = 882\n00-01      Project(EXPR$0=[$0]) : rowType = 
> RecordType(BIGINT EXPR$0): rowcount = 1.0, cumulative cost = {1.0 rows, 1.0 
> cpu, 0.0 io, 0.0 network, 0.0 memory}, id = 881\n00-02        
> Jdbc(sql=[SELECT COUNT(*) AS EXPR$0\nFROM (SELECT 0 AS 
> $f0\nFROM.default.contact) AS t]) : rowType = RecordType(BIGINT EXPR$0): 
> rowcount = 1.0, cumulative cost = {1.0 rows, 1.0 cpu, 0.0 io, 0.0 network, 
> 0.0 memory}, id = 846\n",
>     "foreman": {
>         "address": "ip-myip.mydomain.orghere.com",
>         "userPort": 31010,
>         "controlPort": 31011,
>         "dataPort": 31012,
>         "version": "1.11.0"
>     },
>     "state": 4,
>     "totalFragments": 1,
>     "finishedFragments": 0,
>     "fragmentProfile": [
>         {
>             "majorFragmentId": 0,
>             "minorFragmentProfile": [
>                 {
>                     "state": 0,
>                     "minorFragmentId": 0,
>                     "endpoint": {
>                         "address": "ip-myip.mydomain.orghere.com",
>                         "userPort": 31010,
>                         "controlPort": 31011,
>                         "dataPort": 31012,
>                         "version": "1.11.0"
>                     },
>                     "lastUpdate": 1509772562054,
>                     "lastProgress": 1509772562054
>                 }
>             ]
>         }
>     ],
>     "user": "anonymous",
>     "error": "DATA_READ ERROR: Drill Remote Exception\n\n",
>     "verboseError": "DATA_READ ERROR: Drill Remote Exception\n\n\n\n",
>     "errorId": "fd38ddaa-53d3-46cb-b8e1-4bc8fef458d4",
>     "errorNode": "ip-myip.mydomain.orghere.com:31010",
>     "optionsJson": "[ {\n  \"kind\" : \"BOOLEAN\",\n  \"type\" : 
> \"SESSION\",\n  \"name\" : \"exec.errors.verbose\",\n  \"bool_val\" : true\n} 
> ]",
>     "planEnd": 1509772562051,
>     "queueWaitEnd": 0
> }
> 0: jdbc:drill:zk=local> SELECT gender FROM hive2.`contact` limit 1;
> Error: DATA_READ ERROR: The JDBC storage plugin failed while trying setup the 
> SQL query.
> sql SELECT *
> FROM.default.contact
> plugin hive2
> Fragment 0:0
> To prove that it can connect, here is the query plan, showing all columns 
> from the table:
> 00-00    Screen : rowType = RecordType(VARCHAR(1) gender): rowcount = 1.0, 
> cumulative cost = {103.1 rows, 109.1 cpu, 0.0 io, 0.0 network, 0.0 memory}, 
> id = 354
> 00-01      Project(gender=[$0]) : rowType = RecordType(VARCHAR(1) gender): 
> rowcount = 1.0, cumulative cost = {103.0 rows, 109.0 cpu, 0.0 io, 0.0 
> network, 0.0 memory}, id = 353
> 00-02        SelectionVectorRemover : rowType = RecordType(VARCHAR(1) 
> gender): rowcount = 1.0, cumulative cost = {103.0 rows, 109.0 cpu, 0.0 io, 
> 0.0 network, 0.0 memory}, id = 352
> 00-03          Limit(fetch=[1]) : rowType = RecordType(VARCHAR(1) gender): 
> rowcount = 1.0, cumulative cost = {102.0 rows, 108.0 cpu, 0.0 io, 0.0 
> network, 0.0 memory}, id = 351
> 00-04            Limit(fetch=[1]) : rowType = RecordType(VARCHAR(1) gender): 
> rowcount = 1.0, cumulative cost = {101.0 rows, 104.0 cpu, 0.0 io, 0.0 
> network, 0.0 memory}, id = 350
> 00-05              Project(gender=[$9]) : rowType = RecordType(VARCHAR(1) 
> gender): rowcount = 100.0, cumulative cost = {100.0 rows, 100.0 cpu, 0.0 io, 
> 0.0 network, 0.0 memory}, id = 349
> 00-06                Jdbc(sql=[SELECT *
> FROM.default.contact]) : rowType = RecordType(VARCHAR(50) contact_id, 
> VARCHAR(50) first_name, VARCHAR(50) last_name, VARCHAR(100) address_line_1, 
> VARCHAR(100) address_line_2, VARCHAR(50) city, VARCHAR(10) postcode, 
> VARCHAR(100) email, DATE dob, VARCHAR(1) gender, VARCHAR(1) marital_status, 
> VARCHAR(20) tfn): rowcount = 100.0, cumulative cost = {100.0 rows, 100.0 cpu, 
> 0.0 io, 0.0 network, 0.0 memory}, id = 282
> The beeline session below proves there is data in the table:
> Beeline version 2.3.0 by Apache Hive
> 0: jdbc:hive2://localhost:10000/default> show create table contact;
> +----------------------------------------------------+
> |                   createtab_stmt                   |
> +----------------------------------------------------+
> | CREATE TABLE `contact`(                            |
> |   `contact_id` varchar(50),                        |
> |   `first_name` varchar(50),                        |
> |   `last_name` varchar(50),                         |
> |   `address_line_1` varchar(100),                   |
> |   `address_line_2` varchar(100),                   |
> |   `city` varchar(50),                              |
> |   `postcode` varchar(10),                          |
> |   `email` varchar(100),                            |
> |   `dob` date,                                      |
> |   `gender` varchar(1),                             |
> |   `marital_status` varchar(1),                     |
> |   `tfn` varchar(20))                               |
> | ROW FORMAT SERDE                                   |
> |   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'  |
> | STORED AS INPUTFORMAT                              |
> |   'org.apache.hadoop.mapred.TextInputFormat'       |
> | OUTPUTFORMAT                                       |
> |   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' |
> | LOCATION                                           |
> |   'file:/home/ec2-user/warehouse/contact'          |
> | TBLPROPERTIES (                                    |
> |   'transient_lastDdlTime'='1509487563')            |
> +----------------------------------------------------+
> 23 rows selected (0.361 seconds)
> 0: jdbc:hive2://localhost:10000/default> select * from contact;
> +---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
> | contact.contact_id  | contact.first_name  | contact.last_name  | contact.address_line_1  | contact.address_line_2  | contact.city  | contact.postcode  |   contact.email    | contact.dob  | contact.gender  | contact.marital_status  | contact.tfn  |
> +---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
> | C1                  | Bob                 | B                  | St                      |                         | Ta            | 2100              | b...@b.com          | 2020-03-01   | M               | M                       | 12           |
> +---------------------+---------------------+--------------------+-------------------------+-------------------------+---------------+-------------------+--------------------+--------------+-----------------+-------------------------+--------------+
> 1 rows selected (0.156 seconds)
> 0: jdbc:hive2://localhost:10000/default>
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
