Hi Team,

I need your help resolving an issue I have been struggling with for many days.

I am getting the error (ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)) while trying to connect Drill to Hive. Hive is running on Microsoft Azure HDInsight with a remote metastore (MS SQL Server), and Drill is running on a separate VM in the same VNet as the cluster. I was able to create the Drill storage plugin with the configuration below:


{
  "type": "hive",
  "enabled": true,
  "configProps": {
    "hive.metastore.uris": "thrift://hn0-xyz.cloudapp.net:9083,thrift://hn1-xyz.cloudapp.net:9083",
    "hive.metastore.warehouse.dir": "/hive/warehouse",
    "fs.default.name": "wasb://qwerty@demo.blob.core.windows.net",
    "hive.metastore.sasl.enabled": "false"
  }
}
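Since a broken pipe usually points at a connection being dropped between the client and the metastore, a basic sanity check is to confirm the Thrift port is even reachable from the Drill VM. Below is a small sketch for that; the hostnames and port are the ones from the plugin config above, not verified values.

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a plain TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failures, timeouts, and connection refusals.
        return False

# Metastore endpoints taken from the storage-plugin config above.
for host in ("hn0-xyz.cloudapp.net", "hn1-xyz.cloudapp.net"):
    print(host, can_connect(host, 9083))
```

Note this only proves TCP reachability; a firewall or gateway that accepts the connection but drops it mid-request would still produce the broken-pipe error.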



Stack trace of the error: please find it attached (also pasted below).


core-site.xml:

<configuration>
    <property>
        <name>fs.azure.account.keyprovider.kkhdistore.blob.core.windows.net</name>
        <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
    </property>
    <property>
        <name>fs.azure.shellkeyprovider.script</name>
        <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
    </property>
    <property>
        <name>fs.azure.account.key.kkhdistore.blob.core.windows.net</name>
        <value>{COPY FROM CLUSTER core-site.xml}</value>
    </property>
    <property>
        <name>fs.AbstractFileSystem.wasb.impl</name>
        <value>org.apache.hadoop.fs.azure.Wasb</value>
    </property>
</configuration>



Regards
Uday Sharma
[email protected]
0: jdbc:drill:zk=local> use hive;
17:57:19.515 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException 
java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 31 common frames omitted
17:57:19.523 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Converting exception to MetaException
17:57:19.524 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR 
hive.metastore - Unable to shutdown local metastore client
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
 ~[libfb303-0.9.2.jar:na]
        at 
com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430) 
~[libfb303-0.9.2.jar:na]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:215)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 31 common frames omitted
17:57:19.560 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:739)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:727)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:220)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
17:57:19.560 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Converting exception to MetaException
17:57:19.562 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException 
java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 31 common frames omitted
17:57:19.967 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Converting exception to MetaException
17:57:19.969 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR 
hive.metastore - Unable to shutdown local metastore client
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
 ~[libfb303-0.9.2.jar:na]
        at 
com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430) 
~[libfb303-0.9.2.jar:na]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:215)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 31 common frames omitted
17:57:20.026 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:739)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:727)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:220)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:62) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
17:57:20.026 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Converting exception to MetaException
17:57:20.065 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_121]
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 33 common frames omitted
17:57:20.466 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Converting exception to MetaException
17:57:20.468 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.metastore - Unable to shutdown local metastore client
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436) ~[libfb303-0.9.2.jar:na]
        at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430) ~[libfb303-0.9.2.jar:na]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:215) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_121]
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 33 common frames omitted
17:57:20.531 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:739) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:727) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:220) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.<init>(HiveSchemaFactory.java:133) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:118) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
17:57:20.923 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Converting exception to MetaException
17:57:20.925 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException 
java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 33 common frames omitted
17:57:20.925 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Converting exception to MetaException
17:57:20.933 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR 
hive.metastore - Unable to shutdown local metastore client
org.apache.thrift.transport.TTransportException: java.net.SocketException: 
Broken pipe (Write failed)
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
 ~[libfb303-0.9.2.jar:na]
        at 
com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430) 
~[libfb303-0.9.2.jar:na]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:215)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
 [guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
 [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) 
[guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) 
[guava-18.0.jar:na]
        at 
com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) 
[guava-18.0.jar:na]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97)
 [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) 
[drill-java-exec-1.9.0.jar:1.9.0]
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
[na:1.7.0_121]
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
[na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) 
~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) 
~[na:1.7.0_121]
        at 
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) 
~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) 
~[na:1.7.0_121]
        at 
org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 33 common frames omitted
17:57:21.517 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - 
Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
        at 
org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
 ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) 
~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:739)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:727)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031)
 ~[hive-metastore-1.2.1.jar:1.2.1]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:220)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489)
 [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at 
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchemaNames(HiveSchemaFactory.java:175) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.setHolder(HiveSchemaFactory.java:162) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory.registerSchemas(HiveSchemaFactory.java:120) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.HiveStoragePlugin.registerSchemas(HiveStoragePlugin.java:100) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas(StoragePluginRegistryImpl.java:365) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:72) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.SchemaTreeProvider.createRootSchema(SchemaTreeProvider.java:61) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:155) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getRootSchema(QueryContext.java:145) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema(QueryContext.java:131) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
17:57:21.517 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Converting exception to MetaException
17:57:21.526 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:123) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.calcite.jdbc.SimpleCalciteSchema.getSubSchema(SimpleCalciteSchema.java:112) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.calcite.jdbc.CalciteAbstractSchema$SchemaPlusImpl.getSubSchema(CalciteAbstractSchema.java:194) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.searchSchemaTree(SchemaUtilites.java:82) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.findSchema(SchemaUtilites.java:49) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.rpc.user.UserSession.setDefaultSchemaPath(UserSession.java:176) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_121]
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 30 common frames omitted
17:57:22.688 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Converting exception to MetaException
17:57:22.690 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.metastore - Unable to shutdown local metastore client
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436) ~[libfb303-0.9.2.jar:na]
        at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430) ~[libfb303-0.9.2.jar:na]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:215) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:123) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.calcite.jdbc.SimpleCalciteSchema.getSubSchema(SimpleCalciteSchema.java:112) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.calcite.jdbc.CalciteAbstractSchema$SchemaPlusImpl.getSubSchema(CalciteAbstractSchema.java:194) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.searchSchemaTree(SchemaUtilites.java:82) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.findSchema(SchemaUtilites.java:49) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.rpc.user.UserSession.setDefaultSchemaPath(UserSession.java:176) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
Caused by: java.net.SocketException: Broken pipe (Write failed)
        at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[na:1.7.0_121]
        at java.net.SocketOutputStream.write(SocketOutputStream.java:159) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_121]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_121]
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        ... 30 common frames omitted
17:57:22.716 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:739) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:727) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:220) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:489) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$DatabaseLoader.load(DrillHiveMetaStoreClient.java:482) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.get(LocalCache.java:3937) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941) [guava-18.0.jar:na]
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824) [guava-18.0.jar:na]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient$HiveClientWithCaching.getDatabases(DrillHiveMetaStoreClient.java:449) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:139) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.store.hive.schema.HiveSchemaFactory$HiveSchema.getSubSchema(HiveSchemaFactory.java:123) [drill-storage-hive-core-1.9.0.jar:1.9.0]
        at org.apache.calcite.jdbc.SimpleCalciteSchema.getSubSchema(SimpleCalciteSchema.java:112) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.calcite.jdbc.CalciteAbstractSchema$SchemaPlusImpl.getSubSchema(CalciteAbstractSchema.java:194) [calcite-core-1.4.0-drill-r19.jar:1.4.0-drill-r19]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.searchSchemaTree(SchemaUtilites.java:82) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.SchemaUtilites.findSchema(SchemaUtilites.java:49) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.rpc.user.UserSession.setDefaultSchemaPath(UserSession.java:176) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.handlers.UseSchemaHandler.getPlan(UseSchemaHandler.java:45) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPhysicalPlan(DrillSqlWorker.java:123) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:97) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:1008) [drill-java-exec-1.9.0.jar:1.9.0]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:264) [drill-java-exec-1.9.0.jar:1.9.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_121]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_121]
17:57:22.716 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Converting exception to MetaException
+-------+-----------------------------------+
|  ok   |              summary              |
+-------+-----------------------------------+
| true  | Default schema changed to [hive]  |
+-------+-----------------------------------+
1 row selected (3.379 seconds)
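
Note that the `USE hive` itself reports success even while every metastore call dies with a broken pipe, which suggests the Thrift connection to the metastore is being dropped mid-conversation (network path, NSG/firewall rule, or the metastore closing the socket) rather than a typo in the plugin JSON. Not a fix, but a quick first check from the Drill VM is to confirm plain TCP reachability of both metastore endpoints. A minimal sketch (hostnames and port 9083 taken from the storage-plugin config above; adjust to your environment):

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Endpoints from hive.metastore.uris in the plugin config.
    for host in ("hn0-xyz.cloudapp.net", "hn1-xyz.cloudapp.net"):
        print(host, can_connect(host, 9083))
```

If both endpoints report True, basic reachability is fine and the drop is more likely at the Thrift/metastore layer (for example a transport or SASL/security mismatch between the Drill client and the HDInsight metastore); if either reports False, start with VNet/NSG rules between the Drill VM and the head nodes.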
