The error seems to be thrown by HDFS when Drill tries to get the list of
file statuses. This might be caused by an HDFS version incompatibility:
per Drill's documentation [1], the HDFS 2.3+ API is required. Did you
say you are running HDFS 1.0.4?

"
Hadoop: All Hadoop distributions (HDFS API 2.3+),  ....
"

[1] https://drill.apache.org/faq/
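
If you want to reproduce this outside of Drill, here is a minimal sketch
(the class name is just illustrative; it assumes the Hadoop 2.x client
jars that Drill bundles are on the classpath, and it uses the NameNode
address from your plugin config). It makes the same listStatus call that
shows up in your stack trace, so against a 1.0.4 NameNode I would expect
it to fail with the same EOFException, since the Hadoop 2.x client
speaks a protobuf-based RPC that a 1.x NameNode does not understand:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListStatusCheck {
    public static void main(String[] args) throws Exception {
        // Connect with the Hadoop 2.x client; against the 1.0.4
        // NameNode the RPC handshake should die with an EOFException.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
            URI.create("hdfs://10.207.78.21:38234"), conf);

        // The same directory-listing RPC that Drill's
        // DrillFileSystem.listStatus() ends up making.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}

If this fails the same way, the fix is on the cluster side: per the FAQ
quoted above, Drill needs an HDFS cluster exposing the 2.3+ API.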


On Thu, Jun 16, 2016 at 10:24 AM, Tao,Mingyuan <taomingy...@baidu.com> wrote:
> This is my HDFS NameNode:
> NameNode '10.207.78.21:38234'
> Started: Mon Feb 02 19:16:43 CST 2015
> Version: 1.0.4, r1393290
>
> This is the config of my file system plugin `rpmp`:
> {
>   "type": "file",
>   "enabled": true,
>   "connection": "hdfs://10.207.78.21:38234",
>   "config": null,
>   "workspaces": {
>     "root": {
>       "location": "/",
>       "writable": false,
>       "defaultInputFormat": null
>     },
>     "tmp": {
>       "location": "/tmp",
>       "writable": true,
>       "defaultInputFormat": null
>     }
>   },
>   "formats": {
>     "json": {
>       "type": "json",
>       "extensions": [
>         "json"
>       ]
>     }
>   }
> }
>
> This is my test data:
> bash-4.3$ hadoop fs -cat /test
> {"key": "value"}
>
> And Drill (embedded mode) failed to execute the query:
> 0: jdbc:drill:zk=local> SELECT * FROM rpmp.`/test` LIMIT 20;
> SYSTEM ERROR: EOFException
>
>   (org.apache.drill.exec.work.foreman.ForemanException) Unexpected exception 
> during fragment initialization: Failed to create schema tree: End of File 
> Exception between local host is: "host1.com/10.95.112.80"; destination host 
> is: " host2.com":38234; : java.io.EOFException; For more details see:  
> http://wiki.apache.org/hadoop/EOFException
>     org.apache.drill.exec.work.foreman.Foreman.run():262
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1145
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():615
>     java.lang.Thread.run():745
>   Caused By (org.apache.drill.common.exceptions.DrillRuntimeException) Failed 
> to create schema tree: End of File Exception between local host is: 
> "host1.com/10.95.112.80"; destination host is: "host2.com":38234; : 
> java.io.EOFException; For more details see:  
> http://wiki.apache.org/hadoop/EOFException
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():169
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():151
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():139
>     org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema():125
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():59
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():927
>     org.apache.drill.exec.work.foreman.Foreman.run():251
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1145
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():615
>     java.lang.Thread.run():745
>   Caused By (java.io.EOFException) End of File Exception between local host 
> is: "host1.com/10.95.112.80"; destination host is: "host2.com":38234; : 
> java.io.EOFException; For more details see:  
> http://wiki.apache.org/hadoop/EOFException
>     sun.reflect.NativeConstructorAccessorImpl.newInstance0():-2
>     sun.reflect.NativeConstructorAccessorImpl.newInstance():57
>     sun.reflect.DelegatingConstructorAccessorImpl.newInstance():45
>     java.lang.reflect.Constructor.newInstance():526
>     org.apache.hadoop.net.NetUtils.wrapWithMessage():792
>     org.apache.hadoop.net.NetUtils.wrapException():765
>     org.apache.hadoop.ipc.Client.call():1480
>     org.apache.hadoop.ipc.Client.call():1407
>     org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke():229
>     com.sun.proxy.$Proxy76.getListing():-1
>     org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing():573
>     sun.reflect.NativeMethodAccessorImpl.invoke0():-2
>     sun.reflect.NativeMethodAccessorImpl.invoke():57
>     sun.reflect.DelegatingMethodAccessorImpl.invoke():43
>     java.lang.reflect.Method.invoke():606
>     org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod():187
>     org.apache.hadoop.io.retry.RetryInvocationHandler.invoke():102
>     com.sun.proxy.$Proxy77.getListing():-1
>     org.apache.hadoop.hdfs.DFSClient.listPaths():2094
>     org.apache.hadoop.hdfs.DFSClient.listPaths():2077
>     org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal():791
>     org.apache.hadoop.hdfs.DistributedFileSystem.access$700():106
>     org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall():853
>     org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall():849
>     org.apache.hadoop.fs.FileSystemLinkResolver.resolve():81
>     org.apache.hadoop.hdfs.DistributedFileSystem.listStatus():860
>     org.apache.drill.exec.store.dfs.DrillFileSystem.listStatus():523
>     org.apache.drill.exec.store.dfs.WorkspaceSchemaFactory.accessible():157
>     org.apache.drill.exec.store.dfs.FileSystemSchemaFactory$FileSystemSchema.<init>():78
>     org.apache.drill.exec.store.dfs.FileSystemSchemaFactory.registerSchemas():65
>     org.apache.drill.exec.store.dfs.FileSystemPlugin.registerSchemas():150
>     org.apache.drill.exec.store.StoragePluginRegistryImpl$DrillSchemaFactory.registerSchemas():365
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():162
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():151
>     org.apache.drill.exec.ops.QueryContext.getRootSchema():139
>     org.apache.drill.exec.ops.QueryContext.getNewDefaultSchema():125
>     org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():59
>     org.apache.drill.exec.work.foreman.Foreman.runSQL():927
>     org.apache.drill.exec.work.foreman.Foreman.run():251
>     java.util.concurrent.ThreadPoolExecutor.runWorker():1145
>     java.util.concurrent.ThreadPoolExecutor$Worker.run():615
>     java.lang.Thread.run():745
>   Caused By (java.io.EOFException) null
>     java.io.DataInputStream.readInt():392
>     org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse():1079
>     org.apache.hadoop.ipc.Client$Connection.run():974
>
>
> Regards
