Hi all,

I'm attempting to read a nested JSON file compressed with Snappy using the 
following query:

select * from hdfs.`/tmp/test/part-r-01195.json.snappy` limit 20

However, I'm encountering the following error:

2016-06-17 09:36:29,385 [USER-rpc-event-queue] ERROR o.a.d.exec.server.rest.QueryWrapper - Query Failed
org.apache.drill.common.exceptions.UserRemoteException: DATA_READ ERROR: Failure reading JSON file - native snappy library not available: this version of libhadoop was built without snappy support.

File  /tmp/test/part-r-01195.json.snappy
Record  1
Fragment 0:0

[Error Id: 4d50001a-4449-4614-b647-686f867ff85e on MSR-HDP-DN0300.redmond.corp.microsoft.com:31010]
        at org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:119) [drill-java-exec-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:113) [drill-java-exec-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:46) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:31) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:67) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:374) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:252) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:285) [drill-rpc-1.6.0.jar:1.6.0]
        at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:257) [drill-rpc-1.6.0.jar:1.6.0]
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) [netty-handler-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
        at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:618) [netty-transport-native-epoll-4.0.27.Final-linux-x86_64.jar:na]
        at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:329) [netty-transport-native-epoll-4.0.27.Final-linux-x86_64.jar:na]
        at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:250) [netty-transport-native-epoll-4.0.27.Final-linux-x86_64.jar:na]
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) [netty-common-4.0.27.Final.jar:4.0.27.Final]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
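
My guess is that the Drillbit JVM simply isn't finding the cluster's native Hadoop/Snappy libraries. Is pointing java.library.path at them in conf/drill-env.sh the right way to fix that? Something like the following is what I had in mind, though I haven't confirmed that DRILL_JAVA_OPTS is the right knob or that the HDP native directory (taken from the checknative output further down) is the one Drill should use:

export DRILL_JAVA_OPTS="$DRILL_JAVA_OPTS -Djava.library.path=/usr/hdp/2.3.4.0-3485/hadoop/lib/native"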

Our Hadoop cluster supports Snappy compression: I can read Snappy files with the `hadoop fs -text` command, and `hadoop checknative -a` shows that Snappy is available:

16/06/17 09:49:38 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
16/06/17 09:49:38 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libhadoop.so.1.0.0
zlib:    true /lib/x86_64-linux-gnu/libz.so.1
snappy:  true /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /lib/x86_64-linux-gnu/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
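
For reference, the commands I used for those checks were roughly:

hadoop fs -text /tmp/test/part-r-01195.json.snappy
hadoop checknative -a

The first one prints the JSON records without any problem.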

I'd really appreciate any suggestions you can give.

Thanks,
Ben
