L. C. Hsieh created SPARK-36681:
-----------------------------------

             Summary: Fail to load Snappy codec
                 Key: SPARK-36681
                 URL: https://issues.apache.org/jira/browse/SPARK-36681
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.0
            Reporter: L. C. Hsieh


snappy-java is a native library, so it must not be relocated by the Hadoop shaded client libraries. Spark currently uses the Hadoop shaded client libraries, so attempting to write a sequence file with SnappyCodec fails with the following error:

{code}
[info]   Cause: java.lang.UnsatisfiedLinkError: org.apache.hadoop.shaded.org.xerial.snappy.SnappyNative.rawCompress(Ljava/nio/ByteBuffer;IILjava/nio/ByteBuffer;I)I
[info]   at org.apache.hadoop.shaded.org.xerial.snappy.SnappyNative.rawCompress(Native Method)
[info]   at org.apache.hadoop.shaded.org.xerial.snappy.Snappy.compress(Snappy.java:151)
[info]   at org.apache.hadoop.io.compress.snappy.SnappyCompressor.compressDirectBuf(SnappyCompressor.java:282)
[info]   at org.apache.hadoop.io.compress.snappy.SnappyCompressor.compress(SnappyCompressor.java:210)
[info]   at org.apache.hadoop.io.compress.BlockCompressorStream.compress(BlockCompressorStream.java:149)
[info]   at org.apache.hadoop.io.compress.BlockCompressorStream.finish(BlockCompressorStream.java:142)
[info]   at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.writeBuffer(SequenceFile.java:1589)
[info]   at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.sync(SequenceFile.java:1605)
[info]   at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.close(SequenceFile.java:1629)
{code}
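For context, a minimal standalone sketch of the failure mechanism (not the Hadoop code itself, and the class/method names below are hypothetical stand-ins): when shading relocates a class that declares JNI methods, the native library never registers implementations under the relocated name, so the first call throws UnsatisfiedLinkError at runtime, exactly as in the trace above.

```java
public class ShadedNativeDemo {
    // Hypothetical native method with no registered implementation,
    // mirroring how relocation breaks snappy-java's JNI binding.
    private static native int rawCompress();

    // Calls the unbound native method and reports the resulting error.
    static String demo() {
        try {
            rawCompress();
            return "no error";
        } catch (UnsatisfiedLinkError e) {
            return "UnsatisfiedLinkError: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```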



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
