[ https://issues.apache.org/jira/browse/BEAM-11231?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17548981#comment-17548981 ]

Danny McCormick commented on BEAM-11231:
----------------------------------------

This issue has been migrated to https://github.com/apache/beam/issues/20591

> Better support for Hadoop native libs and compression codecs in Java Dataflow 
> batch worker
> ------------------------------------------------------------------------------------------
>
>                 Key: BEAM-11231
>                 URL: https://issues.apache.org/jira/browse/BEAM-11231
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-gcp, io-java-hadoop-file-system, 
> io-java-hadoop-format, io-java-hbase, runner-dataflow
>    Affects Versions: 2.24.0
>            Reporter: Cheng Li
>            Priority: P3
>              Labels: GCP
>
> Currently (as of Beam SDK 2.24.0) the Java Dataflow workers do not ship with 
> the [Hadoop native 
> library|https://hadoop.apache.org/docs/r2.8.5/hadoop-project-dist/hadoop-common/NativeLibraries.html]
>  or popular native compression codecs (e.g. libsnappy).
> If one tries to read files using IO classes that assume native library 
> support (e.g. org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat) on 
> GCP Dataflow, the following error is thrown when the file is Snappy-compressed:
>  
> {code}
> java.io.IOException: Failed to start reading from source: 
> org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource@311fa603
> ...
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
> java.lang.RuntimeException: native snappy library not available: this version 
> of libhadoop was built without snappy support.
>   
> {code}
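>
> For reference, a minimal sketch of the kind of read that triggers this (the 
> snapshot name and restore directory below are placeholders, and key/value 
> coders/translations are omitted for brevity):
> {code}
> import org.apache.beam.sdk.Pipeline;
> import org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.client.Result;
> import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
> import org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat;
> import org.apache.hadoop.mapreduce.InputFormat;
> import org.apache.hadoop.mapreduce.Job;
>
> // Point TableSnapshotInputFormat at an exported HBase snapshot.
> Configuration conf = new Configuration();
> Job job = Job.getInstance(conf);
> TableSnapshotInputFormat.setInput(job, "my-snapshot", new Path("gs://my-bucket/restore"));
>
> // Keys that HadoopFormatIO expects in the configuration.
> Configuration hadoopConf = job.getConfiguration();
> hadoopConf.setClass("mapreduce.job.inputformat.class", TableSnapshotInputFormat.class, InputFormat.class);
> hadoopConf.setClass("key.class", ImmutableBytesWritable.class, Object.class);
> hadoopConf.setClass("value.class", Result.class, Object.class);
>
> Pipeline p = Pipeline.create();
> // Fails on the Dataflow worker for Snappy-compressed snapshot files because
> // libhadoop/libsnappy are not present on the worker image.
> p.apply(HadoopFormatIO.<ImmutableBytesWritable, Result>read().withConfiguration(hadoopConf));
> {code}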
> There is very little one can do about this from within the pipeline code 
> short of rewriting the IO class, so it might make sense to ship these native 
> libraries with the workers for the Hadoop IOs.
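>
> For what it's worth, a pipeline can at least log whether the worker's Hadoop 
> build loaded libhadoop at all (NativeLibCheckFn below is a hypothetical 
> diagnostic DoFn, not a fix):
> {code}
> import org.apache.beam.sdk.transforms.DoFn;
> import org.apache.hadoop.util.NativeCodeLoader;
> import org.slf4j.Logger;
> import org.slf4j.LoggerFactory;
>
> // Diagnostic only: reports on each worker whether the native Hadoop library
> // was loaded. It does not make the Snappy codec available.
> class NativeLibCheckFn extends DoFn<String, String> {
>   private static final Logger LOG = LoggerFactory.getLogger(NativeLibCheckFn.class);
>
>   @Setup
>   public void setup() {
>     LOG.info("libhadoop loaded on this worker: {}", NativeCodeLoader.isNativeCodeLoaded());
>   }
>
>   @ProcessElement
>   public void processElement(@Element String element, OutputReceiver<String> out) {
>     out.output(element);
>   }
> }
> {code}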



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
