Is copying flink-hadoop-compatibility jar to FLINK_HOME/lib the only way to make it work?

2019-04-13 Thread Morven Huang
Hi, I’m using Flink 1.5.6 and Hadoop 2.7.1. My requirement is to read an HDFS sequence file (SequenceFileInputFormat), then write it back to HDFS (SequenceFileAsBinaryOutputFormat with compression). The code below won’t work until I copy the flink-hadoop-compatibility jar to FLINK_HOME/lib. I find …
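The setup described above can be sketched with Flink's Hadoop-compatibility wrappers. This is a minimal illustration, not the poster's actual code: the class name, key/value types (BytesWritable), and HDFS paths are assumptions for the example.

```java
// Sketch only: paths, class name, and Writable types are illustrative.
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.hadoop.mapred.HadoopOutputFormat;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.hadoopcompatibility.HadoopInputs;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.SequenceFileAsBinaryOutputFormat;
import org.apache.hadoop.mapred.SequenceFileInputFormat;

public class SequenceFileCopyJob {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Read the sequence file through the Hadoop-compatibility wrapper.
        DataSet<Tuple2<BytesWritable, BytesWritable>> input = env.createInput(
                HadoopInputs.readHadoopFile(
                        new SequenceFileInputFormat<BytesWritable, BytesWritable>(),
                        BytesWritable.class, BytesWritable.class,
                        "hdfs:///path/to/input"));

        // Configure block compression on the Hadoop output format.
        JobConf jobConf = new JobConf();
        FileOutputFormat.setCompressOutput(jobConf, true);
        SequenceFileAsBinaryOutputFormat.setOutputCompressionType(
                jobConf, SequenceFile.CompressionType.BLOCK);
        FileOutputFormat.setOutputPath(jobConf, new Path("hdfs:///path/to/output"));

        // Write back as a binary sequence file.
        input.output(new HadoopOutputFormat<>(
                new SequenceFileAsBinaryOutputFormat(), jobConf));

        env.execute("sequence-file-copy");
    }
}
```

Running this requires a Flink cluster with HDFS access, so it is shown here for structure only.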

Re: Is copying flink-hadoop-compatibility jar to FLINK_HOME/lib the only way to make it work?

2019-04-10 Thread Morven Huang
Hi Fabian, Packaging that dependency into a fat jar doesn't help. Here is the pom.xml I use; could you please take a look and see whether there are problems? xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4 …

Re: Is copying flink-hadoop-compatibility jar to FLINK_HOME/lib the only way to make it work?

2019-04-10 Thread Fabian Hueske
Hi, Packaging the flink-hadoop-compatibility dependency with your code into a "fat" job jar should work as well. Best, Fabian. On Wed., Apr. 10, 2019 at 15:08, Morven Huang < morven.hu...@gmail.com> wrote: > Hi, > > > > I’m using Flink 1.5.6 and Hadoop 2.7.1. > > > > *My requirement is to re …
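One common way to build the fat jar Fabian suggests is the Maven Shade plugin, with flink-hadoop-compatibility left in the default compile scope (not provided) so it actually ends up inside the jar. This build fragment is an illustration only; the plugin version is an assumption, not something from the thread.

```xml
<!-- Illustrative build section: plugin version is an assumption. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <filters>
              <filter>
                <!-- Drop signature files that can break shaded jars. -->
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

Worth noting as a caveat: in some Flink releases of this era, type information for Hadoop Writable types was resolved reflectively through the distribution's own classpath, which is why users reported still needing the jar in FLINK_HOME/lib even with a correct fat jar.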

Is copying flink-hadoop-compatibility jar to FLINK_HOME/lib the only way to make it work?

2019-04-10 Thread Morven Huang
Hi, I’m using Flink 1.5.6 and Hadoop 2.7.1. *My requirement is to read an HDFS sequence file (SequenceFileInputFormat), then write it back to HDFS (SequenceFileAsBinaryOutputFormat with compression).* The code below won’t work until I copy the flink-hadoop-compatibility jar to FLINK_HOME/lib. I f…