Depending on the build system used, you could check the dependency tree,
e.g. for Maven it would be `mvn dependency:tree -Dincludes=org.apache.parquet`.
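
For Gradle, something like `gradle dependencies --configuration runtimeClasspath`
should surface the same information; filter the output for org.apache.parquet.
You can also check whether your job jar bundles its own Parquet classes, e.g.
with `jar tf your-job.jar | grep parquet/column/ParquetProperties` (where
your-job.jar stands in for your actual artifact).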

Matthias

On Wed, Jun 30, 2021 at 8:40 AM Thomas Wang <w...@datability.io> wrote:

> Thanks Matthias. Could you advise how I can confirm this in my environment?
>
> Thomas
>
> On Tue, Jun 29, 2021 at 1:41 AM Matthias Pohl <matth...@ververica.com>
> wrote:
>
>> Hi Rommel, Hi Thomas,
>> Apache Parquet was bumped from 1.10.0 to 1.11.1 for Flink 1.12 in
>> FLINK-19137 [1]. The error you're seeing looks like a dependency issue:
>> you may have a version other than 1.11.1 of
>> org.apache.parquet:parquet-column:jar on your classpath.
>>
>> Matthias
>>
>> [1] https://issues.apache.org/jira/browse/FLINK-19137
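>>
>> If memory serves, getColumnIndexTruncateLength() was only added in Parquet
>> 1.11 (column indexes are new in that release), so an older parquet-column
>> on the classpath would fail exactly like this. If the dependency tree
>> confirms the conflict, pinning the version explicitly should fix it. A
>> minimal sketch for Maven (pin whatever artifacts the tree actually shows;
>> parquet-column below is just the one from the stack trace):
>>
>>   <dependencyManagement>
>>     <dependencies>
>>       <!-- force the Parquet version Flink 1.12 was built against -->
>>       <dependency>
>>         <groupId>org.apache.parquet</groupId>
>>         <artifactId>parquet-column</artifactId>
>>         <version>1.11.1</version>
>>       </dependency>
>>     </dependencies>
>>   </dependencyManagement>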
>>
>> On Wed, Jun 23, 2021 at 1:50 AM Rommel Holmes <rommelhol...@gmail.com>
>> wrote:
>>
>>> To give more information:
>>>
>>> we were running parquet-avro 1.10.0 with Flink 1.11.2 and it was working fine.
>>>
>>> Now, with Flink 1.12.1, this error message shows up.
>>>
>>> Thank you for the help.
>>>
>>> Rommel
>>>
>>>
>>> On Tue, Jun 22, 2021 at 2:41 PM Thomas Wang <w...@datability.io> wrote:
>>>
>>>> Hi,
>>>>
>>>> We recently upgraded our Flink version from 1.11.2 to 1.12.1, and one of
>>>> our jobs that used to run fine now fails with the following error. The
>>>> error doesn't seem to be related to any user code. Can someone help me
>>>> take a look?
>>>>
>>>> Thanks.
>>>>
>>>> Thomas
>>>>
>>>> java.lang.NoSuchMethodError: org.apache.parquet.column.ParquetProperties.getColumnIndexTruncateLength()I
>>>>   at org.apache.parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:282) ~[?:?]
>>>>   at org.apache.parquet.hadoop.ParquetWriter$Builder.build(ParquetWriter.java:564) ~[?:?]
>>>>   at org.apache.flink.formats.parquet.avro.ParquetAvroWriters.createAvroParquetWriter(ParquetAvroWriters.java:90) ~[?:?]
>>>>   at org.apache.flink.formats.parquet.avro.ParquetAvroWriters.lambda$forGenericRecord$abd75386$1(ParquetAvroWriters.java:65) ~[?:?]
>>>>   at org.apache.flink.formats.parquet.ParquetWriterFactory.create(ParquetWriterFactory.java:56) ~[?:?]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.BulkBucketWriter.openNew(BulkBucketWriter.java:75) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.OutputStreamBasedPartFileWriter$OutputStreamBasedBucketWriter.openNewInProgressFile(OutputStreamBasedPartFileWriter.java:90) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.BulkBucketWriter.openNewInProgressFile(BulkBucketWriter.java:36) ~[flink-table-blink_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.Bucket.rollPartFile(Bucket.java:243) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.Bucket.write(Bucket.java:220) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.Buckets.onElement(Buckets.java:305) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSinkHelper.onElement(StreamingFileSinkHelper.java:103) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink.invoke(StreamingFileSink.java:492) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.StreamSink.processElement(StreamSink.java:54) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:71) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:46) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:26) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.BroadcastingOutputCollector.collect(BroadcastingOutputCollector.java:75) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.BroadcastingOutputCollector.collect(BroadcastingOutputCollector.java:32) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:50) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:28) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:38) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:71) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:46) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:26) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:50) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:28) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.operators.TimestampsAndWatermarksOperator.processElement(TimestampsAndWatermarksOperator.java:104) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:71) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:46) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:26) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:50) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:28) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollectWithTimestamp(StreamSourceContexts.java:322) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collectWithTimestamp(StreamSourceContexts.java:426) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordsWithTimestamps(AbstractFetcher.java:365) ~[?:?]
>>>>   at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:183) ~[?:?]
>>>>   at org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.runFetchLoop(KafkaFetcher.java:142) ~[?:?]
>>>>   at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:826) ~[?:?]
>>>>   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>   at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:241) ~[flink-dist_2.12-1.12.1.jar:1.12.1]
>>>>
>>>
>>>
>>> --
>>>      Yours
>>>      Rommel
>>>
>>
>>
