Bowen:
The picture didn't come through.

Can you pastebin the contents of your /lib dir?
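Something like this should capture it; a minimal sketch, assuming a standalone setup where FLINK_HOME points at the extracted flink-1.3.0 distribution (the variable is an assumption about your layout):

```
# FLINK_HOME is assumed to point at the extracted flink-1.3.0 directory
ls -l "$FLINK_HOME/lib"
```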

Cheers

On Mon, Jun 19, 2017 at 11:22 PM, Bowen Li <bowen...@offerupnow.com> wrote:

> Hi Gordon,
>     I double-checked that I'm not using any httpclient/httpcore or
> aws-java-sdk-xxx jars in my application.
>
>     The only thing I did with aws-java-sdk was to put aws-java-sdk-1.7.4.jar
> into /lib, as described in https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#flink-for-hadoop-27
> (a sketch of that step follows below). Here's the screenshot of my /lib dir.
>    [image: Inline image 1]
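> For reference, a minimal sketch of that step, assuming the jar was downloaded to the
> current directory and FLINK_HOME points at the extracted distribution (both names are assumptions):
>
> ```
> # Copy the (unshaded) AWS SDK jar into Flink's lib directory, per the AWS setup docs
> cp aws-java-sdk-1.7.4.jar "$FLINK_HOME/lib/"
> ```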
>
>     Could the root cause be that the shaded aws-java-sdk in Flink is different
> from the shaded aws-java-sdk in flink-kinesis-connector?
>
> Thanks!
>
> On Mon, Jun 19, 2017 at 10:26 PM, Tzu-Li (Gordon) Tai <tzuli...@apache.org> wrote:
>
>> Hi Bowen,
>>
>> Thanks for the info. I checked the 1.3.0 release jars, and they do not
>> have unshaded httpcomponent dependencies, so that shouldn’t be the problem.
>>
>> Looking back at the stack trace you posted, this seems to be a different problem.
>> The clash appears to be with the aws-java-sdk version, not the httpcomponent
>> dependency.
>> The “INSTANCE” field actually does exist in the aws-java-sdk version that
>> the Kinesis connector is using.
>>
>> Could it be that you have other conflicting aws-java-sdk versions in your
>> jar?
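>> One way to check that, sketched under the assumption of a Maven-built fat jar (the jar name below is hypothetical):
>>
>> ```
>> # List every com.amazonaws artifact the application build resolves
>> mvn dependency:tree -Dincludes=com.amazonaws
>>
>> # Look for AWS SDK classes bundled into the fat jar itself
>> jar tf my-flink-app.jar | grep '^com/amazonaws' | head
>> ```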
>>
>> Cheers,
>> Gordon
>>
>> On 20 June 2017 at 12:55:17 PM, Bowen Li (bowen...@offerupnow.com) wrote:
>>
>> Hi Gordon,
>> Here's what I use:
>>
>> - Flink: I didn't build Flink myself. I downloaded
>> http://apache.mirrors.lucidnetworks.net/flink/flink-1.3.0/flink-1.3.0-bin-hadoop27-scala_2.11.tgz
>> from https://flink.apache.org/downloads.html (Hadoop® 2.7, Scala 2.11).
>> - flink-kinesis-connector: I built flink-connector-kinesis_2.11-1.3.0.jar
>> myself, from the source code downloaded at the *#Source* section of
>> https://flink.apache.org/downloads.html (a sketch of the build follows below).
>> - mvn -v: Apache Maven 3.2.5
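>> A sketch of that connector build, assuming it is run from the Flink 1.3.0 source root
>> with the -Pinclude-kinesis profile described in the Kinesis connector docs (adjust to your setup):
>>
>> ```
>> # The Kinesis connector is only built when the include-kinesis profile is active
>> mvn clean install -Pinclude-kinesis -DskipTests
>> ```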
>>
>>
>> In short, I didn't build Flink. Most likely the dependencies in
>> either flink-dist or flink-kinesis-connector are not shaded properly?
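>> One way to inspect that, assuming the jar names from the 1.3.0 binary distribution and the connector build (paths are assumptions):
>>
>> ```
>> # List any org.apache.http classes bundled, unrelocated, in each jar
>> unzip -l flink-dist_2.11-1.3.0.jar | grep 'org/apache/http' | head
>> unzip -l flink-connector-kinesis_2.11-1.3.0.jar | grep 'org/apache/http' | head
>> ```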
>>
>> Thanks!
>> Bowen
>>
>> On Mon, Jun 19, 2017 at 9:28 PM, Tzu-Li (Gordon) Tai <tzuli...@apache.org> wrote:
>>
>> > Hi,
>> >
>> > We’ve seen this issue before [1]. The usual reason is that the
>> > httpcomponent dependencies weren’t properly shaded in the flink-dist jar.
>> > Having them properly shaded should solve the issue.
>> >
>> > cc Bowen:
>> > Are you building Flink yourself? If yes, what Maven version are you using?
>> > If you’re using 3.3.x+, after the first build under flink/, make sure to go
>> > to flink-dist/ and build a second time for the dependencies to be properly
>> > shaded.
>> > Alternatively, Maven 3.0.x through 3.2.x is the recommended range, as 3.3.x
>> > has dependency shading issues.
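>> > A minimal sketch of that two-step build, assuming a Flink 1.3.0 source checkout (flags may vary with your environment):
>> >
>> > ```
>> > # First pass: build everything from the source root
>> > cd flink/
>> > mvn clean install -DskipTests
>> >
>> > # Second pass (needed with Maven 3.3.x): rebuild flink-dist so that the
>> > # dependencies end up properly shaded into the dist jar
>> > cd flink-dist/
>> > mvn clean install -DskipTests
>> > ```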
>> >
>> > If you’re not building Flink yourself, the cause could be that the Flink
>> > 1.3.0 flink-dist jar wasn’t shaded properly; that may need a double check.
>> >
>> > Best,
>> > Gordon
>> >
>> > [1] https://issues.apache.org/jira/browse/FLINK-5013
>> >
>> > On 20 June 2017 at 12:14:27 PM, Ted Yu (yuzhih...@gmail.com) wrote:
>> >
>> > I logged FLINK-6951, referencing this thread.
>> >
>> > We can continue discussion there.
>> >
>> > Thanks
>> >
>> > On Mon, Jun 19, 2017 at 9:06 PM, Bowen Li <bowen...@offerupnow.com> wrote:
>> >
>> > > Thanks, Ted! Wow, this is unexpected. https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html
>> > > is out of date.
>> > >
>> > > I bet anyone using Kinesis with Flink will run into this issue. I can try
>> > > to build Flink myself and resolve this problem. But for a feasible
>> > > permanent solution for all flink-connector-kinesis users: shall we
>> > > downgrade the aws-java-sdk-kinesis version in flink-connector-kinesis, or
>> > > shall we upgrade the httpcomponents version in Flink?
>> > >
>> > > Bowen
>> > >
>> > >
>> > > On Mon, Jun 19, 2017 at 7:02 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>> > >
>> > > > Here are the relevant dependencies in the flink-connector-kinesis module:
>> > > >
>> > > > [INFO] +- com.amazonaws:aws-java-sdk-kinesis:jar:1.10.71:compile
>> > > > [INFO] | \- com.amazonaws:aws-java-sdk-core:jar:1.10.71:compile
>> > > > [INFO] | +- org.apache.httpcomponents:httpclient:jar:4.3.6:compile
>> > > > [INFO] | +- org.apache.httpcomponents:httpcore:jar:4.3.3:compile
>> > > >
>> > > > Checking the dependency tree of Flink, the highest httpcomponents version there is 4.2.x.
>> > > >
>> > > > You can try building Flink with a dependency on 4.3.y of httpclient /
>> > > > httpcore (a rough sketch follows below).
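>> > > > A rough sketch of how to compare and experiment, run from the Flink source root;
>> > > > the version properties in the second command are assumptions about the build and
>> > > > should be checked against the root pom first:
>> > > >
>> > > > ```
>> > > > # Show which httpclient/httpcore versions each module resolves to
>> > > > mvn dependency:tree -Dincludes=org.apache.httpcomponents
>> > > >
>> > > > # Hypothetical override: only works if the root pom exposes these properties
>> > > > mvn clean install -DskipTests -Dhttpclient.version=4.3.6 -Dhttpcore.version=4.3.3
>> > > > ```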
>> > > >
>> > > > FYI
>> > > >
>> > > >
>> > > >
>> > > > On Mon, Jun 19, 2017 at 4:52 PM, Bowen Li <bowen...@offerupnow.com> wrote:
>> > > >
>> > > > > Hi guys,
>> > > > > I'm trying to enable Flink's checkpointing in our Flink app. I got the
>> > > > > following Apache HTTP jar compatibility error and cannot figure out
>> > > > > how to resolve it.
>> > > > >
>> > > > > Here's the stacktrace:
>> > > > >
>> > > > > ```
>> > > > > 2017-06-19 15:07:39,828 INFO  org.apache.flink.runtime.taskmanager.Task - Source: Custom Source -> (Timestamps/Watermarks, Filter -> Map, Filter -> Map, Filter -> Map) (1/1) (37ab9429deda28e31fa0ed0ed1568654) switched from RUNNING to FAILED.
>> > > > > java.lang.NoSuchFieldError: INSTANCE
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.conn.SdkConnectionKeepAliveStrategy.getKeepAliveDuration(SdkConnectionKeepAliveStrategy.java:48)
>> > > > >     at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:535)
>> > > > >     at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
>> > > > >     at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:837)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:607)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:376)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:338)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:287)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.doInvoke(AmazonKinesisClient.java:1940)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.invoke(AmazonKinesisClient.java:1910)
>> > > > >     at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.describeStream(AmazonKinesisClient.java:656)
>> > > > >     at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.describeStream(KinesisProxy.java:361)
>> > > > >     at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardsOfStream(KinesisProxy.java:323)
>> > > > >     at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardList(KinesisProxy.java:231)
>> > > > >     at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.discoverNewShardsToSubscribe(KinesisDataFetcher.java:430)
>> > > > >     at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:202)
>> > > > >     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:87)
>> > > > >     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:55)
>> > > > >     at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:95)
>> > > > >     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:262)
>> > > > >     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:702)
>> > > > >     at java.lang.Thread.run(Thread.java:745)
>> > > > > ```
>> > > > >
>> > > > > Here's my Flink environment setup:
>> > > > >
>> > > > > - I'm using flink-connector-kinesis_2.11-1.3.0.jar that I built from
>> > > > > the 1.3.0 source code.
>> > > > > - I followed https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#flink-for-hadoop-27
>> > > > > and added all the necessary dependency jars.
>> > > > > - My application doesn't use Apache httpclient/httpcore (see the check
>> > > > > sketched after this list).
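>> > > > > A quick way to double-check that claim, sketched under the assumption of a
>> > > > > Maven-built fat jar (the jar name is hypothetical):
>> > > > >
>> > > > > ```
>> > > > > # Verify no httpclient/httpcore is pulled in transitively
>> > > > > mvn dependency:tree -Dincludes=org.apache.httpcomponents
>> > > > >
>> > > > > # Confirm no org.apache.http classes are bundled into the job jar
>> > > > > jar tf my-flink-app.jar | grep '^org/apache/http' | head
>> > > > > ```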
>> > > > >
>> > > > >
>> > > > > Has anyone experienced a similar incompatibility issue?
>> > > > >
>> > > > > Thanks!
>> > > > > Bowen
>> > > > >
>> > > >
>> > >
>> >
>>
>
>
