[jira] [Commented] (FLINK-12522) Error While Initializing S3A FileSystem

2019-05-15 Thread Ken Krugler (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-12522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16840806#comment-16840806
 ] 

Ken Krugler commented on FLINK-12522:
-

Hi Manish Bellani - please post to the Flink user list first, as that's where 
help is provided. Then, if one of the community members identifies it as a bug, 
open a Jira issue. Thanks, Ken

> Error While Initializing S3A FileSystem
> ---
>
> Key: FLINK-12522
> URL: https://issues.apache.org/jira/browse/FLINK-12522
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / FileSystem
>Affects Versions: 1.7.2
> Environment: Kubernetes
> Docker
> Debian
> DockerImage: flink:1.7.2-hadoop28-scala_2.11
> Java 1.8
> Hadoop Version: 2.8.5
>Reporter: Manish Bellani
>Priority: Major
>
> Hey friends,
> Thanks for all the work you have been doing on Flink. I have been trying to 
> use BucketingSink (backed by S3AFileSystem) to write data to S3, and I'm 
> running into some issues (which I suspect could be dependency/packaging 
> related) that I'll try to describe here.
> The data pipeline is quite simple:
> {noformat}
> Kafka -> KafkaConsumer (Source) -> BucketingSink (S3AFileSystem) -> 
> S3{noformat}
> *Environment:* 
> {noformat}
> Kubernetes
> Debian
> DockerImage: flink:1.7.2-hadoop28-scala_2.11
> Java 1.8
> Hadoop Version: 2.8.5{noformat}
> I followed this dependency section: 
> [https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/aws.html#flink-for-hadoop-27]
>  to place the dependencies under /opt/flink/lib (with the exception that the 
> Hadoop version and its dependencies that I pull in are different).
> Here are the dependencies I'm pulling in (excerpt from my Dockerfile):
> {noformat}
> RUN cp /opt/flink/opt/flink-s3-fs-hadoop-1.7.2.jar /opt/flink/lib/flink-s3-fs-hadoop-1.7.2.jar
> RUN wget -O /opt/flink/lib/hadoop-aws-2.8.5.jar https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.8.5/hadoop-aws-2.8.5.jar
> RUN wget -O /opt/flink/lib/aws-java-sdk-s3-1.10.6.jar http://central.maven.org/maven2/com/amazonaws/aws-java-sdk-s3/1.10.6/aws-java-sdk-s3-1.10.6.jar
> # Transitive dependency of aws-java-sdk-s3
> RUN wget -O /opt/flink/lib/aws-java-sdk-core-1.10.6.jar http://central.maven.org/maven2/com/amazonaws/aws-java-sdk-core/1.10.6/aws-java-sdk-core-1.10.6.jar
> RUN wget -O /opt/flink/lib/aws-java-sdk-kms-1.10.6.jar http://central.maven.org/maven2/com/amazonaws/aws-java-sdk-kms/1.10.6/aws-java-sdk-kms-1.10.6.jar
> RUN wget -O /opt/flink/lib/jackson-annotations-2.5.3.jar http://central.maven.org/maven2/com/fasterxml/jackson/core/jackson-annotations/2.5.3/jackson-annotations-2.5.3.jar
> RUN wget -O /opt/flink/lib/jackson-core-2.5.3.jar http://central.maven.org/maven2/com/fasterxml/jackson/core/jackson-core/2.5.3/jackson-core-2.5.3.jar
> RUN wget -O /opt/flink/lib/jackson-databind-2.5.3.jar http://central.maven.org/maven2/com/fasterxml/jackson/core/jackson-databind/2.5.3/jackson-databind-2.5.3.jar
> RUN wget -O /opt/flink/lib/joda-time-2.8.1.jar http://central.maven.org/maven2/joda-time/joda-time/2.8.1/joda-time-2.8.1.jar
> RUN wget -O /opt/flink/lib/httpcore-4.3.3.jar http://central.maven.org/maven2/org/apache/httpcomponents/httpcore/4.3.3/httpcore-4.3.3.jar
> RUN wget -O /opt/flink/lib/httpclient-4.3.6.jar http://central.maven.org/maven2/org/apache/httpcomponents/httpclient/4.3.6/httpclient-4.3.6.jar
> {noformat}
>
> But when I submit the job, it throws this error during initialization of BucketingSink/S3AFileSystem:
> {noformat}
> java.beans.IntrospectionException: bad write method arg count: public final void org.apache.flink.fs.shaded.hadoop3.org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
>     at java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:657)
>     at java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:327)
>     at java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:139)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
>     at org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.ja
> {noformat}
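For context, the top of that stack trace can be reproduced with the JDK alone: JavaBeans write methods must take exactly one parameter, and commons-configuration2's two-argument setProperty(String, Object) violates that rule, so java.beans.PropertyDescriptor rejects it. A minimal sketch (the FluentConfig class below is an illustrative stand-in for AbstractConfiguration, not the real class):

```java
import java.beans.IntrospectionException;
import java.beans.PropertyDescriptor;

public class BadWriteMethodDemo {

    // Stand-in for commons-configuration2's AbstractConfiguration: its
    // setProperty takes TWO arguments, so it is not a valid JavaBeans setter.
    public static class FluentConfig {
        public final void setProperty(String key, Object value) {
            // no-op; only the method signature matters for introspection
        }
    }

    public static void main(String[] args) throws NoSuchMethodException {
        try {
            // JavaBeans requires a write method with exactly one parameter;
            // passing a two-arg method fails in findPropertyType, matching
            // the first frames of the stack trace above.
            new PropertyDescriptor("property", null,
                    FluentConfig.class.getMethod("setProperty", String.class, Object.class));
        } catch (IntrospectionException e) {
            // message begins with "bad write method arg count: ..."
            System.out.println(e.getMessage());
        }
    }
}
```

Commons-beanutils' FluentPropertyBeanIntrospector appears in the trace because it probes such fluent setters during introspection; it may catch and merely log this exception, which would explain why the message looks alarming but turns out to be harmless, as the reporter later found.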

[jira] [Commented] (FLINK-12522) Error While Initializing S3A FileSystem

2019-05-15 Thread Manish Bellani (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-12522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16840816#comment-16840816
 ] 

Manish Bellani commented on FLINK-12522:


Thanks Ken, will do.


[jira] [Commented] (FLINK-12522) Error While Initializing S3A FileSystem

2019-05-16 Thread Manish Bellani (JIRA)


[ 
https://issues.apache.org/jira/browse/FLINK-12522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16841321#comment-16841321
 ] 

Manish Bellani commented on FLINK-12522:


This turned out to be a non-issue, but the error is definitely misleading; more 
context here: 
[http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Error-While-Initializing-S3A-FileSystem-td27878.html]. 
Going to close this issue.
