Not sure if this helps - this is how I invoke a SageMaker endpoint model from a Flink pipeline.
See
https://github.com/smarthi/NMT-Sagemaker-Inference/blob/master/src/main/java/de/dws/berlin/util/AwsUtil.java
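In case the link goes stale, roughly the shape of that call, sketched with the AWS SDK v1 SageMaker runtime client (class and method names below are illustrative, not the exact code from the repo):

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import com.amazonaws.services.sagemakerruntime.AmazonSageMakerRuntime;
import com.amazonaws.services.sagemakerruntime.AmazonSageMakerRuntimeClientBuilder;
import com.amazonaws.services.sagemakerruntime.model.InvokeEndpointRequest;
import com.amazonaws.services.sagemakerruntime.model.InvokeEndpointResult;

public class SageMakerInvoker {

    private final AmazonSageMakerRuntime runtime =
            AmazonSageMakerRuntimeClientBuilder.defaultClient();

    // Sends a JSON payload to the named endpoint and returns the raw response body.
    public String invoke(String endpointName, String payload) {
        InvokeEndpointRequest request = new InvokeEndpointRequest()
                .withEndpointName(endpointName)
                .withContentType("application/json")
                .withBody(ByteBuffer.wrap(payload.getBytes(StandardCharsets.UTF_8)));
        InvokeEndpointResult result = runtime.invokeEndpoint(request);
        return StandardCharsets.UTF_8.decode(result.getBody()).toString();
    }
}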
On Mon, Feb 24, 2020 at 10:08 AM David Magalhães wrote:
> Hi Robert, thanks for your reply. [...]

Check this:
https://github.com/kali786516/FlinkStreamAndSql/blob/b8bcbadaa3cb6bfdae891f10ad1205e256adbc1e/src/main/scala/com/aws/examples/dynamodb/dynStreams/FlinkDynamoDBStreams.scala#L42
Hi Robert, thanks for your reply.

GlobalConfiguration.loadConfiguration was useful to check whether a flink-conf.yaml file was on the resources path for the integration tests I'm doing. On the cluster I will use the default configuration.
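For reference, the check is roughly this (a sketch; the test class name is made up):

import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.GlobalConfiguration;

public class ConfigSmokeTest {
    public static void main(String[] args) {
        // Loads flink-conf.yaml from the configured config directory
        // (e.g. FLINK_CONF_DIR); yields an empty Configuration otherwise.
        Configuration conf = GlobalConfiguration.loadConfiguration();
        System.out.println("loaded keys: " + conf.keySet());
    }
}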
On Fri, Feb 21, 2020 at 10:58 AM Robert Metzger wrote:
There are multiple ways of passing configuration parameters to your user-defined code in Flink (a sketch of both follows below):

a) use getRuntimeContext().getUserCodeClassLoader().getResource() to load a config file from your user code jar or the classpath.

b) use getRuntimeContext().getExecutionConfig().getGlobalJobParameters() to read parameters registered at submission time via env.getConfig().setGlobalJobParameters(...).
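A condensed sketch of both options inside a rich function (the file name and parameter key are made up for illustration):

import java.io.InputStream;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import org.apache.flink.api.common.ExecutionConfig;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

public class ConfiguredMapper extends RichMapFunction<String, String> {

    private transient Properties fileProps;
    private transient Map<String, String> jobParams;

    @Override
    public void open(Configuration parameters) throws Exception {
        // a) a config file bundled in the user jar / on the classpath
        fileProps = new Properties();
        try (InputStream in = getRuntimeContext().getUserCodeClassLoader()
                .getResourceAsStream("my-app.properties")) {
            if (in != null) {
                fileProps.load(in);
            }
        }

        // b) global job parameters, registered at submission time via
        //    env.getConfig().setGlobalJobParameters(...)
        ExecutionConfig.GlobalJobParameters globals =
                getRuntimeContext().getExecutionConfig().getGlobalJobParameters();
        jobParams = globals != null ? globals.toMap() : Collections.emptyMap();
    }

    @Override
    public String map(String value) {
        return value + "/" + jobParams.getOrDefault("suffix", "default");
    }
}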
First things first: we do not intend for users to use anything in the S3 filesystem modules except the filesystems themselves, meaning that you're somewhat treading on unsupported ground here.

Nevertheless, the S3 modules contain a large variety of AWS-provided CredentialsProvider implementations that you may be able to use.
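For example, instead of hardcoding keys when building a client yourself, one of the stock providers can be plugged in (a sketch against the unshaded SDK coordinates for readability):

import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3ClientFactory {
    public static AmazonS3 create(String region) {
        // The default chain resolves credentials from env vars, system
        // properties, profile files and instance/container roles,
        // so nothing is hardcoded in the job.
        return AmazonS3ClientBuilder.standard()
                .withCredentials(new DefaultAWSCredentialsProviderChain())
                .withRegion(region)
                .build();
    }
}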
I'm using org.apache.flink.fs.s3base.shaded.com.amazonaws.client.builder.AwsClientBuilder to create an S3 client to copy and delete objects inside a TwoPhaseCommitSinkFunction.

Shouldn't there be another way to set these configurations up without hardcoding them? Something like core-site.xml or flink-conf.yaml?
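For concreteness, the kind of hardcoded wiring meant here looks roughly like this (a sketch with unshaded SDK names and placeholder values):

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class HardcodedS3Client {
    public static AmazonS3 create() {
        // Endpoint, region and credentials all baked into the code --
        // exactly what a core-site.xml / flink-conf.yaml style setup would avoid.
        return AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "http://localhost:9000", "us-east-1"))
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY")))
                .build();
    }
}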