Thank You Xintong, I will look for these updates in the near future.
Regards,
Vijay
On Wed, May 26, 2021 at 6:40 PM Xintong Song wrote:
Hi Vijay,
Currently, Flink only supports shipping files from the local machine where
the job is submitted.
There are tickets [1][2][3] tracking the effort to ship files from remote
paths, e.g., http, hdfs, etc. Once that work is done, adding s3 as an
additional supported scheme should be
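Until that remote-path support lands, a common workaround is to copy the remote file to local disk first and ship the local copy. Here is a minimal sketch, reusing the s3 path from this thread; the local directory and jar name are hypothetical, the AWS CLI step is guarded so it is skipped when the CLI is absent, and the submission command is only echoed rather than executed:

```shell
# Workaround sketch: fetch the file from S3 first, then ship the local copy.
CONF_DIR=/tmp/job-conf
mkdir -p "$CONF_DIR"

# Requires the AWS CLI; guarded so the sketch degrades gracefully without it.
if command -v aws >/dev/null 2>&1; then
  aws s3 cp s3://applib/xx/xx/1.0-SNAPSHOT/application.properties "$CONF_DIR/"
fi

# yarn.ship-files accepts local paths only; it can be set per submission
# with -D on the CLI (Flink 1.13). Echoed here instead of run.
echo "flink run -t yarn-per-job -Dyarn.ship-files=$CONF_DIR/application.properties ./my-job.jar"
```

The same two-step shape works from any scheme the AWS CLI (or another client) can read, since Flink only ever sees a local path.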
Hi Pohl,
I tried to ship my property file, for example:
-yarn.ship-files s3://applib/xx/xx/1.0-SNAPSHOT/application.properties \
Error:
6:21:37.163 [main] ERROR org.apache.flink.client.cli.CliFrontend - Invalid
command line arguments.
org.apache.flink.client.cli.CliArgsException: Could not
Hi Vijay,
have you tried yarn.ship-files [1] or yarn.ship-archives [2]? Maybe that's
what you're looking for.
Best,
Matthias
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/config/#yarn-ship-files
[2]
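For reference, those two options can also be set in flink-conf.yaml instead of on the command line. A sketch with hypothetical local paths (per the linked Flink 1.13 docs, both options take lists of local files or directories):

```yaml
# flink-conf.yaml (Flink 1.13) -- hypothetical paths.
# Local files/directories to ship to the YARN cluster with each job:
yarn.ship-files: /opt/job/conf/application.properties
# Local archives to ship and unpack on the cluster:
yarn.ship-archives: /opt/job/deps.tar.gz
```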
Hi Piotr,
I have been doing the same process as you mentioned so far; now I am
migrating the deployment process to AWS CDK and AWS Step Functions, similar
to a CI/CD process.
I added a step that downloads the jar and configs (1, 2, 3 and 4) from S3
using command-runner.jar (an AWS Step); it loaded
Hi Vijay,
I'm not sure if I understand your question correctly. You have a jar and
configs (1, 2, 3 and 4) on S3 and you want to start a Flink job using
those? Can you simply download those things (the whole directory containing
them) to the machine that will be starting the Flink job?
Best, Piotrek
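Piotrek's suggestion above, as a sketch: pull everything the job needs from S3 to the submitting machine, then run from the local copies. The bucket layout, paths, and the program argument are hypothetical; the download step is guarded on the AWS CLI being present, and the run command is echoed rather than executed:

```shell
# Hypothetical layout: jar plus configs (1, 2, 3 and 4) under one S3 prefix.
JOB_DIR=/tmp/flink-job
mkdir -p "$JOB_DIR"

# Download the whole directory in one go (requires the AWS CLI).
if command -v aws >/dev/null 2>&1; then
  aws s3 cp --recursive s3://my-bucket/flink-job/ "$JOB_DIR/"
fi

# Everything is now local, so a plain submission works; the properties file
# is passed as an ordinary program argument here (application-specific).
echo "flink run $JOB_DIR/job.jar --properties $JOB_DIR/application.properties"
```

In a CI/CD pipeline (e.g., the Step Functions flow mentioned earlier in the thread), this download-then-submit pair would be the body of the deployment step.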