Hi Vijay,
have you tried yarn.ship-files [1] or yarn.ship-archives [2]? Maybe that's
what you're looking for...
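
A minimal sketch of how these could be used from the CLI (untested; the
paths, file names, and deployment target are placeholders, assuming the
Flink 1.13 CLI on YARN):

  flink run -t yarn-per-job \
    -Dyarn.ship-files="/path/to/app.properties;/path/to/log4j.properties" \
    -Dyarn.ship-archives="/path/to/extra-configs.tar.gz" \
    /path/to/application.jar

Both options take semicolon-separated lists; shipped files are localized
into the working directory of each YARN container, and archives are
additionally unpacked there.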

Best,
Matthias

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/config/#yarn-ship-files
[2]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/deployment/config/#yarn-ship-archives

On Tue, May 25, 2021 at 5:56 PM Vijayendra Yadav <contact....@gmail.com>
wrote:

> Hi Piotr,
>
> I have been following the same process you described so far; now I am
> migrating the deployment to AWS CDK and AWS Step Functions, as a kind of
> CI/CD process.
> I added a step that downloads the jar and configs (1, 2, 3 and 4) from S3
> using command-runner.jar (an EMR step submitted from Step Functions); it
> placed them on one of the master nodes (out of 3). In the next step, when
> I launched the Flink job, it could not find the build because the job was
> launched on a different YARN node.
>
> I was hoping that, just like *Apache Spark*, where any files passed via
> *--files* are shipped to YARN (from S3 to the YARN working directory),
> Flink would have a similar option.
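>
> For reference, the Spark pattern I have in mind is roughly the following
> (a sketch; the bucket and file names are placeholders):
>
>   spark-submit --master yarn \
>     --files s3://my-bucket/app.properties,s3://my-bucket/log4j.properties \
>     app.jar
>
> where the listed files get localized into the working directory of each
> YARN container.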
>
> Thanks,
> Vijay
>
>
> On Tue, May 25, 2021 at 12:50 AM Piotr Nowojski <pnowoj...@apache.org>
> wrote:
>
>> Hi Vijay,
>>
>> I'm not sure if I understand your question correctly. You have the jar
>> and configs (1, 2, 3 and 4) on S3 and you want to start a Flink job using
>> those? Can you simply download them (the whole directory containing them)
>> to the machine that will be starting the Flink job?
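>>
>> For instance (a rough sketch; the bucket, local path, and entry class are
>> placeholders):
>>
>>   aws s3 cp s3://my-bucket/flink-job/ /home/hadoop/flink-job/ --recursive
>>   flink run -c com.example.StreamingJob \
>>     /home/hadoop/flink-job/application.jar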
>>
>> Best, Piotrek
>>
>> On Tue, May 25, 2021 at 7:50 AM Vijayendra Yadav <contact....@gmail.com>
>> wrote:
>>
>>> Hi Team,
>>>
>>> I am trying to find a way to ship files from AWS S3 for a Flink
>>> streaming job running on AWS EMR. What I need to ship is the following:
>>> 1) the application jar
>>> 2) the application property file
>>> 3) a custom flink-conf.yaml
>>> 4) an application-specific log4j configuration
>>>
>>> Please let me know the options.
>>>
>>> Thanks,
>>> Vijay
>>
>>
