Hi,
I am trying to use S3 as the artifacts directory so that I don't have to worry
about a shared volume between the Flink job manager and task managers.

Currently I am passing --artifacts_dir in the pipeline options, for
example:

```
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

pipeline_args = [
    "--streaming",
    "--runner=PortableRunner",
    "--environment_type=PROCESS",
    '--environment_config={"command":"/opt/apache/beam/boot"}',
    "--artifacts_dir=s3a://my-bucket/artifacts",
]
with beam.Pipeline(options=PipelineOptions(pipeline_args)) as pipeline:
    ...  # build and run my pipeline here
```
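
For what it's worth, here is a minimal sketch of how I double-check that the
flag is actually parsed by the SDK before the pipeline starts (the bucket name
is just a placeholder):

```
from apache_beam.options.pipeline_options import PipelineOptions

pipeline_args = [
    "--streaming",
    "--runner=PortableRunner",
    "--environment_type=PROCESS",
    '--environment_config={"command":"/opt/apache/beam/boot"}',
    "--artifacts_dir=s3a://my-bucket/artifacts",
]

options = PipelineOptions(pipeline_args)
# get_all_options() returns a dict of every parsed option, so I can confirm
# that artifacts_dir made it through the flag parsing.
print(options.get_all_options(retain_unknown_options=True).get("artifacts_dir"))
```

I pass `retain_unknown_options=True` just in case `artifacts_dir` isn't
registered as a known option in my Beam version.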
The job runs fine, but I couldn't find any artifacts in my S3 bucket; it looks
like they are still being staged in the default `/tmp/beam-artifact-staging`.
How can I configure the pipeline to use S3 as the artifacts directory?  Thanks!
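
In case it helps, this is roughly how I check the bucket for staged files
(just a sketch; the bucket name and prefix are placeholders, and it assumes
boto3 with AWS credentials already configured in my environment):

```
import boto3

# List anything under the artifacts prefix to see whether the runner staged
# files there (bucket/prefix are placeholders; credentials come from the
# usual AWS environment/config).
s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="my-bucket", Prefix="artifacts/")
for obj in resp.get("Contents", []):
    print(obj["Key"])
print("total objects:", resp.get("KeyCount", 0))
```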



Sincerely,
Lydian Lee
