Hi Tao,

"Limited spark options”, that you mentioned, are Beam's application arguments 
and if you run your job via "spark-submit" you should still be able to 
configure Spark application via normal spark-submit “--conf key=value” CLI 
option. 
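
For example, something along these lines (the class and jar names below are 
just placeholders for your app) should pass the S3A setting through to Spark 
while the Beam pipeline options remain application arguments:

  spark-submit \
    --class org.example.MyBeamPipeline \
    --master yarn \
    --conf spark.hadoop.fs.s3a.canned.acl=BucketOwnerFullControl \
    my-beam-pipeline-bundled.jar \
    --runner=SparkRunner
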
Doesn’t it work for you?

—
Alexey

> On 10 Jun 2021, at 01:29, Tao Li <t...@zillow.com> wrote:
> 
> Hi Beam community,
>  
> We are trying to specify a spark config 
> “spark.hadoop.fs.s3a.canned.acl=BucketOwnerFullControl” in the spark-submit 
> command for a beam app. I only see limited spark options supported according 
> to this doc: https://beam.apache.org/documentation/runners/spark/ 
>  
> How can we specify an arbitrary spark config? Please advise. Thanks!
