Control who can submit Beam jobs

2023-11-30 Thread Поротиков Станислав Вячеславович via user
Hello!
Is there any way to control who can submit jobs to a Flink cluster? We have
multiple teams, and I am looking for guidance on how we can use Beam+Flink
safely.

Best regards,
Stanislav Porotikov



Re: Control who can submit Beam jobs

2023-11-30 Thread Alexey Romanenko
No, since Beam is not a runtime. In the end, it will create a Flink job and run
it on a Flink cluster, so access control should be the responsibility of your
Flink cluster.
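
For example, one way to enforce this at the Flink level (a minimal sketch, not
something discussed in this thread; paths and passwords below are placeholders)
is to enable mutual TLS authentication on Flink's REST endpoint in
flink-conf.yaml:

    security.ssl.rest.enabled: true
    security.ssl.rest.authentication-enabled: true
    security.ssl.rest.keystore: /etc/flink/ssl/rest.keystore
    security.ssl.rest.keystore-password: keystore_password
    security.ssl.rest.key-password: key_password
    security.ssl.rest.truststore: /etc/flink/ssl/rest.truststore
    security.ssl.rest.truststore-password: truststore_password

With that in place, only clients presenting a certificate trusted by the REST
truststore can reach the job-submission endpoint; an authenticating reverse
proxy in front of the REST API is another common approach.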

—
Alexey

> On 30 Nov 2023, at 10:14, Поротиков Станислав Вячеславович via user 
>  wrote:
> 
> Hello!
> Is there any way to control who can submit jobs to a Flink cluster? We have 
> multiple teams, and I am looking for guidance on how we can use Beam+Flink safely.
>  
> Best regards,
> Stanislav Porotikov



[Question] SnowflakeIO Connector & S3 Bucket

2023-11-30 Thread Xinmin
Hello,

We are in the process of evaluating the combination of SnowflakeIO and S3
buckets. As per the Snowflake document below:

https://docs.snowflake.com/en/user-guide/data-load-s3-config

There are three options to configure access to the S3 bucket: 1) storage
integration; 2) IAM role; 3) IAM user. The second option is deprecated, so
we ruled it out.

As per the SnowflakeIO document below:

https://beam.apache.org/documentation/io/built-in/snowflake/#using-snowflakeio-with-aws-s3

Both the access key and the secret key are mentioned in the code snippet,
which means the IAM user option is used. That lets us rule the first option
out as well, since the whole purpose of a storage integration is to avoid
supplying security credentials. I also tested with a storage integration and
it failed; the error said both keys need to be provided.
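
For reference, my understanding is that the S3 part of that Beam example boils
down to roughly the sketch below (class and method names are from the Beam AWS
and Snowflake modules as I read them; the keys are placeholders):

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import org.apache.beam.sdk.io.aws.options.S3Options;
    import org.apache.beam.sdk.io.snowflake.SnowflakePipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SnowflakeS3CredentialsSketch {
      // Combined options: Snowflake connection settings plus S3 access.
      public interface Options extends SnowflakePipelineOptions, S3Options {}

      public static void main(String[] args) {
        Options options = PipelineOptionsFactory.fromArgs(args).as(Options.class);
        // This is the step my question is about: the staging bucket is reached
        // with an IAM user's access key / secret key rather than through a
        // storage integration.
        options.setAwsCredentialsProvider(
            new AWSStaticCredentialsProvider(
                new BasicAWSCredentials("MY_ACCESS_KEY", "MY_SECRET_KEY")));
        // ... build and run the pipeline with these options ...
      }
    }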

Can you please confirm that only the IAM user option is supported by the
SnowflakeIO connector so far? If so, is there any plan to support storage
integration?
Thanks.


Regards,
Xinmin


Re: [Question] Does the SnowflakeIO connector support AWS and Azure?

2023-11-30 Thread Xinmin
Hello Sachin,

You said that you were able to configure SnowflakeIO with an S3 bucket. Can
you please share the steps you used to configure and test it? I would really
appreciate it. Thanks.


Regards,
Xinmin


On Thu, Oct 26, 2023 at 9:42 AM mybeam  wrote:

>  Hi Sachin,
>
> Thanks for your information.
>
> On Tue, Oct 24, 2023 at 11:02 PM Sachin Mittal  wrote:
>
>> I think AWS is supported; I was able to configure SnowflakeIO with S3
>> buckets.
>>
>>
>>
>> On Wed, 25 Oct 2023 at 9:05 AM, mybeam  wrote:
>>
>>> Hello,
>>>
>>> As per the javadoc below, it looks like only GCP buckets are currently
>>> supported by the SnowflakeIO connector. Can you please confirm whether AWS
>>> and Azure are supported now? Thanks.
>>>
>>>
>>> https://beam.apache.org/releases/javadoc/2.50.0/index.html?org/apache/beam/sdk/io/snowflake/SnowflakeIO.html
>>>
>>
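
For context, a read staged through S3 along the lines Sachin describes would
presumably look something like the sketch below (method names as I read them
from the SnowflakeIO javadoc; server, table, bucket, and integration names are
placeholders):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.snowflake.SnowflakeIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class SnowflakeS3ReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        SnowflakeIO.DataSourceConfiguration dataSource =
            SnowflakeIO.DataSourceConfiguration.create()
                .withUsernamePasswordAuth("MY_USER", "MY_PASSWORD")     // placeholders
                .withServerName("myaccount.snowflakecomputing.com")     // placeholder
                .withDatabase("MY_DB")
                .withSchema("PUBLIC");

        // NOTE: the pipeline options also need S3 credentials configured
        // (S3Options), as described in the Beam docs for SnowflakeIO with AWS S3.
        PCollection<String> rows =
            pipeline.apply(
                SnowflakeIO.<String>read()
                    .withDataSourceConfiguration(dataSource)
                    .fromTable("MY_TABLE")
                    // S3 staging path plus the Snowflake storage integration name.
                    .withStagingBucketName("s3://my-staging-bucket/data/")
                    .withStorageIntegrationName("MY_STORAGE_INTEGRATION")
                    .withCsvMapper(parts -> String.join(",", parts))
                    .withCoder(StringUtf8Coder.of()));

        pipeline.run().waitUntilFinish();
      }
    }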