Can’t you attach the cross-account permission directly to the Glue job role? Why the
detour via AssumeRole?
AssumeRole can make sense if you use an AWS IAM user with STS authentication,
but for cross-account access from within AWS it makes little sense, as attaching
the permissions to the Glue job role is sufficient.
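To illustrate the direct-permission approach: the bucket owner grants the Glue job role access via a bucket policy, and the role carries a matching identity policy, so no AssumeRole hop is needed. A minimal sketch below; the account IDs, role name, and bucket name are hypothetical placeholders.

```python
import json

# Hypothetical identifiers -- replace with your own account IDs, role, and bucket.
GLUE_ROLE_ARN = "arn:aws:iam::111111111111:role/my-glue-job-role"  # account A
BUCKET = "my-cross-account-bucket"                                 # owned by account B

# Bucket policy attached in account B, granting the Glue job role in
# account A direct read access -- no AssumeRole required.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": GLUE_ROLE_ARN},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}

# The Glue job role in account A also needs an identity policy allowing
# the same s3:GetObject / s3:ListBucket actions on the same ARNs.
print(json.dumps(bucket_policy, indent=2))
```

Note that both sides must agree: the bucket policy alone is not enough if the role's own policies deny or omit the S3 actions.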
Also, remove the space in rev. scode.
Sun, 22 Oct 2023 at 19:08, Sadha Chilukoori wrote:
Hi Meena,
I'm asking to clarify: are the *on* & *and* keywords optional in the join
conditions?
Please try this snippet, and see if it helps
select rev.* from rev
inner join customer c
on rev.custumer_id = c.id
inner join product p
on rev.sys = p.sys
and rev.prin = p.prin
and rev.scode = p.bcode
Hi Meena,
It's not impossible, but it's unlikely that there's a bug in Spark SQL
randomly duplicating rows. The most likely explanation is that there are more
records in the item table matching your sys/custumer_id/scode criteria
than you expect.
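The row-multiplication effect described above is standard join semantics, not a Spark quirk: each row on the left appears once per matching row on the right. A minimal illustration with sqlite3 and made-up tables (the column names here are hypothetical, not Meena's actual schema):

```python
import sqlite3

# In-memory database with a tiny rev/item pair of tables.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE rev (id INTEGER, sys TEXT)")
cur.execute("CREATE TABLE item (sys TEXT, price INTEGER)")
cur.execute("INSERT INTO rev VALUES (1, 'A')")

# Two item rows match sys = 'A', so the single rev row comes back twice.
cur.executemany("INSERT INTO item VALUES (?, ?)", [("A", 10), ("A", 20)])

rows = cur.execute(
    "SELECT rev.* FROM rev INNER JOIN item ON rev.sys = item.sys"
).fetchall()
print(rows)  # -> [(1, 'A'), (1, 'A')]: one output row per matching item row
```

Selecting columns from both sides (instead of rev.* alone) usually makes it obvious which right-hand rows caused the duplication.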
In your original query, try changing select rev.* to
Hi all,
I have a scenario where I need to assume a cross-account role to get S3 bucket
access.
The problem is that this role only allows a 1h session duration (no negotiation).
That said,
does anyone know a way to tell Spark to automatically renew the token,
or to dynamically renew the token on each n
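One generic pattern for this is a credential holder that re-fetches the token shortly before it expires, so callers never see a stale one. A pure-Python sketch below; the fetcher here is a fake stand-in for whatever actually calls sts.assume_role, and the names are all hypothetical:

```python
import time
import threading

class RefreshingToken:
    """Sketch of auto-renewal for a short-lived credential.

    `fetch` is any callable returning (token, lifetime_seconds),
    e.g. a wrapper around an STS assume-role call. The token is
    refreshed before expiry, with a safety margin.
    """

    def __init__(self, fetch, margin=300):
        self._fetch = fetch
        self._margin = margin          # refresh this many seconds early
        self._lock = threading.Lock()
        self._token = None
        self._expires_at = 0.0

    def get(self):
        with self._lock:
            if time.monotonic() >= self._expires_at - self._margin:
                token, lifetime = self._fetch()
                self._token = token
                self._expires_at = time.monotonic() + lifetime
            return self._token

# Usage with a fake fetcher that hands out 1h tokens:
calls = []
def fake_fetch():
    calls.append(1)
    return f"token-{len(calls)}", 3600

creds = RefreshingToken(fake_fetch, margin=300)
print(creds.get())  # first call fetches: token-1
print(creds.get())  # still fresh, no new fetch: token-1
```

For Spark reading from S3 specifically, it is worth checking whether the Hadoop S3A connector's assumed-role credential provider (the fs.s3a.assumed.role.* settings) already handles session renewal for you, rather than rolling your own.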