Thanks a lot for the reply, Albert.
On looking at it and reading about it further, I do see that
"AdaptiveSparkPlan isFinalPlan=false" is mentioned.
Could you point me to how I can see the final plan? I couldn't find that
in any of the resources I was referring to.
On Fri, 7 Jan 2022, 07:25 Albe
I happened to encounter something similar.
It's probably because you are just running `explain` on it. When you actually
run it, you will get the final Spark plan, in which case the exchange will be
reused.
Right, this is different compared with 3.1, probably because of the upgraded
AQE.
Not sure whether this is
Just thought I'd do a quick bump and add the dev mailing list, in case
there is some insight there.
Feels like this should be categorized as a bug for Spark 3.2.0.
On Wed, Dec 29, 2021 at 5:25 PM Abdeali Kothari
wrote:
Hi,
I am using PySpark for some projects, and one of the things we are doing is
trying to find the tables/columns being used by Spark from the execution
plan.
When we upgraded to Spark 3.2, the Spark plan seems to be different from
previous versions, mainly when we are doing joins.
Below is a re