And to be clear: yes, execution plans show exactly what Spark is doing. The
problem is that it's unclear how that relates to the actual Scala/Python
code.
On 7/21/20 15:45, Michal Sankot wrote:
Yes, the problem is that DAGs only refer to the code line (action) that
invoked them. They don't provide information about what each block is
actually doing.
On 7/21/20 15:36, Russell Spitzer wrote:
Have you looked at the DAG visualization? Each block refers to the code
line invoking it.
For Dataframes the execution plan will let you know explicitly which
operations are in which stages.
On Tue, Jul 21, 2020, 8:18 AM Michal Sankot wrote:
Hi,

when I analyze and debug our Spark batch job executions, it's a pain to
find out how blocks in the Spark UI Jobs/SQL tabs correspond to the actual
Scala code that we write and how much time they take. Would there be a
way to somehow instruct the compiler or something and get this information
into