Hi,

We have an automatic report creation tool (a web application) that
creates Spark SQL jobs based on user instructions. We'd like to give
users an opportunity to visualize the execution plan of their
handiwork before they inflict it on the world.

Currently, I'm just capturing the output of an `explain` statement and
displaying it, but that's still too cryptic for users, so we'd love to
have something similar to the SQL tab in the Spark UI. From my
research, the beautiful SVG displayed there is rendered from a dot
file, which in turn is built from aggregated metrics gathered while
the job is executing; since I want to show the graph before anything
runs, that appears not to be what I need.
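
In case it helps frame the question, here is the kind of thing I was
imagining: a rough sketch, assuming it's reasonable to walk
df.queryExecution.executedPlan from application code (the PlanToDot
object and the traversal are my own invention, not anything shipped
with Spark):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.SparkPlan

object PlanToDot {
  // Walk the physical plan tree and emit Graphviz dot.
  // Node ids are assigned in pre-order; edges point child -> parent,
  // matching the data-flow direction shown in the Spark UI.
  def toDot(plan: SparkPlan): String = {
    val sb = new StringBuilder("digraph plan {\n  node [shape=box];\n")
    var id = 0
    def visit(node: SparkPlan): Int = {
      val myId = id
      id += 1
      sb.append(s"  $myId [label=\"${node.nodeName}\"];\n")
      node.children.foreach { child =>
        val childId = visit(child)
        sb.append(s"  $childId -> $myId;\n")
      }
      myId
    }
    visit(plan)
    sb.append("}\n").toString
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]").appName("plan-preview").getOrCreate()
    import spark.implicits._
    Seq((1, "a"), (2, "b")).toDF("id", "name").createOrReplaceTempView("t")
    val df = spark.sql("SELECT name, count(*) FROM t GROUP BY name")
    // executedPlan is planned but not run, so no Spark job executes here
    println(toDot(df.queryExecution.executedPlan))
    spark.stop()
  }
}

The idea would be to render the resulting dot file to SVG server-side.
I realize this degenerates to a single node when adaptive query
execution wraps the plan in AdaptiveSparkPlanExec, which is part of
why I'm wondering whether there is a better-supported route.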

Since I'm just getting familiar with Spark's extremely complex
codebase, do you guys have any pointers or ideas on how I can provide
a more user-friendly physical execution plan to my users?

Regards,
Leonardo Herrera
