Hi,

There is no way to retrieve that information in Spark.
In fact, the current optimizer only considers the byte size of the output of
each LogicalPlan.
Related code can be found in
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.scala#L90
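To make this concrete, here is a minimal sketch (not from the original email) of how you could inspect the size-based statistics Catalyst attaches to the optimized logical plan. It assumes a local Spark 2.x build, where `LogicalPlan` exposes a `statistics` method (renamed `stats` in later versions); the DataFrame and app name are illustrative.

```scala
import org.apache.spark.sql.SparkSession

object PlanStats {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("plan-stats")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, "a"), (2, "b")).toDF("id", "value").filter($"id" > 1)

    // The optimized logical plan carries a Statistics object whose only
    // cost-related field is sizeInBytes -- there is no CPU or I/O cost.
    val optimized = df.queryExecution.optimizedPlan
    println(optimized.statistics.sizeInBytes)

    spark.stop()
  }
}
```

As the printed value shows, the estimate is a single byte count, which is exactly why there is no API for CPU or I/O cost.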

If you want to know more about Catalyst, you can check Yin Huai's slides from
Spark Summit 2016:
https://spark-summit.org/2016/speakers/yin-huai/
# Note: the slides are not available yet; it seems they will be in a few
weeks.

// maropu


On Fri, Jun 10, 2016 at 3:29 PM, Srinivasan Hariharan02 <
srinivasan_...@infosys.com> wrote:

> Hi,
>
> How can I get the CPU and I/O cost of a Spark SQL query after it has been
> optimized into the best logical plan? Is there any API to retrieve this
> information? Could anyone point me to the code where the CPU and I/O cost
> are actually computed in the catalyst module?
>
> *Regards,*
> *Srinivasan Hariharan*
> *+91-9940395830*
>
>
>
>



-- 
---
Takeshi Yamamuro
