Have you tried looking at the Spark UI to see the time it takes to load from
HDFS?

The Spark UI runs on port 4040 by default. However, you can override the port
in spark-submit:

${SPARK_HOME}/bin/spark-submit \
  ...
  --conf "spark.ui.port=xxxx"

and then access it at hostname:port.
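If you would rather get a number from inside the driver program than read it
off the UI, you can wrap the action in a small helper. A minimal Scala sketch
(the timed helper and its label are illustrative, not part of any Spark API;
the commented usage assumes a SparkContext sc and an HDFS path exist):

```scala
// Minimal sketch: measure the wall-clock time of any block of work.
def timed[T](label: String)(block: => T): T = {
  val start = System.nanoTime()
  val result = block // run the work being measured
  val elapsedMs = (System.nanoTime() - start) / 1e6
  println(f"$label took $elapsedMs%.1f ms")
  result
}

// Hypothetical usage with Spark (assumes sc and the path exist):
// val data = sc.textFile("hdfs:///some/path")
// val n = timed("HDFS load + count")(data.count())
```

Because textFile is lazy, the timed action is what actually triggers the read,
so the measured span covers the HDFS load plus the count itself.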

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Wed, 10 Apr 2019 at 18:56, yeikel valdes <em...@yeikel.com> wrote:

> What about a simple call to System.nanoTime()?
>
> long startTime = System.nanoTime();
>
> // Spark work here
>
> long endTime = System.nanoTime();
>
> long duration = endTime - startTime;
>
> System.out.println(duration);
>
> count() recomputes the dataset, so it makes sense that it takes longer for you.
>
>
> ---- On Tue, 02 Apr 2019 07:06:30 -0700 * koloka...@ics.forth.gr
> <koloka...@ics.forth.gr> * wrote ----
>
> Hello,
>
> I want to ask if there is any way to measure HDFS data loading time at
> the start of my program. I tried to add an action, e.g. count(), after the
> val data = sc.textFile() call. But I notice that my program takes more
> time to finish than before adding the count call. Is there any other way
> to do it?
>
> Thanks,
> --Iacovos
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
>
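The recomputation point made upthread (count() re-running the load) is plain
lazy evaluation; a small Spark-free Scala sketch of the same effect, with a
lazy view standing in for an RDD:

```scala
// Spark-free sketch of why an extra count() "takes longer": a lazy
// pipeline (like an RDD) is re-executed by every terminal operation.
var loads = 0
val data = (1 to 3).view.map { i =>
  loads += 1 // stands in for reading a block from HDFS
  i * 2
}
val first = data.sum  // first "action": pipeline runs, loads == 3
val second = data.sum // second "action": pipeline runs again, loads == 6
println(s"loads = $loads")
```

In Spark, calling data.cache() (or persist()) before the first action keeps
the loaded data in memory, so later actions do not pay the HDFS read again.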
