Thanks Luca,

I am still getting some errors:


pyspark --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17

Python 3.7.3 (default, Mar 27 2019, 22:11:17)
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
:: loading settings :: url = jar:file:/d4T/hduser/spark-3.1.1-bin-hadoop3.2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /home/hduser/.ivy2/cache
The jars for the packages stored in: /home/hduser/.ivy2/jars
ch.cern.sparkmeasure#spark-measure_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-8175aada-8494-4953-a687-9c95b282c751;1.0
        confs: [default]
        found ch.cern.sparkmeasure#spark-measure_2.12;0.17 in central
        found com.fasterxml.jackson.module#jackson-module-scala_2.12;2.9.9 in central
        found com.fasterxml.jackson.core#jackson-core;2.9.9 in central
        found com.fasterxml.jackson.core#jackson-annotations;2.9.9 in central
        found com.fasterxml.jackson.core#jackson-databind;2.9.9 in central
        found com.fasterxml.jackson.module#jackson-module-paranamer;2.9.9 in central
        found com.thoughtworks.paranamer#paranamer;2.8 in spark-list
        found org.slf4j#slf4j-api;1.7.26 in central
        found org.influxdb#influxdb-java;2.14 in central
        found com.squareup.retrofit2#retrofit;2.4.0 in central
        found com.squareup.retrofit2#converter-moshi;2.4.0 in central
        found com.squareup.moshi#moshi;1.5.0 in central
        found com.squareup.okio#okio;1.13.0 in central
        found org.msgpack#msgpack-core;0.8.16 in central
        found com.squareup.okhttp3#okhttp;3.11.0 in local-m2-cache
        found com.squareup.okio#okio;1.14.0 in local-m2-cache
        found com.squareup.okhttp3#logging-interceptor;3.11.0 in central
:: resolution report :: resolve 5349ms :: artifacts dl 2ms
        :: modules in use:
        ch.cern.sparkmeasure#spark-measure_2.12;0.17 from central in [default]
        com.fasterxml.jackson.core#jackson-annotations;2.9.9 from central in [default]
        com.fasterxml.jackson.core#jackson-core;2.9.9 from central in [default]
        com.fasterxml.jackson.core#jackson-databind;2.9.9 from central in [default]
        com.fasterxml.jackson.module#jackson-module-paranamer;2.9.9 from central in [default]
        com.fasterxml.jackson.module#jackson-module-scala_2.12;2.9.9 from central in [default]
        com.squareup.moshi#moshi;1.5.0 from central in [default]
        com.squareup.okhttp3#logging-interceptor;3.11.0 from central in [default]
        com.squareup.okhttp3#okhttp;3.11.0 from local-m2-cache in [default]
        com.squareup.okio#okio;1.14.0 from local-m2-cache in [default]
        com.squareup.retrofit2#converter-moshi;2.4.0 from central in [default]
        com.squareup.retrofit2#retrofit;2.4.0 from central in [default]
        com.thoughtworks.paranamer#paranamer;2.8 from spark-list in [default]
        org.influxdb#influxdb-java;2.14 from central in [default]
        org.msgpack#msgpack-core;0.8.16 from central in [default]
        org.slf4j#slf4j-api;1.7.26 from central in [default]
        :: evicted modules:
        com.fasterxml.jackson.core#jackson-annotations;2.9.0 by [com.fasterxml.jackson.core#jackson-annotations;2.9.9] in [default]
        com.squareup.okhttp3#okhttp;3.10.0 by [com.squareup.okhttp3#okhttp;3.11.0] in [default]
        com.squareup.okio#okio;1.13.0 by [com.squareup.okio#okio;1.14.0] in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   19  |   11  |   11  |   3   ||   16  |   0   |
        ---------------------------------------------------------------------


:: problems summary ::
:::: ERRORS
        unknown resolver null

        SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-bom/2.9.9/jackson-bom-2.9.9.jar

        SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-base/2.9.9/jackson-base-2.9.9.jar

        unknown resolver null


I will try to investigate it.
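
One thing I notice while digging: the failing URLs point at dl.bintray.com (the old spark-packages repository), which was retired in 2021, so the Bad Gateway responses are probably not transient. Since spark-measure_2.12:0.17 and the jackson artifacts are all published on Maven Central, it may be enough to point the resolver there explicitly:

pyspark --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17 \
        --repositories https://repo1.maven.org/maven2

(--repositories is the standard spark-submit/pyspark option for extra remote repositories; clearing ~/.ivy2/cache first may also help if a bad resolution got cached.)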



Cheers


   view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Thu, 23 Dec 2021 at 19:46, Luca Canali <luca.can...@cern.ch> wrote:

> Hi Mich,
>
>
>
> With Spark 3.1.1 you need to use spark-measure built with Scala 2.12:
>
>
>
> bin/pyspark --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17
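>
> A quick way to double-check which Scala version a given Spark build was
> compiled against is the version banner, which includes a "Using Scala
> version ..." line:
>
> bin/spark-submit --version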
>
>
>
> Best,
>
> Luca
>
>
>
> *From:* Mich Talebzadeh <mich.talebza...@gmail.com>
> *Sent:* Thursday, December 23, 2021 19:59
> *To:* Luca Canali <luca.can...@cern.ch>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: measure running time
>
>
>
> Hi Luca,
>
>
>
> Have you tested this link? https://github.com/LucaCanali/sparkMeasure
>
>
>
> With Spark 3.1.1/PySpark, I am getting this error:
>
>
>
>
>
> pyspark --packages ch.cern.sparkmeasure:spark-measure_2.11:0.17
>
>
>
> :: problems summary ::
>
> :::: ERRORS
>
>         unknown resolver null
>
>         SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-bom/2.9.9/jackson-bom-2.9.9.jar
>
>         SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-base/2.9.9/jackson-base-2.9.9.jar
>
>
>
> Using Python version 3.7.3 (default, Mar 27 2019 22:11:17)
>
> Spark context Web UI available at http://rhes76:4040
>
> Spark context available as 'sc' (master = local[*], app id = local-1640285629478).
>
> SparkSession available as 'spark'.
>
>
>
> >>> from sparkmeasure import StageMetrics
> >>> stagemetrics = StageMetrics(spark)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/hduser/anaconda3/envs/pyspark_venv/lib/python3.7/site-packages/sparkmeasure/stagemetrics.py", line 15, in __init__
>     self.stagemetrics = self.sc._jvm.ch.cern.sparkmeasure.StageMetrics(self.sparksession._jsparkSession)
>   File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1569, in __call__
>   File "/opt/spark/python/pyspark/sql/utils.py", line 111, in deco
>     return f(*a, **kw)
>   File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
> py4j.protocol.Py4JJavaError: An error occurred while calling None.ch.cern.sparkmeasure.StageMetrics.
> : java.lang.NoClassDefFoundError: scala/Product$class
>         at ch.cern.sparkmeasure.StageMetrics.<init>(stagemetrics.scala:111)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>         at py4j.Gateway.invoke(Gateway.java:238)
>         at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
>         at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
>         at py4j.GatewayConnection.run(GatewayConnection.java:238)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: scala.Product$class
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         ... 12 more
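>
> (For reference: java.lang.NoClassDefFoundError: scala/Product$class is the
> classic symptom of loading a Scala 2.11 jar on a Scala 2.12 runtime, and
> Spark 3.1.1 is built against Scala 2.12 - hence the _2.12 package suggested
> above. With the matching spark-measure_2.12 artifact loaded, a minimal
> session along the lines of the sparkMeasure README looks like:
>
> >>> from sparkmeasure import StageMetrics
> >>> stagemetrics = StageMetrics(spark)
> >>> stagemetrics.begin()
> >>> spark.sql("select count(*) from range(1000) cross join range(1000)").show()
> >>> stagemetrics.end()
> >>> stagemetrics.print_report()
>
> where the SQL statement is just a stand-in workload.)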
>
>
>
> Thanks
>
>
>
>
>
> view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
>
>
>
> On Thu, 23 Dec 2021 at 15:41, Luca Canali <luca.can...@cern.ch> wrote:
>
> Hi,
>
>
>
> I agree with Gourav that just measuring execution time is a simplistic
> approach that may lead you to miss important details, in particular when
> running distributed computations.
>
> WebUI, REST API, and metrics instrumentation in Spark can be quite useful
> for further drill down. See
> https://spark.apache.org/docs/latest/monitoring.html
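>
> For instance, once an application is running, per-stage timings can be
> pulled from the REST API (4040 is the default UI port; <app-id> is a
> placeholder for the id shown in the UI or returned by sc.applicationId):
>
> curl http://localhost:4040/api/v1/applications/<app-id>/stages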
>
> You can also have a look at this tool that takes care of automating
> collecting and aggregating some executor task metrics:
> https://github.com/LucaCanali/sparkMeasure
>
>
>
> Best,
>
> Luca
>
>
>
> *From:* Gourav Sengupta <gourav.sengu...@gmail.com>
> *Sent:* Thursday, December 23, 2021 14:23
> *To:* bit...@bitfox.top
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: measure running time
>
>
>
> Hi,
>
>
>
> I do not think that such time comparisons make much sense in distributed
> computation. Comparing an RDD operation and a DataFrame operation purely on
> their start and stop times may not provide any valid information.
>
>
>
> You will have to look into the details of the timing and the individual
> steps. For example, please look at the Spark UI to see how timings are
> calculated in distributed computing mode; there are several well-written
> papers on this.
>
>
>
>
>
> Thanks and Regards,
>
> Gourav Sengupta
>
>
>
>
>
>
>
>
>
>
>
> On Thu, Dec 23, 2021 at 10:57 AM <bit...@bitfox.top> wrote:
>
> hello community,
>
> In pyspark, how can I measure the running time of a command?
> I just want to compare the running time of the RDD API and the DataFrame
> API, as in this blog post of mine:
>
> https://bitfoxtop.wordpress.com/2021/12/23/count-email-addresses-using-sparks-rdd-and-dataframe/
>
> I tried spark.time() but it doesn't work.
> Thank you.
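>
> (As far as I can tell, spark.time() only exists in the Scala API, which
> would explain why it does not work from PySpark. A simple substitute is to
> time the triggering action with Python's own time module:
>
> import time
> t0 = time.time()
> df.count()   # df is a placeholder; any action that forces the computation
> print("elapsed: %.2f s" % (time.time() - t0))
>
> bearing in mind, per the replies above, that wall-clock time alone says
> little about where a distributed job actually spends its time.)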
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
