Hi Mich,

 

With Spark 3.1.1, you need to use spark-measure built with Scala 2.12:

 

bin/pyspark --packages ch.cern.sparkmeasure:spark-measure_2.12:0.17
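
Once the 2.12 artifact is picked up, something along these lines should work from the PySpark shell (a minimal sketch using the sparkMeasure Python API; it assumes the spark session already created by the shell):

from sparkmeasure import StageMetrics

stagemetrics = StageMetrics(spark)
stagemetrics.begin()                               # start collecting stage-level task metrics
spark.sql("select count(*) from range(1000 * 1000)").show()
stagemetrics.end()                                 # stop collecting
stagemetrics.print_report()                        # aggregated metrics, including elapsed and executor run time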

 

Best,

Luca

 

From: Mich Talebzadeh <mich.talebza...@gmail.com> 
Sent: Thursday, December 23, 2021 19:59
To: Luca Canali <luca.can...@cern.ch>
Cc: user <user@spark.apache.org>
Subject: Re: measure running time

 

Hi Luca,

 

Have you tested this link: https://github.com/LucaCanali/sparkMeasure ?

 

With Spark 3.1.1/PySpark, I am getting this error:

 

 

pyspark --packages ch.cern.sparkmeasure:spark-measure_2.11:0.17

 

:: problems summary ::

:::: ERRORS

        unknown resolver null

        SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-bom/2.9.9/jackson-bom-2.9.9.jar

        SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/fasterxml/jackson/jackson-base/2.9.9/jackson-base-2.9.9.jar

 

Using Python version 3.7.3 (default, Mar 27 2019 22:11:17)
Spark context Web UI available at http://rhes76:4040
Spark context available as 'sc' (master = local[*], app id = local-1640285629478).
SparkSession available as 'spark'.

 

>>> from sparkmeasure import StageMetrics
>>> stagemetrics = StageMetrics(spark)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/hduser/anaconda3/envs/pyspark_venv/lib/python3.7/site-packages/sparkmeasure/stagemetrics.py", line 15, in __init__
    self.stagemetrics = self.sc._jvm.ch.cern.sparkmeasure.StageMetrics(self.sparksession._jsparkSession)
  File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1569, in __call__
  File "/opt/spark/python/pyspark/sql/utils.py", line 111, in deco
    return f(*a, **kw)
  File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.ch.cern.sparkmeasure.StageMetrics.
: java.lang.NoClassDefFoundError: scala/Product$class
        at ch.cern.sparkmeasure.StageMetrics.<init>(stagemetrics.scala:111)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 12 more

 

Thanks

 

 

View my LinkedIn profile: <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

 

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed. The author 
will in no case be liable for any monetary damages arising from such loss, 
damage or destruction. 

 

 

 

On Thu, 23 Dec 2021 at 15:41, Luca Canali <luca.can...@cern.ch> wrote:

Hi,

 

I agree with Gourav that just measuring execution time is a simplistic approach 
that may lead you to miss important details, in particular when running 
distributed computations.

WebUI, REST API, and metrics instrumentation in Spark can be quite useful for 
further drill down. See https://spark.apache.org/docs/latest/monitoring.html
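
For example, the REST API can be polled from Python while the application is running (just a sketch; the default driver UI port 4040 and the use of the requests library are assumptions on my side):

import requests

base = "http://localhost:4040/api/v1"                        # driver UI endpoint; adjust host/port as needed
app_id = requests.get(base + "/applications").json()[0]["id"]  # first (typically only) running application
for stage in requests.get(f"{base}/applications/{app_id}/stages").json():
    print(stage["stageId"], stage["status"], stage["executorRunTime"])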

You can also have a look at this tool, which automates the collection and aggregation of executor task metrics: https://github.com/LucaCanali/sparkMeasure

 

Best,

Luca

 

From: Gourav Sengupta <gourav.sengu...@gmail.com> 
Sent: Thursday, December 23, 2021 14:23
To: bit...@bitfox.top
Cc: user <user@spark.apache.org>
Subject: Re: measure running time

 

Hi,

 

I do not think that such time comparisons make much sense in distributed computation. Simply comparing an RDD operation and a DataFrame operation based on their start and stop times may not provide any valid information.

 

You will have to look into the details of the timing and the individual steps. For example, please look at the Spark UI to see how timings are calculated in distributed computing mode; there are several well-written papers on this.

 

 

Thanks and Regards,

Gourav Sengupta

 

 

 

 

 

On Thu, Dec 23, 2021 at 10:57 AM <bit...@bitfox.top> wrote:

hello community,

In PySpark, how can I measure the running time of a command?
I just want to compare the running time of the RDD API and the DataFrame API, as in this blog post of mine:
https://bitfoxtop.wordpress.com/2021/12/23/count-email-addresses-using-sparks-rdd-and-dataframe/

I tried spark.time() but it doesn't work.
Thank you.
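
(As a baseline: spark.time() is available on the Scala SparkSession but not in PySpark, so one simple option is to time the action with Python's own clock. A rough sketch, using a hypothetical input file, and keeping in mind that Spark is lazy, so the timed statement must be an action:)

import time

t0 = time.time()
df = spark.read.text("emails.txt")                 # hypothetical input file
n = df.filter(df.value.contains("@")).count()      # count() is an action, so it triggers the actual work
print("count = %d, elapsed = %.2f s" % (n, time.time() - t0))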

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
