Hi,
You can use pysparkling => https://github.com/svenkreiss/pysparkling
This library is useful when your code works with RDDs: it re-implements the
RDD API in pure Python, so tests can run without a JVM or a Spark cluster.
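For example, a minimal sketch of what such a test could look like (the
count_words function is made up for illustration; pysparkling's Context
stands in for a SparkContext):

from pysparkling import Context

def count_words(rdd):
    # classic word count over an RDD of text lines
    return (rdd.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

def test_count_words():
    sc = Context()  # pure-Python stand-in for a SparkContext
    counts = dict(count_words(sc.parallelize(["a b", "b c"])).collect())
    assert counts == {"a": 1, "b": 2, "c": 1}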

Hope this helps,

Hichame

From: mmistr...@gmail.com
Sent: February 3, 2019 4:42 PM
To: radams...@gmail.com
Cc: la...@mapflat.com; bpru...@opentext.com; user@spark.apache.org
Subject: Re: testing frameworks


Hi
 Sorry to resurrect this thread.
Are there any Spark libraries for testing code in PySpark? The GitHub
project above seems to be Scala-oriented.
Following the links in the original thread (and a quick Google search) I
found pytest-spark on PyPI: https://pypi.org/project/pytest-spark/
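From its docs, pytest-spark provides spark_session / spark_context fixtures
once spark_home is configured (in pytest.ini or via the SPARK_HOME
environment variable). A minimal sketch; the DataFrame and assertion below
are made up for illustration:

# pytest.ini (assumed):
#   [pytest]
#   spark_home = /opt/spark

def test_row_count(spark_session):  # fixture injected by pytest-spark
    df = spark_session.createDataFrame([(1, "a"), (2, "b")], ["id", "name"])
    assert df.count() == 2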


w/kindest regards
 Marco




On Tue, Jun 12, 2018 at 6:44 PM Ryan Adams 
<radams...@gmail.com> wrote:
We use spark-testing-base for unit testing. These tests execute on a very
small amount of data that covers all the paths the code can take (or most
paths, anyway).

https://github.com/holdenk/spark-testing-base
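To illustrate the "tiny data covering every branch" idea, here is a sketch
using plain unittest and PySpark rather than spark-testing-base itself; the
categorize function is hypothetical:

import unittest
from pyspark.sql import SparkSession, functions as F

def categorize(df):
    # hypothetical transformation with two branches to cover
    return df.withColumn(
        "bucket", F.when(F.col("amount") >= 100, "large").otherwise("small"))

class CategorizeTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.spark = SparkSession.builder.master("local[2]").getOrCreate()

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()

    def test_covers_both_branches(self):
        # one input row per branch of the when/otherwise
        df = self.spark.createDataFrame([(50,), (150,)], ["amount"])
        rows = categorize(df).orderBy("amount").collect()
        self.assertEqual([r.bucket for r in rows], ["small", "large"])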

For integration testing we use automated routines to ensure that aggregate 
values match an aggregate baseline.
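One sketch of what such a routine could look like (the aggregates and the
baseline values here are invented for illustration):

from pyspark.sql import functions as F

BASELINE = {"row_count": 4, "total_amount": 350}  # made-up expected values

def check_aggregates(df):
    # recompute a few summary aggregates over the pipeline output
    actual = df.agg(
        F.count("*").alias("row_count"),
        F.sum("amount").alias("total_amount"),
    ).first().asDict()
    drift = {k: (actual[k], v) for k, v in BASELINE.items() if actual[k] != v}
    assert not drift, "aggregates drifted from baseline: %s" % drift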

Ryan

Ryan Adams
radams...@gmail.com

On Tue, Jun 12, 2018 at 11:51 AM, Lars Albertsson 
<la...@mapflat.com> wrote:
Hi,

I wrote this answer to the same question a couple of years ago:
https://www.mail-archive.com/user%40spark.apache.org/msg48032.html

I have made a couple of presentations on the subject. Slides and video
are linked on this page: http://www.mapflat.com/presentations/

You can find more material in this list of resources:
http://www.mapflat.com/lands/resources/reading-list

Happy testing!

Regards,



Lars Albertsson
Data engineering consultant
www.mapflat.com
https://twitter.com/lalleal
+46 70 7687109
Calendar: http://www.mapflat.com/calendar


On Mon, May 21, 2018 at 2:24 PM, Steve Pruitt 
<bpru...@opentext.com> wrote:
> Hi,
>
> Can anyone recommend testing frameworks suitable for Spark jobs? Something
> that can be integrated into a CI tool would be great.
>
> Thanks.
