This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new dd7a789  Add a note about Spark build requirement at PySpark testing guide in Developer Tools
dd7a789 is described below

commit dd7a7891f7346066f3db3e5654502eebab42128b
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Tue Dec 18 10:32:33 2018 +0800

    Add a note about Spark build requirement at PySpark testing guide in Developer Tools
    
    Closes #162
---
 developer-tools.md        | 2 ++
 site/developer-tools.html | 4 +++-
 2 files changed, 5 insertions(+), 1 deletion(-)

diff --git a/developer-tools.md b/developer-tools.md
index ebe6905..43ad445 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -131,6 +131,8 @@ build/mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISu
 <h4>Testing PySpark</h4>
 
 To run individual PySpark tests, you can use `run-tests` script under `python` directory. Test cases are located at `tests` package under each PySpark packages.
+Note that if you make changes on the Scala or Python side of Apache Spark, you need to manually rebuild Apache Spark before running PySpark tests so that the changes are applied.
+Running the PySpark test script does not rebuild Spark automatically.
 
 To run test cases in a specific module:
 
diff --git a/site/developer-tools.html b/site/developer-tools.html
index 82dab67..710f6f5 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -313,7 +313,9 @@ $ build/mvn package -DskipTests -pl core
 
 <h4>Testing PySpark</h4>
 
-<p>To run individual PySpark tests, you can use <code>run-tests</code> script under <code>python</code> directory. Test cases are located at <code>tests</code> package under each PySpark packages.</p>
+<p>To run individual PySpark tests, you can use <code>run-tests</code> script under <code>python</code> directory. Test cases are located at <code>tests</code> package under each PySpark packages.
+Note that if you make changes on the Scala or Python side of Apache Spark, you need to manually rebuild Apache Spark before running PySpark tests so that the changes are applied.
+Running the PySpark test script does not rebuild Spark automatically.</p>
 
 <p>To run test cases in a specific module:</p>
 

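As a sketch of the workflow the added note describes (these commands assume a
Spark source checkout, and the module name is only an illustrative example, not
taken from this commit), the rebuild-then-test sequence might look like:

```shell
# From the root of a Spark checkout: rebuild first, so Scala/Python
# changes are picked up by the PySpark tests -- the test script itself
# does not rebuild Spark.
build/mvn -DskipTests clean package

# Then run the PySpark tests. --modules is optional; pyspark-sql is
# just an example module name.
python/run-tests --modules=pyspark-sql
```

These commands are environment-dependent (they require a full Spark checkout
and build toolchain), so they are shown only as an illustration of the note's
intent.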

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
