Repository: spark-website
Updated Branches:
  refs/heads/asf-site 879303593 -> ca64fac2e


Add instructions for running individual tests.

This is useful and I often forget how to do it.  I learned some
new tricks when @squito gave @jinxing64 some tips on how to do
this, so I thought it was worth adding this to the website.


Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/ca64fac2
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/ca64fac2
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/ca64fac2

Branch: refs/heads/asf-site
Commit: ca64fac2e24256dc3a07711e004c540b892965fe
Parents: 8793035
Author: Kay Ousterhout <kayousterh...@gmail.com>
Authored: Sat Feb 11 18:38:46 2017 -0800
Committer: Sean Owen <so...@cloudera.com>
Committed: Wed Feb 22 05:48:18 2017 -0800

----------------------------------------------------------------------
 developer-tools.md        | 81 +++++++++++++++++++++++++++++++++++++++++-
 site/developer-tools.html | 72 ++++++++++++++++++++++++++++++++++++-
 2 files changed, 151 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark-website/blob/ca64fac2/developer-tools.md
----------------------------------------------------------------------
diff --git a/developer-tools.md b/developer-tools.md
index e8853b8..88f3f36 100644
--- a/developer-tools.md
+++ b/developer-tools.md
@@ -9,7 +9,9 @@ navigation:
 
 <h2>Useful Developer Tools</h2>
 
-<h3>Reducing Build Times</h3>
+<h3 id="reducing-build-times">Reducing Build Times</h3>
+
+<h4>SBT: Avoiding Re-Creating the Assembly JAR</h4>
 
 Spark's default build strategy is to assemble a jar including all of its dependencies. This can 
 be cumbersome when doing iterative development. When developing locally, it is possible to create 
@@ -32,6 +34,83 @@ $ ./bin/spark-shell
 $ build/sbt ~compile
 ```
 
+<h4>Maven: Speeding up Compilation with Zinc</h4>
+
+[Zinc](https://github.com/typesafehub/zinc) is a long-running server version of SBT's incremental
+compiler. When run locally as a background process, it speeds up builds of Scala-based projects
+like Spark. Developers who regularly recompile Spark with Maven will be most interested in
+Zinc. The project site gives instructions for building and running `zinc`; OS X users can
+install it using `brew install zinc`.
+
+If using the `build/mvn` script, `zinc` will automatically be downloaded and used for all
+builds. This process will auto-start after the first time `build/mvn` is called and bind to port
+3030 unless the `ZINC_PORT` environment variable is set. The `zinc` process can subsequently be
+shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
+restart whenever `build/mvn` is called.
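
For example, the steps above might look like the following in practice (the exact `zinc` directory name depends on the downloaded version, so the path below is illustrative):

```shell
# Build with zinc listening on a non-default port, e.g. if 3030 is taken.
export ZINC_PORT=3031
build/mvn -DskipTests package

# Shut the zinc server down when finished (adjust the version in the path).
build/zinc-<version>/bin/zinc -shutdown
```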
+
+<h3 id="running-individual-tests">Running Individual Tests</h3>
+
+When developing locally, it's often convenient to run a single test or a few tests, rather than running the entire test suite.
+
+<h4>Testing with SBT</h4>
+
+The fastest way to run individual tests is to use the `sbt` console. Keep an `sbt` console open, and use it to re-run tests as necessary. For example, to run all of the tests in a particular project, e.g., `core`:
+
+```
+$ build/sbt
+> project core
+> test
+```
+
+You can run a single test suite using the `testOnly` command. For example, to run the `DAGSchedulerSuite`:
+
+```
+> testOnly org.apache.spark.scheduler.DAGSchedulerSuite
+```
+
+The `testOnly` command accepts wildcards; e.g., you can also run the `DAGSchedulerSuite` with:
+
+```
+> testOnly *DAGSchedulerSuite
+```
+
+Or you could run all of the tests in the scheduler package:
+
+```
+> testOnly org.apache.spark.scheduler.*
+```
+
+If you'd like to run just a single test in the `DAGSchedulerSuite`, e.g., a test that includes "SPARK-12345" in the name, run the following command in the sbt console:
+
+```
+> testOnly *DAGSchedulerSuite -- -z "SPARK-12345"
+```
+
+If you'd prefer, you can run all of these commands on the command line (but this will be slower than running tests using an open console). To do this, you need to surround `testOnly` and the following arguments in quotes:
+
+```
+$ build/sbt "core/testOnly *DAGSchedulerSuite -- -z SPARK-12345"
+```
+
+For more about how to run individual tests with sbt, see the [sbt documentation](http://www.scala-sbt.org/0.13/docs/Testing.html).
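
As a sketch of one more option from ScalaTest's runner: in addition to `-z` (substring match), the runner accepts `-t` to select a test by its exact name. The test name below is hypothetical, for illustration only:

```shell
# -z matches a substring of the test name; -t requires an exact match.
build/sbt "core/testOnly *DAGSchedulerSuite -- -t \"some exact test name\""
```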
+
+
+<h4>Testing with Maven</h4>
+
+With Maven, you can use the `-DwildcardSuites` flag to run individual Scala tests:
+
+```
+build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.scheduler.DAGSchedulerSuite test
+```
+
+You need `-Dtest=none` to avoid running the Java tests. For more information about the ScalaTest Maven Plugin, refer to the [ScalaTest documentation](http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin).
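
Per the ScalaTest Maven plugin's documentation, `-DwildcardSuites` should also accept a package name, running every suite discovered beneath it; a sketch:

```shell
# Run all Scala suites in the scheduler package (assumes -DwildcardSuites
# accepts package names, as described in the plugin docs).
build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.scheduler test
```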
+
+To run individual Java tests, you can use the `-Dtest` flag:
+
+```
+build/mvn -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
+```
+
 <h3>Checking Out Pull Requests</h3>
 
 Git provides a mechanism for fetching remote pull requests into your own local repository. 

http://git-wip-us.apache.org/repos/asf/spark-website/blob/ca64fac2/site/developer-tools.html
----------------------------------------------------------------------
diff --git a/site/developer-tools.html b/site/developer-tools.html
index ed84669..615adea 100644
--- a/site/developer-tools.html
+++ b/site/developer-tools.html
@@ -194,7 +194,9 @@
   <div class="col-md-9 col-md-pull-3">
     <h2>Useful Developer Tools</h2>
 
-<h3>Reducing Build Times</h3>
+<h3 id="reducing-build-times">Reducing Build Times</h3>
+
+<h4>SBT: Avoiding Re-Creating the Assembly JAR</h4>
 
 <p>Spark&#8217;s default build strategy is to assemble a jar including all of its dependencies. This can 
 be cumbersome when doing iterative development. When developing locally, it is possible to create 
@@ -216,6 +218,74 @@ $ ./bin/spark-shell
 $ build/sbt ~compile
 </code></pre>
 
+<h4>Maven: Speeding up Compilation with Zinc</h4>
+
+<p><a href="https://github.com/typesafehub/zinc">Zinc</a> is a long-running server version of SBT&#8217;s incremental
+compiler. When run locally as a background process, it speeds up builds of Scala-based projects
+like Spark. Developers who regularly recompile Spark with Maven will be most interested in
+Zinc. The project site gives instructions for building and running <code>zinc</code>; OS X users can
+install it using <code>brew install zinc</code>.</p>
+
+<p>If using the <code>build/mvn</code> script, <code>zinc</code> will automatically be downloaded and used for all
+builds. This process will auto-start after the first time <code>build/mvn</code> is called and bind to port
+3030 unless the <code>ZINC_PORT</code> environment variable is set. The <code>zinc</code> process can subsequently be
+shut down at any time by running <code>build/zinc-&lt;version&gt;/bin/zinc -shutdown</code> and will automatically
+restart whenever <code>build/mvn</code> is called.</p>
+
+<h3 id="running-individual-tests">Running Individual Tests</h3>
+
+<p>When developing locally, it&#8217;s often convenient to run a single test or a few tests, rather than running the entire test suite.</p>
+
+<h4>Testing with SBT</h4>
+
+<p>The fastest way to run individual tests is to use the <code>sbt</code> console. Keep an <code>sbt</code> console open, and use it to re-run tests as necessary. For example, to run all of the tests in a particular project, e.g., <code>core</code>:</p>
+
+<pre><code>$ build/sbt
+&gt; project core
+&gt; test
+</code></pre>
+
+<p>You can run a single test suite using the <code>testOnly</code> command. For example, to run the <code>DAGSchedulerSuite</code>:</p>
+
+<pre><code>&gt; testOnly org.apache.spark.scheduler.DAGSchedulerSuite
+</code></pre>
+
+<p>The <code>testOnly</code> command accepts wildcards; e.g., you can also run the <code>DAGSchedulerSuite</code> with:</p>
+
+<pre><code>&gt; testOnly *DAGSchedulerSuite
+</code></pre>
+
+<p>Or you could run all of the tests in the scheduler package:</p>
+
+<pre><code>&gt; testOnly org.apache.spark.scheduler.*
+</code></pre>
+
+<p>If you&#8217;d like to run just a single test in the <code>DAGSchedulerSuite</code>, e.g., a test that includes &#8220;SPARK-12345&#8221; in the name, run the following command in the sbt console:</p>
+
+<pre><code>&gt; testOnly *DAGSchedulerSuite -- -z "SPARK-12345"
+</code></pre>
+
+<p>If you&#8217;d prefer, you can run all of these commands on the command line (but this will be slower than running tests using an open console). To do this, you need to surround <code>testOnly</code> and the following arguments in quotes:</p>
+
+<pre><code>$ build/sbt "core/testOnly *DAGSchedulerSuite -- -z SPARK-12345"
+</code></pre>
+
+<p>For more about how to run individual tests with sbt, see the <a href="http://www.scala-sbt.org/0.13/docs/Testing.html">sbt documentation</a>.</p>
+
+<h4>Testing with Maven</h4>
+
+<p>With Maven, you can use the <code>-DwildcardSuites</code> flag to run individual Scala tests:</p>
+
+<pre><code>build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.scheduler.DAGSchedulerSuite test
+</code></pre>
+
+<p>You need <code>-Dtest=none</code> to avoid running the Java tests. For more information about the ScalaTest Maven Plugin, refer to the <a href="http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin">ScalaTest documentation</a>.</p>
+
+<p>To run individual Java tests, you can use the <code>-Dtest</code> flag:</p>
+
+<pre><code>build/mvn -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite test
+</code></pre>
+
 <h3>Checking Out Pull Requests</h3>
 
 <p>Git provides a mechanism for fetching remote pull requests into your own local repository. 

