Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=58&rev2=59

Comment:
Updated with latest Maven changes

    * You can run all the Common unit tests with {{{mvn test}}}, or a specific 
unit test with {{{mvn -Dtest=<class name without package prefix> test}}}. Run 
these commands from the {{{hadoop-trunk}}} directory.
    * For HDFS and MapReduce, you can run all the unit tests with the command 
{{{ant test}}}, or you can run a specific unit test with the command {{{ant 
-Dtestcase=<class name without package prefix> test}}} (for example {{{ant 
-Dtestcase=TestFileSystem test}}})
  
- ==== Using Ant ====
- Hadoop HDFS and MapReduce are built by Ant, a Java build tool. (Common is 
built using Maven, see below.) This section will eventually describe how Ant is 
used within Hadoop.  To start, simply read a good Ant tutorial.  The following 
is a good tutorial, though keep in mind that Hadoop isn't structured according 
to the ways outlined in the tutorial.  Use the tutorial to get a basic 
understanding of Ant, but not to understand how Ant is used for Hadoop:
- 
-  * Good Ant tutorial: http://i-proving.ca/space/Technologies/Ant+Tutorial
- 
- Although most Java IDEs ship with a version of Ant, having a command line 
version installed is invaluable. You can download a version from 
http://ant.apache.org/.
- 
- After installing Ant, you must make sure that its networking support is 
configured for any proxy you have. Without that the build will not work, as the 
Hadoop builds will not be able to download their dependencies using 
[[http://ant.apache.org/ivy/|Ivy]].
- 
- Tip: to see how Ant is set up, run
- 
- {{{
- ant -diagnostics
- }}}
- 
  ==== Using Maven ====
- Hadoop Common is built using Maven. You need to use version 3 or later.
+ Hadoop is built using [[http://maven.apache.org/|Apache Maven]], version 3 or 
later. (Parts of MapReduce are still built using Ant; see the instructions in 
the {{{INSTALL}}} file in {{{hadoop-mapreduce}}} for details.)
  
  === Generating a patch ===
  ==== Unit Tests ====
  Please make sure that all unit tests succeed before constructing your patch 
and that no new javac compiler warnings are introduced by your patch.
  
+ To build Hadoop with Maven, use the following command to run all unit tests 
and build a distribution. The {{{-Ptest-patch}}} profile will check that no new 
compiler warnings have been introduced by your patch.
- {{{
- > cd hadoop-common-trunk
- > ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean test tar
- }}}
- After a while, if you see
  
  {{{
- BUILD SUCCESSFUL
+ mvn clean install -Pdist -Dtar -Ptest-patch
  }}}
- all is ok, but if you see
  
+ Any test failures can be found in the {{{target/surefire-reports}}} directory 
of the relevant module. You can also run this command in one of the 
{{{hadoop-common}}}, {{{hadoop-hdfs}}}, or {{{hadoop-mapreduce}}} directories 
to just test a particular subproject.
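As a sketch of how failures can be located afterwards (using a simulated report 
directory with hypothetical contents, since real file names and summary lines 
vary by module), each surefire report can be grepped for a non-zero failure or 
error count:

```shell
# Simulated surefire-style reports (hypothetical contents) to illustrate
# locating failed test classes after a build.
mkdir -p target/surefire-reports
printf 'Tests run: 3, Failures: 1, Errors: 0\n' > target/surefire-reports/TestFoo.txt
printf 'Tests run: 5, Failures: 0, Errors: 0\n' > target/surefire-reports/TestBar.txt
# List every report whose summary line records a failure or an error:
grep -El 'Failures: [1-9]|Errors: [1-9]' target/surefire-reports/*.txt
```

Here only the {{{TestFoo}}} report is listed, since it is the one recording a 
failure.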
- {{{
- BUILD FAILED
- }}}
- then please examine error messages in {{{build/test}}} and fix things before 
proceeding.
  
  For unit test development guidelines, see HowToDevelopUnitTests.
- 
- For building Hadoop Common with Maven, use the following to run all unit 
tests and build a distribution. The {{{-Ptest-patch}}} profile will check that 
no new compiler warnings have been introduced by your patch.
- 
- {{{
- mvn clean install -Ptar -Ptest-patch
- }}}
- 
- Any test failures can be found in {{{hadoop-common/target/surefire-reports}}}.
  
  ==== Javadoc ====
  Please also check the javadoc.
  
  {{{
- > ant javadoc
- > firefox build/docs/api/index.html
+ mvn javadoc:javadoc
+ firefox target/site/api/index.html
  }}}
  Examine all public classes you've changed to see that documentation is 
complete, informative, and properly formatted.  Your patch must not generate 
any javadoc warnings.
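One way to confirm the run was warning-free (a sketch against a hypothetical 
captured log, since the real output goes to the console) is to redirect the 
javadoc output to a file and grep it:

```shell
# Hypothetical captured javadoc output; in practice redirect the real run,
# e.g. mvn javadoc:javadoc > javadoc.log 2>&1
printf 'Loading source files...\nGenerating target/site/api/index.html...\n' > javadoc.log
# Report whether any warning line slipped in:
if grep -qi 'warning' javadoc.log; then
  echo 'javadoc warnings found'
else
  echo 'no javadoc warnings'
fi
```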
- 
- For Common, use Maven to build the javadoc as follows:
- {{{
- mvn javadoc:javadoc
- firefox hadoop-common/target/site/api/index.html
- }}}
  
  ==== Creating a patch ====
  Check to see what files you have modified with:
@@ -150, +113 @@

  This way other developers can preview your change by running the script and 
then applying the patch.
  
  ==== Testing your patch ====
- Before submitting your patch, you are encouraged to run the same tools that 
the automated Hudson patch test system will run on your patch.  This enables 
you to fix problems with your patch before you submit it.  The {{{test-patch}}} 
Ant target will run your patch through the same checks that Hudson currently 
does ''except'' for executing the core and contrib unit tests.
+ Before submitting your patch, you are encouraged to run the same tools that 
the automated Jenkins patch test system will run on your patch.  This enables 
you to fix problems with your patch before you submit it. The 
{{{dev-support/test-patch.sh}}} script in the trunk directory will run your 
patch through the same checks that Jenkins currently does ''except'' for 
executing the unit tests.
  
- To use this target, you must run it from a clean workspace (ie {{{svn stat}}} 
shows no modifications or additions).  From your clean workspace, run:
+ Run this command from a clean workspace (i.e. {{{svn stat}}} shows no 
modifications or additions):
  
  {{{
+ dev-support/test-patch.sh /path/to/my.patch
- ant \
-   -Dpatch.file=/path/to/my.patch \
-   -Dforrest.home=/path/to/forrest/ \
-   -Dfindbugs.home=/path/to/findbugs \
-   -Dscratch.dir=/path/to/a/temp/dir \ (optional)
-   -Dsvn.cmd=/path/to/subversion/bin/svn \ (optional)
-   -Dgrep.cmd=/path/to/grep \ (optional)
-   -Dpatch.cmd=/path/to/patch \ (optional)
-   test-patch
  }}}
+ 
- At the end, you should get a message on your console that is similar to the 
comment added to Jira by Hudson's automated patch test system.  The scratch 
directory (which defaults to the value of {{{${user.home}/tmp}}}) will contain 
some output files that will be useful in determining what issues were found in 
the patch.
+ At the end, you should get a message on your console that is similar to the 
comment added to Jira by Jenkins's automated patch test system.  The scratch 
directory (which defaults to the value of {{{${user.home}/tmp}}}) will contain 
some output files that will be useful in determining what issues were found in 
the patch.
  
  Some things to note:
  
@@ -173, +129 @@

   * the {{{grep}}} command must support the -o flag (GNU does)
   * the {{{patch}}} command must support the -E flag
   * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} 
will tell you the default value on your system.
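A quick sanity sketch for two of these requirements (the {{{grep -o}}} check is 
exact; the {{{patch}}} check only inspects the help text, so treat its result 
as a hint rather than a guarantee):

```shell
# Verify that grep supports -o (print only the matched part of a line);
# GNU grep does, but some older vendor greps do not.
echo 'hadoop-trunk' | grep -o 'hadoop'
# Verify that patch mentions -E in its help text; a missing flag suggests
# an incompatible patch binary.
patch --help 2>/dev/null | grep -q -- '-E' && echo 'patch supports -E' || echo 'check your patch binary'
```

The first command should print {{{hadoop}}} on its own line if {{{-o}}} is 
supported.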
- 
- For testing a patch in Hadoop Common, use the following command, run from the 
top-level ({{{hadoop-trunk}}}) checkout:
- {{{
- dev-support/test-patch.sh /path/to/my.patch
- }}}
  
  Run the same command with no arguments to see the usage options.
  
@@ -219, +170 @@

  
  When you believe that your patch is ready to be committed, select the 
'''Submit Patch''' link on the issue's Jira.  Submitted patches will be 
automatically tested against "trunk" by 
[[http://hudson.zones.apache.org/hudson/view/Hadoop/|Hudson]], the project's 
continuous integration engine.  Upon test completion, Hudson will add a success 
("+1") message or failure ("-1") to your issue report in Jira.  If your issue 
contains multiple patch versions, Hudson tests the last patch uploaded.
  
- Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean 
install javadoc:javadoc checkstyle:checkstyle}}} in the case of Common) before 
selecting '''Submit Patch'''.  Tests should all pass.  Javadoc should report 
'''no''' warnings or errors. Checkstyle's error count should not exceed that 
listed at 
[[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle
 Errors]]  Hudson's tests are meant to double-check things, and not be used as 
a primary patch tester, which would create too much noise on the mailing list 
and in Jira.  Submitting patches that fail Hudson testing is frowned on, 
(unless the failure is not actually due to the patch).
+ Folks should run {{{ant clean test javadoc checkstyle}}} (or {{{mvn clean 
install javadoc:javadoc checkstyle:checkstyle}}} in the case of Common or HDFS) 
before selecting '''Submit Patch'''.  Tests should all pass.  Javadoc should 
report '''no''' warnings or errors. Checkstyle's error count should not exceed 
that listed at 
[[http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/lastSuccessfulBuild/artifact/trunk/build/test/checkstyle-errors.html|Checkstyle
 Errors]].  Hudson's tests are meant to double-check things, not to serve as 
a primary patch tester, which would create too much noise on the mailing list 
and in Jira.  Submitting patches that fail Hudson testing is frowned upon 
(unless the failure is not actually due to the patch).
  
  If your patch involves performance optimizations, they should be validated by 
benchmarks that demonstrate an improvement.
  
