[Hadoop Wiki] Update of "HowToContribute" by TomWhite

2011-07-28 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=54&rev2=55

Comment:
Added Maven equivalent commands for when HADOOP-6671 is committed

   * Contributions should pass existing unit tests.
   * New unit tests should be provided to demonstrate bugs and fixes.  
[[http://www.junit.org|JUnit]] is our test framework:
* You must implement a class that uses {{{@Test}}} annotations for all test 
methods. Please note that 
[[http://wiki.apache.org/hadoop/HowToDevelopUnitTests|Hadoop uses JUnit v4]].
-   * Define methods within your class whose names begin with {{{test}}}, and 
call JUnit's many assert methods to verify conditions; these methods will be 
executed when you run {{{ant test}}}. Please add meaningful messages to the 
assert statement to facilitate diagnostics.
+   * Define methods within your class whose names begin with {{{test}}}, and 
call JUnit's many assert methods to verify conditions; these methods will be 
executed when you run {{{ant test}}} (or {{{mvn test}}}). Please add meaningful 
messages to the assert statements to facilitate diagnostics (see the example 
test class after this list).
* By default, do not let tests write any temporary files to {{{/tmp}}}.  
Instead, the tests should write to the location specified by the 
{{{test.build.data}}} system property.
* If an HDFS cluster or a MapReduce cluster is needed by your test, please 
use {{{org.apache.hadoop.dfs.MiniDFSCluster}}} and 
{{{org.apache.hadoop.mapred.MiniMRCluster}}}, respectively.  
{{{TestMiniMRLocalFS}}} is an example of a test that uses {{{MiniMRCluster}}}.
* Place your class in the {{{src/test}}} tree.
* {{{TestFileSystem.java}}} and {{{TestMapRed.java}}} are examples of 
standalone MapReduce-based tests.
* {{{TestPath.java}}} is an example of a non MapReduce-based test.
-   * You can run all the unit test with the command {{{ant test}}}, or you can 
run a specific unit test with the command {{{ant -Dtestcase=<ClassName> test}}} 
(for example {{{ant -Dtestcase=TestFileSystem test}}})
+   * You can run all the unit tests with the command {{{ant test}}}, or you 
can run a specific unit test with the command {{{ant -Dtestcase=<ClassName> test}}} 
(for example {{{ant -Dtestcase=TestFileSystem test}}})
+* '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]''' You can run all the Common unit tests with {{{mvn test}}}, or a 
specific unit test with {{{mvn -Dtest=<TestClassName> test}}}.
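
To make these guidelines concrete, here is a minimal sketch of a JUnit 4 test 
class; the class name and the fallback default for {{{test.build.data}}} are 
illustrative, not taken from the Hadoop tree:

{{{
import static org.junit.Assert.assertTrue;

import java.io.File;
import org.junit.Test;

// Illustrative test class: place it under the src/test tree.
public class TestExample {

  @Test
  public void testWritesUnderTestBuildData() {
    // Write temporary data under test.build.data rather than /tmp.
    // The "build/test/data" fallback is an assumption for this sketch.
    File dir = new File(System.getProperty("test.build.data", "build/test/data"),
        "test-example");
    assertTrue("could not create test directory: " + dir,
        dir.mkdirs() || dir.isDirectory());
  }
}
}}}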
  
 ==== Using Ant ====
 Hadoop is built with Ant, a Java build tool.  This section will eventually 
describe how Ant is used within Hadoop.  To start, read a good Ant tutorial.  
The following is a good tutorial, though keep in mind that Hadoop isn't 
structured the way the tutorial outlines.  Use the tutorial to get a basic 
understanding of Ant, but not to learn how Ant is used for Hadoop:
@@ -58, +59 @@

  {{{
  ant -diagnostics
  }}}
+ 
+ ==== Using Maven ====
+ '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
+ Hadoop Common is built using Maven. You need to use version 3 or later.
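You can check which version is on your path with:
{{{
mvn -version
}}}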
+ 
  === Generating a patch ===
 ==== Unit Tests ====
  Please make sure that all unit tests succeed before constructing your patch 
and that no new javac compiler warnings are introduced by your patch.
@@ -80, +86 @@

  
  Unit tests development guidelines HowToDevelopUnitTests
  
+ '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
+ For building Hadoop Common with Maven, use the following to run all unit 
tests and build a distribution. The {{{-Ptest-patch}}} profile will check that 
no new compiler warnings have been introduced by your patch.
+ 
+ {{{
+ mvn clean install -Ptar -Ptest-patch
+ }}}
+ 
+ Any test failures can be found in {{{hadoop-common/target/surefire-reports}}}.
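
If a test fails, its plain-text Surefire report can be read directly; for 
example (the fully qualified class name here is only an illustration):
{{{
less hadoop-common/target/surefire-reports/org.apache.hadoop.fs.TestPath.txt
}}}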
+ 
 ==== Javadoc ====
  Please also check the javadoc.
  
@@ -88, +103 @@

  > firefox build/docs/api/index.html
  }}}
  Examine all public classes you've changed to see that documentation is 
complete, informative, and properly formatted.  Your patch must not generate 
any javadoc warnings.
+ 
+ '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
+ Build the javadoc with Maven:
+ {{{
+ mvn javadoc:javadoc
+ firefox hadoop-common/target/site/api/index.html
+ }}}
  
 ==== Creating a patch ====
  Check to see what files you have modified with:
@@ -137, +159 @@

  
  {{{
  ant \
-   -Dpatch.file=/patch/to/my.patch \
+   -Dpatch.file=/path/to/my.patch \
-Dforrest.home=/path/to/forrest/ \
-Dfindbugs.home=/path/to/findbugs \
-Dscratch.dir=/path/to/a/temp/dir \ (optional)
@@ -155, +177 @@

   * the {{{patch}}} command must support the -E flag
 * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} 
will tell you the default value on your system.

[Hadoop Wiki] Update of "HowToContribute" by TomWhite

2011-08-02 Thread Apache Wiki

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=55&rev2=56

Comment:
Post HADOOP-6671 changes

* Place your class in the {{{src/test}}} tree.
* {{{TestFileSystem.java}}} and {{{TestMapRed.java}}} are examples of 
standalone MapReduce-based tests.
* {{{TestPath.java}}} is an example of a non MapReduce-based test.
+   * You can run all the Common unit tests with {{{mvn test}}}, or a specific 
unit test with {{{mvn -Dtest=<TestClassName> test}}}. Run these commands from 
the {{{hadoop-trunk}}} directory.
-   * You can run all the unit tests with the command {{{ant test}}}, or you 
can run a specific unit test with the command {{{ant -Dtestcase=<ClassName> test}}} 
(for example {{{ant -Dtestcase=TestFileSystem test}}})
+   * For HDFS and MapReduce, you can run all the unit tests with the command 
{{{ant test}}}, or you can run a specific unit test with the command 
{{{ant -Dtestcase=<ClassName> test}}} (for example 
{{{ant -Dtestcase=TestFileSystem test}}})
-* '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]''' You can run all the Common unit tests with {{{mvn test}}}, or a 
specific unit test with {{{mvn -Dtest=<TestClassName> test}}}.
  
 ==== Using Ant ====
- Hadoop is built by Ant, a Java building tool.  This section will eventually 
describe how Ant is used within Hadoop.  To start, simply read a good Ant 
tutorial.  The following is a good tutorial, though keep in mind that Hadoop 
isn't structured according to the ways outlined in the tutorial.  Use the 
tutorial to get a basic understand of Ant but not to understand how Ant is used 
for Hadoop:
+ Hadoop HDFS and MapReduce are built by Ant, a Java build tool. (Common is 
built using Maven; see below.) This section will eventually describe how Ant is 
used within Hadoop.  To start, read a good Ant tutorial.  The following 
is a good tutorial, though keep in mind that Hadoop isn't structured the way 
the tutorial outlines.  Use the tutorial to get a basic understanding of Ant, 
but not to learn how Ant is used for Hadoop:
  
   * Good Ant tutorial: http://i-proving.ca/space/Technologies/Ant+Tutorial
  
@@ -61, +61 @@

  }}}
  
 ==== Using Maven ====
- '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
  Hadoop Common is built using Maven. You need to use version 3 or later.
  
  === Generating a patch ===
@@ -86, +85 @@

  
  Unit tests development guidelines HowToDevelopUnitTests
  
- '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
  For building Hadoop Common with Maven, use the following to run all unit 
tests and build a distribution. The {{{-Ptest-patch}}} profile will check that 
no new compiler warnings have been introduced by your patch.
  
  {{{
@@ -104, +102 @@

  }}}
  Examine all public classes you've changed to see that documentation is 
complete, informative, and properly formatted.  Your patch must not generate 
any javadoc warnings.
  
+ For Common, use Maven to build the javadoc as follows:
- '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
- Build the javadoc with Maven:
  {{{
  mvn javadoc:javadoc
  firefox hadoop-common/target/site/api/index.html
@@ -177, +174 @@

   * the {{{patch}}} command must support the -E flag
   * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} 
will tell you the default value on your system.
  
- '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''
  For testing a patch in Hadoop Common, use a command like this one, run from 
the top-level ({{{hadoop-trunk}}}) checkout:
  {{{
  dev-support/test-patch.sh DEVELOPER \
@@ -208, +204 @@

 You may find that you need to modify both the common project and MapReduce or 
HDFS. Or perhaps you have changed something in common, and need to verify that 
these changes do not break the existing unit tests for HDFS and MapReduce. 
Hadoop's build system integrates with a local Maven repository to support 
cross-project development. Use this general workflow for your development:
  
   * Make your changes in common
-  * Run any unit tests there (e.g. 'ant test')
+  * Run any unit tests there (e.g. 'mvn test')
 * ''Publish'' your new common jar to your local Maven repository:<<BR>>
   {{{
- common$ ant clean jar mvn-install
+ hadoop-common$ mvn clean install
  }}}
 . A word of caution: `mvn install` pushes the artifacts into your local 
Maven repository, which is shared by all your projects.
-  . '''[The following applies once 
[[https://issues.apache.org/jira/browse/HADOOP-6671|HADOOP-6671]] is 
committed]'''<>
-  {{{
- hadoop-common$ mvn clean install
- }}}

[Hadoop Wiki] Update of "HowToContribute" by TomWhite

2011-08-02 Thread Apache Wiki

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=56&rev2=57

  
  For testing a patch in Hadoop Common, use a command like this one, run from 
the top-level ({{{hadoop-trunk}}}) checkout:
  {{{
+ export MAVEN_HOME=...
  dev-support/test-patch.sh DEVELOPER \
/path/to/my.patch \
/tmp \
@@ -207, +208 @@

   * Run any unit tests there (e.g. 'mvn test')
 * ''Publish'' your new common jar to your local Maven repository:<<BR>>
   {{{
- hadoop-common$ mvn clean install
+ hadoop-common$ mvn clean install -DskipTests
  }}}
 . A word of caution: `mvn install` pushes the artifacts into your local 
Maven repository, which is shared by all your projects.
   * Switch to the dependent project and make any changes there (e.g., that 
rely on a new API you introduced in common).
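
Taken together, a cross-project session might look like the following sketch 
(the {{{hadoop-hdfs}}} directory name and its {{{ant test}}} command are 
assumptions based on the steps above):
{{{
# publish the modified common jar to the local Maven repository
hadoop-common$ mvn clean install -DskipTests

# rebuild and test the dependent project against the new jar
hadoop-common$ cd ../hadoop-hdfs
hadoop-hdfs$ ant test
}}}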


[Hadoop Wiki] Update of "HowToContribute" by TomWhite

2011-08-09 Thread Apache Wiki

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=57&rev2=58

Comment:
Updated to simplify test-patch.sh command following HADOOP-7525

   * the {{{patch}}} command must support the -E flag
   * you may need to explicitly set ANT_HOME.  Running {{{ant -diagnostics}}} 
will tell you the default value on your system.
  
- For testing a patch in Hadoop Common, use a command like this one, run from 
the top-level ({{{hadoop-trunk}}}) checkout:
+ For testing a patch in Hadoop Common, use the following command, run from the 
top-level ({{{hadoop-trunk}}}) checkout:
  {{{
+ dev-support/test-patch.sh /path/to/my.patch
- export MAVEN_HOME=...
- dev-support/test-patch.sh DEVELOPER \
-   /path/to/my.patch \
-   /tmp \
-   svn \
-   grep \
-   patch \
-   $FINDBUGS_HOME \
-   $FORREST_HOME \
-   `pwd`
  }}}
+ 
+ Run the same command with no arguments to see the usage options.
  
 ==== Applying a patch ====
 To apply a patch, either one you generated or one you found on JIRA, you can issue


[Hadoop Wiki] Update of "HowToContribute" by TomWhite

2011-08-22 Thread Apache Wiki

The "HowToContribute" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=58&rev2=59

Comment:
Updated with latest Maven changes

* You can run all the Common unit tests with {{{mvn test}}}, or a specific 
unit test with {{{mvn -Dtest=<TestClassName> test}}}. Run these commands from 
the {{{hadoop-trunk}}} directory.
* For HDFS and MapReduce, you can run all the unit tests with the command 
{{{ant test}}}, or you can run a specific unit test with the command 
{{{ant -Dtestcase=<ClassName> test}}} (for example 
{{{ant -Dtestcase=TestFileSystem test}}})
  
-  Using Ant 
- Hadoop HDFS and MapReduce are built by Ant, a Java building tool. (Common is 
build using Maven, see below.) This section will eventually describe how Ant is 
used within Hadoop.  To start, simply read a good Ant tutorial.  The following 
is a good tutorial, though keep in mind that Hadoop isn't structured according 
to the ways outlined in the tutorial.  Use the tutorial to get a basic 
understand of Ant but not to understand how Ant is used for Hadoop:
- 
-  * Good Ant tutorial: http://i-proving.ca/space/Technologies/Ant+Tutorial
- 
- Although most Java IDEs ship with a version of Ant, having a command line 
version installed is invaluable. You can download a version from 
http://ant.apache.org/.
- 
- After installing Ant, you must make sure that its networking support is 
configured for any proxy you have. Without that the build will not work, as the 
Hadoop builds will not be able to download their dependencies using 
[[http://ant.apache.org/ivy/|Ivy]].
- 
- Tip: to see how Ant is set up, run
- 
- {{{
- ant -diagnostics
- }}}
- 
 ==== Using Maven ====
- Hadoop Common is built using Maven. You need to use version 3 or later.
+ Hadoop is built using [[http://maven.apache.org/|Apache Maven]], version 3 or 
later. (Parts of MapReduce are still built using Ant; see the instructions in 
the {{{INSTALL}}} file in {{{hadoop-mapreduce}}} for details.)
  
  === Generating a patch ===
 ==== Unit Tests ====
  Please make sure that all unit tests succeed before constructing your patch 
and that no new javac compiler warnings are introduced by your patch.
  
+ For building Hadoop with Maven, use the following to run all unit tests and 
build a distribution. The {{{-Ptest-patch}}} profile will check that no new 
compiler warnings have been introduced by your patch.
- {{{
- > cd hadoop-common-trunk
- > ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean test tar
- }}}
- After a while, if you see
  
  {{{
- BUILD SUCCESSFUL
+ mvn clean install -Pdist -Dtar -Ptest-patch
  }}}
- all is ok, but if you see
  
+ Any test failures can be found in the {{{target/surefire-reports}}} directory 
of the relevant module. You can also run this command in one of the 
{{{hadoop-common}}}, {{{hadoop-hdfs}}}, or {{{hadoop-mapreduce}}} directories 
to just test a particular subproject.
- {{{
- BUILD FAILED
- }}}
- then please examine error messages in {{{build/test}}} and fix things before 
proceeding.
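
For example, to run this command for just the HDFS subproject:
{{{
cd hadoop-hdfs
mvn clean install -Pdist -Dtar -Ptest-patch
}}}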
  
  Unit tests development guidelines HowToDevelopUnitTests
- 
- For building Hadoop Common with Maven, use the following to run all unit 
tests and build a distribution. The {{{-Ptest-patch}}} profile will check that 
no new compiler warnings have been introduced by your patch.
- 
- {{{
- mvn clean install -Ptar -Ptest-patch
- }}}
- 
- Any test failures can be found in {{{hadoop-common/target/surefire-reports}}}.
  
 ==== Javadoc ====
  Please also check the javadoc.
  
  {{{
- > ant javadoc
- > firefox build/docs/api/index.html
+ mvn javadoc:javadoc
+ firefox target/site/api/index.html
  }}}
  Examine all public classes you've changed to see that documentation is 
complete, informative, and properly formatted.  Your patch must not generate 
any javadoc warnings.
- 
- For Common, use Maven to build the javadoc as follows:
- {{{
- mvn javadoc:javadoc
- firefox hadoop-common/target/site/api/index.html
- }}}
  
 ==== Creating a patch ====
  Check to see what files you have modified with:
@@ -150, +113 @@

  This way other developers can preview your change by running the script and 
then applying the patch.
  
 ==== Testing your patch ====
- Before submitting your patch, you are encouraged to run the same tools that 
the automated Hudson patch test system will run on your patch.  This enables 
you to fix problems with your patch before you submit it.  The {{{test-patch}}} 
Ant target will run your patch through the same checks that Hudson currently 
does ''except'' for executing the core and contrib unit tests.
+ Before submitting your patch, you are encouraged to run the same tools that 
the automated Jenkins patch test system will run on your patch.  This enables 
you to fix problems with your patch before you submit it. The 
{{{dev-support/test-patch.sh}}} script in the trunk directory will run your 
patch through the same checks that Jenkins currently does.