GitHub user uce commented on a diff in the pull request:

    https://github.com/apache/incubator-flink/pull/232#discussion_r20928070
  
    --- Diff: docs/building.md ---
    @@ -39,45 +41,48 @@ There are two main versions of Hadoop that we need to 
differentiate:
     - Hadoop 2, with all versions starting with 2, like 2.2.0.
     The main differentiation between Hadoop 1 and Hadoop 2 is the availability of Hadoop YARN (Hadoop's cluster resource manager).
     
    -**To build Flink for Hadoop 2**, issue the following command:
    +By default, Flink uses the Hadoop 2 dependencies.
    +
    +**To build Flink for Hadoop 1**, issue the following command:
     
     ~~~bash
    -mvn clean package -DskipTests -Dhadoop.profile=2
    +mvn clean install -DskipTests -Dhadoop.profile=1
     ~~~
     
    -The `-Dhadoop.profile=2` flag instructs Maven to build Flink with YARN 
support and the Hadoop 2 HDFS client.
    +The `-Dhadoop.profile=1` flag instructs Maven to build Flink for Hadoop 1. Note that the features included in Flink change when using a different Hadoop profile. In particular, support for YARN and the built-in HBase support are not available in Hadoop 1 builds.
     
    -Usually, this flag is sufficient for full support of Flink for Hadoop 
2-versions.
    -However, you can also **specify a specific Hadoop version to build 
against**:
    +
    +You can also **specify a specific Hadoop version to build against**:
     
     ~~~bash
    -mvn clean package -DskipTests -Dhadoop.profile=2 -Dhadoop.version=2.4.1
    +mvn clean install -DskipTests -Dhadoop.version=2.4.1
     ~~~
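    
    If the version property also applies to the Hadoop 1 profile (an assumption, the diff does not say so explicitly), the two flags could presumably be combined, e.g. for Hadoop 1.2.1:
    
    ~~~bash
    # Hypothetical combination: Hadoop 1 profile together with an explicit Hadoop 1 version.
    mvn clean install -DskipTests -Dhadoop.profile=1 -Dhadoop.version=1.2.1
    ~~~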
     
     
     **To build Flink against a vendor-specific Hadoop version**, issue the following command:
     
     ~~~bash
    -mvn clean package -DskipTests -Pvendor-repos -Dhadoop.profile=2 -Dhadoop.version=2.2.0-cdh5.0.0-beta-2
    +mvn clean install -DskipTests -Pvendor-repos -Dhadoop.version=2.2.0-cdh5.0.0-beta-2
     ~~~
     
     The `-Pvendor-repos` flag activates a Maven [build profile](http://maven.apache.org/guides/introduction/introduction-to-profiles.html) that includes the repositories of popular Hadoop vendors such as Cloudera, Hortonworks, or MapR.
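    
    The vendor profile presumably composes with the other flags shown above; for a vendor distribution based on Hadoop 1, the invocation would look roughly like this (the version placeholder is hypothetical, substitute the vendor's actual version string):
    
    ~~~bash
    # Hypothetical: vendor repositories combined with the Hadoop 1 profile.
    # <vendor-hadoop1-version> is a placeholder, not a real version string.
    mvn clean install -DskipTests -Pvendor-repos -Dhadoop.profile=1 -Dhadoop.version=<vendor-hadoop1-version>
    ~~~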
     
     **Build Flink for `hadoop2` versions before 2.2.0**
     
    -Maven will automatically build Flink with its YARN client if the 
`-Dhadoop.profile=2` is set. But there were some changes in Hadoop versions 
before the 2.2.0 Hadoop release that are not supported by Flink's YARN client. 
Therefore, you can disable building the YARN client with the following string: 
`-P\!include-yarn`. 
    +Maven will automatically build Flink with its YARN client. However, Hadoop versions before the 2.2.0 release are not supported by Flink's YARN client. Therefore, you can disable building the YARN client with the following flag: `-P\!include-yarn`.
     
     So if you are building Flink for Hadoop `2.0.0-alpha`, use the following 
command:
     
     ~~~bash
    --P\!include-yarn -Dhadoop.profile=2 -Dhadoop.version=2.0.0-alpha
    +-P\!include-yarn -Dhadoop.version=2.0.0-alpha
     ~~~
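    
    Assembled with the same goals used in the commands above (the docs snippet lists only the flags), the full invocation would presumably be:
    
    ~~~bash
    # Full command assembled from the flags above; assumes the same install goal as the other examples.
    mvn clean install -DskipTests -P\!include-yarn -Dhadoop.version=2.0.0-alpha
    ~~~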
     
     ## Background
     
     The builds with Maven are controlled by [properties](http://maven.apache.org/pom.html#Properties) and [build profiles](http://maven.apache.org/guides/introduction/introduction-to-profiles.html).
    -There are two profiles, one for hadoop1 and one for hadoop2. When the 
hadoop2 profile is enabled, the system will also build the YARN client.
    -The hadoop1 profile is used by default. To enable the hadoop2 profile, set 
`-Dhadoop.profile=2` when building.
    +There are two profiles, one for hadoop1 and one for hadoop2. When the 
hadoop2 profile is enabled (default), the system will also build the YARN 
client.
    --- End diff --
    
    Yeah. The profile `hadoop1` is for Hadoop 1 and `hadoop2` is for Hadoop 
NextGen.
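
    For anyone who wants to double-check which of the two profiles a given invocation ends up activating, the stock maven-help-plugin can list them (nothing Flink-specific is assumed here):

    ~~~bash
    # Show the profiles Maven activates for the default build (hadoop2).
    mvn help:active-profiles

    # Same check with the Hadoop 1 profile requested.
    mvn help:active-profiles -Dhadoop.profile=1
    ~~~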

