Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The "Hive/HowToContribute" page has been changed by JohnSichi.
http://wiki.apache.org/hadoop/Hive/HowToContribute?action=diff&rev1=19&rev2=20

--------------------------------------------------

The Hive build downloads a number of different Hadoop versions via ivy in order to compile "shims" which allow for compatibility with these Hadoop versions. However, by default, the rest of Hive is only built and tested against a single Hadoop version (0.20.0 as of this writing, but check build.properties for the latest).

- You can specify a different Hadoop version with -Dhadoop.version="<your-hadoop-version>". By default, Hadoop tarballs are pulled from http://mirror.facebook.net/facebook/hive-deps, which contains Hadoop 0.17.2.1, 0.18.3, 0.19.0, and 0.20.0. If the version you want is not here, then you'll need to set hadoop.mirror to a different source. For 0.19.2 and 0.20.1, you can use http://mirror.facebook.net/apache or any other Apache mirror. For other versions, you'll need to use http://archive.apache.org/dist (but don't use this unless you have to, since it's an overloaded server).
+ You can specify a different Hadoop version with -Dhadoop.version="<your-hadoop-version>". By default, Hadoop tarballs are pulled from http://mirror.facebook.net/facebook/hive-deps, which contains Hadoop 0.17.2.1, 0.18.3, 0.19.0, and 0.20.0. If the version you want is not here, then you'll need to set hadoop.mirror to a different source. For 0.19.2 and 0.20.2, you can use http://mirror.facebook.net/apache or any other Apache mirror. For other versions, you'll need to use http://archive.apache.org/dist (but don't use this unless you have to, since it's an overloaded server).

=== Unit Tests ===
Please make sure that all unit tests succeed before and after applying your patch and that no new javac compiler warnings are introduced by your patch.
Also see the information in the previous section about testing with different Hadoop versions if you want to verify compatibility with something other than the default Hadoop version.
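The property overrides described above can be sketched as shell commands along the following lines. Note that the ant target names ("package", "test") are assumptions here; check build.xml in your checkout for the exact targets.

```shell
# Build Hive against a non-default Hadoop version by overriding the
# hadoop.version property documented above (target name is an assumption).
ant -Dhadoop.version="0.20.2" package

# For a version not present in the default hive-deps repository, also
# point hadoop.mirror at an Apache mirror, then run the unit tests.
ant -Dhadoop.version="0.19.2" \
    -Dhadoop.mirror="http://mirror.facebook.net/apache" \
    test
```

Both properties can also be set persistently in a build.properties override rather than on the command line, which may be more convenient when running many test iterations against the same Hadoop version.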