This is an automated email from the ASF dual-hosted git repository.

zhangbutao pushed a commit to branch revert-24-patch-1
in repository https://gitbox.apache.org/repos/asf/hive-site.git

commit ca0be4b131334073ef35c6aabc6f9caf12bf7e66
Author: Butao Zhang <[email protected]>
AuthorDate: Wed Jan 8 21:56:11 2025 +0800

    Revert "HIVE-28683:Add doc for install hive4 with old version hadoop"
---
 .../docs/latest/manual-installation_283118363.md   | 94 ----------------------
 1 file changed, 94 deletions(-)

diff --git a/content/docs/latest/manual-installation_283118363.md b/content/docs/latest/manual-installation_283118363.md
index 5b19c1c..2250f1b 100644
--- a/content/docs/latest/manual-installation_283118363.md
+++ b/content/docs/latest/manual-installation_283118363.md
@@ -16,7 +16,6 @@ date: 2024-12-12
        + [Extra hadoop configurations to make everything working]({{< ref "#extra-hadoop-configurations-to-make-everything-working" >}})
        + [Installing Hive from a Tarball]({{< ref "#installing-hive-from-a-tarball" >}})
        + [Installing from Source Code]({{< ref "#installing-from-source-code" >}})
-    + [Installing with old version hadoop(greater than or equal 3.1.0)]({{< ref "#installing-with-old-version-hadoop(greater-than-or-equal-3.1.0)" >}})
        + [Next Steps]({{< ref "#next-steps" >}})
        + [Beeline CLI]({{< ref "#beeline-cli" >}})
        + [Hive Metastore]({{< ref "#hive-metastore" >}})
@@ -379,99 +378,6 @@ That directory should contain all the files necessary to run Hive. You can run i
 
 From now, you can follow the steps described in the section Installing Hive from a Tarball
 
-
-## Installing with old version hadoop(greater than or equal 3.1.0)
-
-Hive 4 normally requires a Hadoop 3.3.6+ cluster environment. In practice, however, in an on-YARN deployment we can package all the Hadoop-related dependencies into the Tez and Hive distributions, so that at runtime they do not depend on the libraries of the original Hadoop cluster. In this way we can run Hive 4 on a lower version of Hadoop, provided that the base APIs of the Hadoop 3.x series are compatible with each other.
-
-The steps are as follows:
-
-1. Compile Tez to get `tez.tar.gz`, which contains all Hadoop-related dependencies (not the Tez minimal tarball), by running `mvn clean install -DskipTests=true -Dmaven.javadoc.skip=true -Pdist -Paws -Pazure`. For more detail, see `https://tez.apache.org/install.html`. After building `tez.tar.gz`, set the following properties in `tez-site.xml`:
-
-```xml
-    <property>
-        <name>tez.lib.uris</name><!-- Example; replace with the actual HDFS path -->
-        <value>/apps/apache-tez-0.10.4-bin.tar.gz</value>
-    </property>
-    <property>
-        <name>tez.lib.uris.classpath</name><!-- Use only Tez's own libraries; do not use any libraries from the old-version Hadoop cluster -->
-        <value>$PWD/tezlib/*,$PWD/tezlib/lib/*</value>
-    </property>
-    <property>
-        <name>tez.use.cluster.hadoop-libs</name><!-- Use only Tez's own libraries; do not use any libraries from the old-version Hadoop cluster -->
-        <value>false</value>
-    </property>
-
-    <property>
-        <name>tez.am.launch.env</name><!-- Example; replace with the actual native-library install path. Reusing the old-version Hadoop cluster's native libraries is fine. -->
-        <value>LD_LIBRARY_PATH=/usr/hadoop/3.1.0/hadoop/lib/native</value>
-        <description>Users can set environment variables individually, including but not limited to JAVA_HOME and LD_LIBRARY_PATH.</description>
-    </property>
-
-    <property>
-        <name>tez.task.launch.env</name><!-- Example; replace with the actual native-library install path. Reusing the old-version Hadoop cluster's native libraries is fine. -->
-        <value>LD_LIBRARY_PATH=/usr/hadoop/3.1.0/hadoop/lib/native</value>
-        <description>Users can set environment variables individually, including but not limited to JAVA_HOME and LD_LIBRARY_PATH.</description>
-    </property>
-```
-
-2. Upload the Tez tarball to the HDFS path specified in `tez.lib.uris`. (Remember: do not use the minimal tarball.)
-
-```shell
-## Do not upload the minimal tarball!
-[hadoop@hive opt]# hdfs dfs -put apache-tez-0.10.4-bin.tar.gz /apps/
-```
-
-3. Download the newer Hadoop package (make sure the Hadoop version that Tez was built against is the same as the Hadoop version you download). Unpack Hive, Hadoop, and Tez into the installation path.
-
-```shell
-## In this example, Hive 4.0.1 and Tez 0.10.4 are installed on a Hadoop 3.1.0 cluster; install Hive, Hadoop, and Tez into your actual directories.
-[hadoop@hive opt]# cd /opt
-[hadoop@hive opt]# ll
-drwxr-xr-x 11 hive hadoop      4096 Nov  7 13:59 apache-hive-4.0.1-bin
-drwxr-xr-x  3 hive hadoop      4096 Nov  7 13:59 apache-tez-0.10.4-bin
-drwxr-xr-x 10 hive hadoop      4096 Nov  7 13:59 hadoop-3.3.6
-lrwxrwxrwx  1 hive hadoop        30 Nov  7 13:59 hive-4.0.1 -> apache-hive-4.0.1-bin
-lrwxrwxrwx  1 hive hadoop        21 Nov  7 13:59 tez -> apache-tez-0.10.4-bin
-```
-
-Edit `hive-env.sh`:
-
-```shell
-# Set HADOOP_HOME to point to a specific hadoop install directory
-HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-3.3.6}
-export HIVE_HOME=${HIVE_HOME:-/opt/hive-4.0.1}
-export TEZ_HOME=/opt/tez
-```
-
-Copy the old-version Hadoop cluster's configuration into the Hadoop 3.3.6+ configuration directory:
-
-```shell
-cp /usr/hadoop/3.1.0/hadoop/conf/*  /opt/hadoop-3.3.6/etc/hadoop/
-```
-
-Put `tez-site.xml` into the Hive conf directory:
-
-```shell
-mv tez-site.xml  /opt/hive-4.0.1/conf/
-```
-
-After completing the above steps, you should be able to start the HMS and HS2 services normally and submit Tez jobs without any issues.
-
-Through the above steps, we can run Hive 4 + Tez in any Hadoop 3 environment; there is no need to upgrade the cluster's original Hive/Hadoop/Tez.
-
-
 ## Next Steps
 
 You can begin using Hive as soon as it is installed; it should work on your computer. There is some extra information in the following sections.
