lirui-apache commented on a change in pull request #11328: [FLINK-16455][hive] Introduce flink-sql-connector-hive modules to provide hive uber jars
URL: https://github.com/apache/flink/pull/11328#discussion_r389533074
 
 

 ##########
 File path: docs/dev/table/hive/index.md
 ##########
 @@ -92,6 +92,34 @@ to make the integration work in Table API program or SQL in SQL Client.
 Alternatively, you can put these dependencies in a dedicated folder, and add them to classpath with the `-C`
 or `-l` option for Table API program or SQL Client respectively.
 
+Apache Hive is built on Hadoop, so you need a Hadoop dependency first. Please refer to
+[Providing Hadoop classes]({{ site.baseurl }}/ops/deployment/hadoop.html#providing-hadoop-classes).
+
+There are two ways to add Hive dependencies. The first is to use a bundled Hive jar: Flink provides
+multiple bundled Hive jars that cover all remote metastore versions. The second is user-defined
+dependencies: you can build your own set of dependencies if you need to.
+
+#### Using a bundled Hive jar
+
+The following tables list all available bundled Hive jars. You can pick one and copy it to the `/lib/` directory of your Flink distribution.
+
+{% if site.is_stable %}
+
+| Remote Metastore   | Dependency Version  | Maven dependency             | SQL Client JAR         |
 
 Review comment:
   I don't think the `Dependency Version` column is useful. We just need to let users know which jar to download (or which mvn dependency to use), given the version of their HMS.
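
To make the "which mvn dependency to use" part concrete, a user's `pom.xml` entry for one of these bundled jars might look like the sketch below. The artifact naming follows the `flink-sql-connector-hive` modules this PR introduces, but the specific Hive version, Scala suffix, and version property are placeholder assumptions, not taken from the patch:

```xml
<!-- Sketch only: artifact naming follows this PR's flink-sql-connector-hive
     modules; the Hive version, Scala suffix, and version are placeholders. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-sql-connector-hive-2.3.6_2.11</artifactId>
  <version>${flink.version}</version>
  <scope>provided</scope>
</dependency>
```

Marking the dependency `provided` matches the docs' advice above: the jar is expected to sit in `/lib/` (or a `-C`/`-l` classpath folder) at runtime rather than be shaded into the user program.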

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
