AngersZhuuuu commented on a change in pull request #29881:
URL: https://github.com/apache/spark/pull/29881#discussion_r509938592



##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
##########
@@ -80,20 +81,41 @@ private[spark] object HiveUtils extends Logging {
   val HIVE_METASTORE_JARS = buildStaticConf("spark.sql.hive.metastore.jars")
     .doc(s"""
       | Location of the jars that should be used to instantiate the HiveMetastoreClient.
-      | This property can be one of three options: "
+      | This property can be one of four options: "
       | 1. "builtin"
       |   Use Hive ${builtinHiveVersion}, which is bundled with the Spark assembly when
       |   <code>-Phive</code> is enabled. When this option is chosen,
       |   <code>spark.sql.hive.metastore.version</code> must be either
       |   <code>${builtinHiveVersion}</code> or not defined.
       | 2. "maven"
       |   Use Hive jars of specified version downloaded from Maven repositories.
-      | 3. A classpath in the standard format for both Hive and Hadoop.
+      | 3. "path"
+      |   Use Hive jars configured by `spark.sql.hive.metastore.jars.path`
+      |   in comma separated format. Support both local or remote paths, it should always
+      |   be fully qualified URL to indicate other file systems.

Review comment:
       > `it should always be fully qualified URL ...` it's not true now.
   
   Removed this now
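
For readers following this thread, here is a minimal sketch (not part of the PR diff) of how the new `path` option is expected to be used from the application side; the Hive version and jar locations below are assumptions for illustration only:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical usage: both confs are static, so they must be in place before the
// first SparkSession (and therefore the Hive metastore client) is created.
val spark = SparkSession.builder()
  .appName("hive-metastore-jars-path-example")
  .enableHiveSupport()
  .config("spark.sql.hive.metastore.version", "2.3.7")  // assumed metastore version
  .config("spark.sql.hive.metastore.jars", "path")
  .config("spark.sql.hive.metastore.jars.path",
    "hdfs://nameservice/path/to/jar/*,file:///local/hive/jars/*")  // hypothetical locations
  .getOrCreate()
```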

##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala
##########
@@ -80,20 +81,41 @@ private[spark] object HiveUtils extends Logging {
   val HIVE_METASTORE_JARS = buildStaticConf("spark.sql.hive.metastore.jars")
     .doc(s"""
       | Location of the jars that should be used to instantiate the HiveMetastoreClient.
-      | This property can be one of three options: "
+      | This property can be one of four options: "
       | 1. "builtin"
       |   Use Hive ${builtinHiveVersion}, which is bundled with the Spark assembly when
       |   <code>-Phive</code> is enabled. When this option is chosen,
       |   <code>spark.sql.hive.metastore.version</code> must be either
       |   <code>${builtinHiveVersion}</code> or not defined.
       | 2. "maven"
       |   Use Hive jars of specified version downloaded from Maven repositories.
-      | 3. A classpath in the standard format for both Hive and Hadoop.
+      | 3. "path"
+      |   Use Hive jars configured by `spark.sql.hive.metastore.jars.path`
+      |   in comma separated format. Support both local or remote paths, it should always
+      |   be fully qualified URL to indicate other file systems.
+      | 4. A classpath in the standard format for both Hive and Hadoop.
       """.stripMargin)
     .version("1.4.0")
     .stringConf
     .createWithDefault("builtin")
 
+  val HIVE_METASTORE_JARS_PATH = buildStaticConf("spark.sql.hive.metastore.jars.path")
+    .doc(s"Comma separated fully qualified URL of Hive jars, support both local and remote paths," +
+      s"Such as: " +
+      s" 1. file://path/to/jar/xxx.jar" +
+      s" 2. hdfs://nameservice/path/to/jar/xxx.jar" +
+      s" 3. /path/to/jar/ (path without URI scheme follow conf 
`fs.defaultFS`'s URI schema)" +
+      s" 4. [http/https/ftp]://path/to/jar/xxx.jar" +
+      s"Notice: `http/https/ftp` doesn't support wildcard, but for other URLs 
support" +
+      s" nested path wildcard, Such as: " +
+      s" 1. file://path/to/jar/*, file://path/to/jar/*/*" +
+      s" 2. hdfs://nameservice/path/to/jar/*, 
hdfs://nameservice/path/to/jar/*/*" +

Review comment:
       > ditto, `\n`
   
   Done
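
To make the wildcard note in the quoted doc string concrete, here is a hedged sketch (not the PR's actual implementation) of how nested globs such as `*` and `*/*` could be expanded through Hadoop's `FileSystem.globStatus`, while `http/https/ftp` URLs are passed through untouched; `expandJarPaths` is a hypothetical helper name:

```scala
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

// Hypothetical helper: expand a comma separated list of jar locations, resolving
// "*" / "*/*" globs for Hadoop-compatible schemes only.
def expandJarPaths(paths: String, hadoopConf: Configuration): Seq[URI] = {
  paths.split(",").map(_.trim).filter(_.nonEmpty).flatMap { p =>
    val uri = new URI(p)
    uri.getScheme match {
      // http/https/ftp don't support wildcards, so keep these URLs as-is.
      case "http" | "https" | "ftp" => Seq(uri)
      case _ =>
        // Paths without a URI scheme resolve against fs.defaultFS.
        val path = new Path(p)
        val fs = path.getFileSystem(hadoopConf)
        // globStatus returns null when nothing matches the pattern.
        Option(fs.globStatus(path)).toSeq.flatten.map(_.getPath.toUri)
    }
  }.toSeq
}
```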



