Repository: spark
Updated Branches:
  refs/heads/branch-2.1 d887f7581 -> f719cccdc


[SPARK-19572][SPARKR] Allow to disable hive in sparkR shell

## What changes were proposed in this pull request?
SPARK-15236 did this for the Scala shell; this ticket does the same for the 
sparkR shell. This benefits not only sparkR itself, but also downstream 
projects such as Livy, which uses shell.R for its interactive sessions. 
Currently, Livy has no control over whether Hive is enabled.
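The check this patch adds can be sketched as a small pure predicate. The object and helper below (`HiveCatalogCheck.useHiveCatalog`) are illustrative names, not Spark internals: Hive support is used only when the Hive classes are on the classpath, the caller requested Hive, and `spark.sql.catalogImplementation` is unset or set to `hive`.

```scala
object HiveCatalogCheck {
  // Sketch of the condition added to SQLUtils.getOrCreateSparkSession;
  // catalogImpl models the spark.sql.catalogImplementation setting
  // (None = not set, which defaults to "hive").
  def useHiveCatalog(
      hiveClassesPresent: Boolean,
      enableHiveSupport: Boolean,
      catalogImpl: Option[String]): Boolean = {
    hiveClassesPresent && enableHiveSupport &&
      catalogImpl.getOrElse("hive").toLowerCase == "hive"
  }

  def main(args: Array[String]): Unit = {
    // Default: Hive is used when available and requested.
    assert(useHiveCatalog(true, true, None))
    // --conf spark.sql.catalogImplementation=in-memory now disables Hive.
    assert(!useHiveCatalog(true, true, Some("in-memory")))
    // Without Hive classes, it always falls back, as before.
    assert(!useHiveCatalog(false, true, Some("hive")))
  }
}
```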

## How was this patch tested?

Tested manually: ran `bin/sparkR --master local --conf 
spark.sql.catalogImplementation=in-memory` and verified that Hive is not enabled.

Author: Jeff Zhang <zjf...@apache.org>

Closes #16907 from zjffdu/SPARK-19572.

(cherry picked from commit 7315880568fd07d4dfb9f76d538f220e9d320c6f)
Signed-off-by: Felix Cheung <felixche...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f719cccd
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f719cccd
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f719cccd

Branch: refs/heads/branch-2.1
Commit: f719cccdc46247d7d86a99a1eb177522d4a657ae
Parents: d887f75
Author: Jeff Zhang <zjf...@apache.org>
Authored: Tue Feb 28 22:21:29 2017 -0800
Committer: Felix Cheung <felixche...@apache.org>
Committed: Tue Feb 28 22:21:58 2017 -0800

----------------------------------------------------------------------
 .../src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala   | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f719cccd/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala b/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
index e56c33e..a4c5bf7 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
@@ -47,12 +47,14 @@ private[sql] object SQLUtils extends Logging {
       jsc: JavaSparkContext,
       sparkConfigMap: JMap[Object, Object],
       enableHiveSupport: Boolean): SparkSession = {
-    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport) {
+    val spark = if (SparkSession.hiveClassesArePresent && enableHiveSupport
+        && jsc.sc.conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase == "hive") {
       SparkSession.builder().sparkContext(withHiveExternalCatalog(jsc.sc)).getOrCreate()
     } else {
       if (enableHiveSupport) {
         logWarning("SparkR: enableHiveSupport is requested for SparkSession but " +
-          "Spark is not built with Hive; falling back to without Hive support.")
+          s"Spark is not built with Hive or ${CATALOG_IMPLEMENTATION.key} is not set to 'hive', " +
+          "falling back to without Hive support.")
       }
       SparkSession.builder().sparkContext(jsc.sc).getOrCreate()
     }

