[ https://issues.apache.org/jira/browse/SPARK-27688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16892580#comment-16892580 ]
Yuming Wang commented on SPARK-27688:
-------------------------------------

{code:sh}
build/sbt clean package -Phive -Phive-thriftserver -Phadoop-3.2
export SPARK_PREPEND_CLASSES=true
sbin/stop-thriftserver.sh
{code}
{noformat}
[root@spark-3267648 apache-spark]# bin/beeline -u jdbc:hive2://localhost:10000/default --showDbInPrompt
NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Connecting to jdbc:hive2://localhost:10000/default
Connected to: Spark SQL (version 3.0.0-SNAPSHOT)
Driver: Hive JDBC (version 2.3.5)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.5 by Apache Hive
0: jdbc:hive2://localhost:10000/default (default)> use db2;
+---------+
| Result  |
+---------+
+---------+
No rows selected (0.168 seconds)
0: jdbc:hive2://localhost:10000/default (db2)>
0: jdbc:hive2://localhost:10000/default (db2)> use db1;
+---------+
| Result  |
+---------+
+---------+
No rows selected (0.091 seconds)
0: jdbc:hive2://localhost:10000/default (db1)>
0: jdbc:hive2://localhost:10000/default (db1)>
{noformat}

> Beeline should show database in the prompt
> ------------------------------------------
>
>                 Key: SPARK-27688
>                 URL: https://issues.apache.org/jira/browse/SPARK-27688
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.4.3
>            Reporter: Sandeep Katta
>            Priority: Minor
>
> Since [HIVE-14123|https://issues.apache.org/jira/browse/HIVE-14123] added display of the current database to Beeline, Spark should support it as well.



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
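For illustration, the prompt format that {{--showDbInPrompt}} produces in the session above ({{<id>: <jdbc-url> (<database>)>}}) can be sketched with a small, purely hypothetical shell helper; {{make_prompt}} is not part of Beeline, it only mirrors the format visible in the log:

{code:sh}
# Hypothetical helper that reproduces the prompt string seen above.
# The real logic lives inside Hive's Beeline (added by HIVE-14123).
make_prompt() {
  local id="$1" url="$2" db="$3"
  printf '%s: %s (%s)> ' "$id" "$url" "$db"
}

# After "use db2;" the prompt switches from (default) to (db2):
make_prompt 0 "jdbc:hive2://localhost:10000/default" "db2"
# -> 0: jdbc:hive2://localhost:10000/default (db2)>
{code}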