Hi,

Spark Thrift Server (STS) still uses the Hive thrift server. If you look at
$SPARK_HOME/sbin/start-thriftserver.sh you will see (mine is Spark 2):

function usage {
  echo "Usage: ./sbin/start-thriftserver [options] [thrift server options]"
  pattern="usage"
  pattern+="\|Spark assembly has been built with Hive"
  pattern+="\|NOTE: SPARK_PREPEND_CLASSES is set"
  pattern+="\|Spark Command: "
  pattern+="\|======="
  pattern+="\|--help"
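
As a quick sanity check (just a sketch, adjust SPARK_HOME to your install),
you can also grep the script to see which class it actually launches; on my
Spark 2 copy it shows the Hive thrift server class:

grep CLASS ${SPARK_HOME}/sbin/start-thriftserver.sh
# on my copy this prints something like
# CLASS="org.apache.spark.sql.hive.thriftserver.HiveThriftServer2"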


Indeed, when you start STS you pass hiveconf parameters to it:

${SPARK_HOME}/sbin/start-thriftserver.sh \
                --master  \
                --hiveconf hive.server2.thrift.port=10055 \

and STS bypasses the Spark optimiser and uses the Hive optimiser and execution
engine. You will see this in the hive.log file.
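
For completeness, this is how you would connect to that STS instance with
beeline (a sketch only; localhost and the user name hduser are assumptions,
and the port 10055 comes from the example above):

${SPARK_HOME}/bin/beeline -u jdbc:hive2://localhost:10055 -n hduser
# once connected you can run EXPLAIN on a query to inspect how it is planned, e.g.
# EXPLAIN SELECT COUNT(1) FROM mytable;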

So I don't think it is going to make much difference, unless they have
recently changed the design of STS.

HTH




Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 13 September 2016 at 22:32, Benjamin Kim <bbuil...@gmail.com> wrote:

> Does anyone have any thoughts about using Spark SQL Thriftserver in Spark
> 1.6.2 instead of HiveServer2? We are considering abandoning HiveServer2 for
> it. Some advice and gotchas would be nice to know.
>
> Thanks,
> Ben
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
