Thank you Takeshi it works fine now.
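
For anyone finding this thread later: per Takeshi's advice, spark.sql.shuffle.partitions has to be supplied when the Thrift Server starts, e.g. as a --conf flag on the launch command (or in spark-defaults.conf), not set afterwards. A minimal sketch reusing the flags from my original command (the master URL and ports are from my own setup):

```shell
# Pass spark.sql.shuffle.partitions at server startup via --conf;
# setting it after the server is already running does not take effect.
./sbin/start-thriftserver.sh \
  --master mesos://zk://master1:2181,master2:2181,master3:2181/mesos \
  --conf spark.driver.memory=5G \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.sql.shuffle.partitions=10 \
  --hiveconf hive.server2.thrift.port=10000
```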

Regards,
Chanh


> On Jul 28, 2016, at 2:03 PM, Takeshi Yamamuro <linguin....@gmail.com> wrote:
> 
> Hi,
> 
> You need to set that value when you start the server; setting it afterwards has no effect.
> 
> // maropu
> 
> On Thu, Jul 28, 2016 at 3:59 PM, Chanh Le <giaosu...@gmail.com> wrote:
> Hi everyone,
> 
> I set spark.sql.shuffle.partitions=10 after starting the Spark Thrift Server, but it 
> seems not to be working.
> 
> ./sbin/start-thriftserver.sh --master 
> mesos://zk://master1:2181,master2:2181,master3:2181/mesos --conf 
> spark.driver.memory=5G --conf spark.scheduler.mode=FAIR --class 
> org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --jars 
> /home/spark/spark-2.0.0-bin-hadoop2.6/jars/alluxio-core-client-spark-1.2.0-jar-with-dependencies.jar
>  --total-executor-cores 35 spark-internal --hiveconf 
> hive.server2.thrift.port=10000 --hiveconf 
> hive.metastore.warehouse.dir=/user/hive/warehouse --hiveconf 
> hive.metastore.metadb.dir=/user/hive/metadb
> 
> Has anyone run into the same issue?
> <Screen Shot 2016-07-28 at 1.59.38 PM.png>
> <Screen Shot 2016-07-28 at 1.58.03 PM.png>
> 
> 
> 
> -- 
> ---
> Takeshi Yamamuro
