Regarding Off-heap memory

2016-01-26 Thread Xiaoyu Ma
Hi all, I saw Spark 1.6 has a new off-heap setting: spark.memory.offHeap.size. The doc says we need to shrink the on-heap size accordingly. But on YARN, the on-heap size and the YARN limit are set together via spark.executor.memory (JVM opts for memory are not allowed, according to the doc), so how can we set executor
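A sketch of how the sizing commonly fits together on YARN in the Spark 1.6 era (the property names are real Spark settings; the values are illustrative, and the exact overhead accounting should be checked against the Spark 1.6 docs, since as far as I recall the off-heap region was not automatically added to the YARN container request at that time):

```properties
# spark-defaults.conf (illustrative values)
spark.executor.memory               4g       # on-heap size; also drives the YARN container request
spark.memory.offHeap.enabled        true
spark.memory.offHeap.size           2g       # off-heap region used by the unified memory manager
# In Spark 1.6 on YARN, the container limit is roughly executor memory + overhead,
# so the off-heap region is typically accounted for by raising the overhead (in MB):
spark.yarn.executor.memoryOverhead  3072
```

With this kind of setup, the YARN container has to be large enough to cover heap, off-heap, and JVM overhead combined; if the overhead is left at its default, the NodeManager may kill executors for exceeding their memory limit.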

Re: SparkSQL OOM issue

2015-07-07 Thread Xiaoyu Ma
Hi, Where did the OOM happen? In the driver or an executor? Sometimes the SparkSQL driver OOMs on tables with a large number of partitions. If so, you might want to increase spark.driver.memory in spark-defaults.conf. Shawn On Jul 7, 2015, at 3:58 PM, shsh...@tsmc.com wrote: Dear all, We've tried to
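The suggested fix is a one-line config change (the value below is illustrative; size it to the actual partition-metadata footprint). Note that for client-mode deployments, spark.driver.memory must be set in spark-defaults.conf or via --driver-memory at submit time, not in SparkConf inside the application, because the driver JVM has already started by then:

```properties
# spark-defaults.conf (illustrative value)
spark.driver.memory  4g
```

Equivalently at submit time: spark-submit --driver-memory 4g ...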

Spark on Hadoop 2.5.2

2015-07-01 Thread Xiaoyu Ma
Hi guys, I was trying to deploy the SparkSQL thrift server on Hadoop 2.5.2 with Kerberos / Hive 0.13. It seems I got the problem below when I tried to start the thrift server. java.lang.NoSuchFieldError: SASL_PROPS at
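For context: NoSuchFieldError: SASL_PROPS on secured clusters is characteristic of a Hadoop version mismatch, since the SASL_PROPS field of SaslRpcServer was removed in later Hadoop 2.x releases, so code compiled against an older Hadoop fails at runtime. A common remedy was rebuilding Spark against the cluster's exact Hadoop version; a build sketch along those lines for Spark 1.x (the profile and property names should be verified against the Spark build docs for the release in use):

```
# Not runnable here; run from the Spark source tree.
build/mvn -Pyarn -Phive -Phive-thriftserver \
  -Dhadoop.version=2.5.2 \
  -DskipTests clean package
```

If the error persists, it is worth checking the classpath for stray Hadoop or Hive jars from a different version that shadow the intended ones.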