SparkSQL OOM issue

2015-07-07 Thread shshann
Dear all, we've tried to use SparkSQL to insert from table A into table B. Using the exact same SQL script, Hive is able to finish the job, but Spark 1.3.1 always ends with an OOM issue. We tried several configurations, including: --executor-cores 2 --num-executors 300
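For reference, a submission along the lines described above might look as follows. Only --executor-cores 2 and --num-executors 300 come from the post; the master, memory settings, and script name are illustrative assumptions, not the poster's actual values:

```shell
# Sketch of a Spark 1.3.1 job submission for an INSERT ... SELECT workload.
# --executor-cores and --num-executors are quoted from the thread;
# --executor-memory, --driver-memory, and the script name are assumed.
spark-submit \
  --master yarn-cluster \
  --executor-cores 2 \
  --num-executors 300 \
  --executor-memory 4g \
  --driver-memory 4g \
  insert_a_to_b.py
```

With many executors the driver also tracks more task and partition state, so driver-side memory can matter as much as executor memory for this kind of job.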

Re: SparkSQL OOM issue

2015-07-07 Thread Xiaoyu Ma
Hi, where did the OOM happen — in the driver or in an executor? The SparkSQL driver sometimes OOMs on tables with a large number of partitions. If so, you might want to increase spark.driver.memory in spark-defaults.conf. Shawn On Jul 7, 2015, at 3:58 PM, shsh...@tsmc.com wrote:
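The suggestion above can be sketched as a spark-defaults.conf entry; the 8g value is an assumption for illustration, not a figure from the thread:

```properties
# spark-defaults.conf
# Raise the driver heap so partition metadata for heavily partitioned
# tables fits in the driver JVM (value is illustrative; tune to your job).
spark.driver.memory  8g
```

The same setting can be passed per job as --driver-memory 8g on spark-submit; note that for spark-submit in client mode it must be set on the command line or in spark-defaults.conf, since the driver JVM has already started by the time application code runs.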

Re: SparkSQL OOM issue

2015-07-07 Thread shshann