Hi,
Where did the OOM happen?
In the driver or an executor?
The Spark SQL driver sometimes OOMs on tables with a large number of partitions.
If so, you might want to increase spark.driver.memory in spark-defaults.conf.
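
For example, a minimal sketch (the 8g value is just an illustration; size it to your cluster):

    # conf/spark-defaults.conf
    spark.driver.memory    8g

or equivalently on the command line:

    spark-submit --driver-memory 8g ...

Note that in client mode the driver JVM has already started by the time your SparkConf is applied, so setting spark.driver.memory programmatically has no effect there; use spark-defaults.conf or the --driver-memory flag instead.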

Shawn



> On Jul 7, 2015, at 3:58 PM, shsh...@tsmc.com wrote:
> 
> 
> Dear all,
> 
> We tried to use Spark SQL to insert from table A into table B.
> With the exact same SQL script, Hive is able to finish the job,
> but Spark 1.3.1 always ends with an OOM issue. We tried several
> configurations, including:
> 
> --executor-cores 2
> --num-executors 300
> --executor-memory 7g
> sconf.set("spark.storage.memoryFraction", "0")
> 
> but none of them changed the resulting error:
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> Is there any other configuration we can try? Thanks!



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
