Hi Shawn,

Thanks a lot, that's actually the last parameter we overlooked!
I'm able to run the same SQL on Spark now after setting spark.driver.memory
larger. Thanks again!
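
For anyone hitting the same thing, the equivalent command-line form is
below; the 10g value is only an illustration, not the exact figure we used:

    spark-submit --driver-memory 10g ...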

--
Best Regards,
Felicia Shann
單師涵
+886-3-5636688 Ext. 7124300


From: Xiaoyu Ma <hzmaxiaoyu@corp.netease.com>
Date: 2015/07/07 4:03 PM
To: shsh...@tsmc.com
Cc: user@spark.apache.org, mike_s...@tsmc.com, linc...@tsmc.com
Subject: Re: SparkSQL OOM issue




Hi,
Where did the OOM happen, in the driver or in an executor?
The SparkSQL driver sometimes OOMs on tables with a large number of
partitions. If so, you might want to increase spark.driver.memory in
spark-defaults.conf.
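
A minimal sketch of the setting (8g is a placeholder; size it to your
table's partition count):

    # conf/spark-defaults.conf
    spark.driver.memory    8g

Note that this needs to go in spark-defaults.conf or be passed as
--driver-memory to spark-submit; setting it in SparkConf from application
code is too late in client mode, because the driver JVM has already started.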

Shawn



> On Jul 7, 2015, at 3:58 PM, shsh...@tsmc.com wrote:
>
>
> Dear all,
>
> We've tried to use SparkSQL to insert from table A into table B using the
> exact same SQL script; Hive is able to finish it, but Spark 1.3.1 always
> ends with an OOM issue. We tried several configurations, including:
>
> --executor-cores 2
> --num-executors 300
> --executor-memory 7g
> sconf.set("spark.storage.memoryFraction", "0")
>
> but none of them changed the resulting error:
> java.lang.OutOfMemoryError: GC overhead limit exceeded
> Is there any other configuration we can make? Thanks!
>




