[ https://issues.apache.org/jira/browse/SPARK-1879?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tathagata Das resolved SPARK-1879.
----------------------------------

    Resolution: Fixed

> Default PermGen size too small when using Hadoop2 and Hive
> ----------------------------------------------------------
>
>                 Key: SPARK-1879
>                 URL: https://issues.apache.org/jira/browse/SPARK-1879
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Matei Zaharia
>            Assignee: Matei Zaharia
>            Priority: Critical
>             Fix For: 1.0.0
>
>
> If you launch spark-shell with Hadoop 2 and Hive on the classpath and try 
> to use Hive from it, the PermGen quickly reaches 85 MB after a few commands, 
> at which point the JVM gives up and freezes. We should pass a larger 
> -XX:MaxPermSize to prevent this. Unfortunately, passing that flag produces a 
> warning on Java 8, but that's still better than not passing it.
> I don't think this affects anything other than the shell; it's just the 
> combination of the Scala compiler + Hive + Hadoop 2 that pushes things over 
> the edge.
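
For reference, a quick way to check how much of the permanent generation is in
use from inside the shell is the standard java.lang.management API. A minimal
sketch in Scala (assuming a pre-Java-8 JVM, where a "Perm Gen"-style memory
pool actually exists; on Java 8+ PermGen is replaced by Metaspace):

    import java.lang.management.ManagementFactory
    import scala.collection.JavaConverters._

    // Find any memory pool whose name mentions the permanent generation and
    // print its current usage against its configured maximum.
    ManagementFactory.getMemoryPoolMXBeans.asScala
      .filter(_.getName.toLowerCase.contains("perm"))
      .foreach { pool =>
        val usage = pool.getUsage
        println(f"${pool.getName}: ${usage.getUsed / 1048576.0}%.1f MB used / " +
                f"${usage.getMax / 1048576.0}%.1f MB max")
      }

Raising the limit amounts to passing something like -XX:MaxPermSize=128m to the
JVM that backs the shell; the exact value and the launch-script plumbing used in
the actual fix are not stated in this issue.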



--
This message was sent by Atlassian JIRA
(v6.2#6252)
