Re: spark 1.6 Issue
Hi All,

It worked OK after adding the below in VM options:

-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea

Thanks
Sri

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-tp25893p25920.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
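If the same heap settings are also needed when tests run under a Maven build (VM options set in the IDE do not reach Maven's forked test JVMs), one common approach is to pass them via the Surefire plugin's argLine. This is an illustrative fragment, not taken from the thread; plugin version and exact options should match your own build:

```xml
<!-- Sketch: forward the same JVM options to test JVMs forked by `mvn install`. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea</argLine>
  </configuration>
</plugin>
```

The scalatest-maven-plugin accepts an analogous argLine setting if ScalaTest runs through that plugin instead of Surefire.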
spark 1.6 Issue
Hi All,

I am running my app in IntelliJ IDEA (locally) with master set to local[*]. The code worked OK with Spark 1.5, but after upgrading to 1.6 I am getting the issue below.

Is this a bug in 1.6? When I change back to 1.5 it works without any error. Do I need to pass executor memory when running locally in Spark 1.6?

Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 4.718592E8. Please use a larger heap size.

Thanks
Sri

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-tp25893.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: spark 1.6 Issue
It's not a bug, but a larger heap is required with the new UnifiedMemoryManager:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L172

On Wed, Jan 6, 2016 at 6:35 AM, kali.tumm...@gmail.com wrote:

> Exception in thread "main" java.lang.IllegalArgumentException: System memory
> 259522560 must be at least 4.718592E8. Please use a larger heap size.
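The numbers in the exception line up with the check in the linked source: Spark 1.6's UnifiedMemoryManager reserves a fixed 300 MB and requires the JVM heap to be at least 1.5x that reservation. A small worked example of that arithmetic:

```java
public class MinHeapCheck {
    public static void main(String[] args) {
        // Spark 1.6 reserves a fixed 300 MB of system memory
        long reservedMemory = 300L * 1024 * 1024;       // 314572800 bytes

        // and requires the heap to be at least 1.5x the reservation
        long minSystemMemory = reservedMemory * 3 / 2;  // 471859200 bytes

        // heap size reported in the exception message
        long reportedHeap = 259522560L;

        System.out.println(minSystemMemory);            // prints 471859200, i.e. 4.718592E8
        System.out.println(reportedHeap < minSystemMemory); // prints true: the check fails
    }
}
```

So any heap of 450 MB or more (e.g. -Xmx512m) clears the check, which is why the VM options fix in this thread works.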
Re: spark 1.6 Issue
Hi Mark,

I made the VM-options changes in the Edit Configurations section in IntelliJ for both the main method and the ScalaTest class, and they worked OK when executed individually. However, while running maven install to create the jar file, the test case is failing. Can I add the VM options in the SparkConf set in the Scala test class, hard-coded?

Thanks
Sri

Sent from my iPhone

> On 6 Jan 2016, at 17:43, Mark Hamstra wrote:
>
> It's not a bug, but a larger heap is required with the new UnifiedMemoryManager:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L172
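One way to hard-code this in the test itself: Spark 1.6 reads the internal setting spark.testing.memory before falling back to the JVM's max heap in UnifiedMemoryManager, so setting it in the SparkConf can satisfy the check without touching VM options. A config sketch (app name is a placeholder, and spark.testing.memory is an internal, test-oriented setting that may change between versions):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: override the heap size Spark 1.6 detects, for local tests only.
val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("MySpec") // placeholder name
  .set("spark.testing.memory", "536870912") // 512 MB, above the ~471859200-byte minimum
val sc = new SparkContext(conf)
```

Because this is an internal testing knob, raising the heap via JVM options (or the Maven plugin's argLine) is generally the safer route for anything beyond local test runs.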