Re: sbt/sbt run command returns a JVM problem
Thanks very much, it seems to be working now.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157p14870.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
RE: sbt/sbt run command returns a JVM problem
Hi, I still have over 1G left for my program.

Date: Sun, 4 May 2014 19:14:30 -0700
From: ml-node+s1001560n5340...@n3.nabble.com
To: gyz...@hotmail.com
Subject: Re: sbt/sbt run command returns a JVM problem

> The total memory of your machine is 2G, right? Then how much memory is left
> free? Wouldn't Ubuntu take up quite a big portion of the 2G? Just a guess!
>
> On Sat, May 3, 2014 at 8:15 PM, Carter [hidden email] wrote:
> > Hi, thanks for all your help. I tried your setting in the sbt file, but
> > the problem is still there. The Java setting in my sbt file is:
> >
> >     java \
> >       -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m \
> >       -jar ${JAR} \
> >       "$@"
> >
> > I have tried to set these 3 parameters bigger and smaller, but nothing
> > works. Did I change the right thing? Thank you very much.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157p5412.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: sbt/sbt run command returns a JVM problem
Hi Michael,

The log after I typed "last" is as below:

    > last
    scala.tools.nsc.MissingRequirementError: object scala not found.
        at scala.tools.nsc.symtab.Definitions$definitions$.getModuleOrClass(Definitions.scala:655)
        at scala.tools.nsc.symtab.Definitions$definitions$.getModule(Definitions.scala:605)
        at scala.tools.nsc.symtab.Definitions$definitions$.ScalaPackage(Definitions.scala:145)
        at scala.tools.nsc.symtab.Definitions$definitions$.ScalaPackageClass(Definitions.scala:146)
        at scala.tools.nsc.symtab.Definitions$definitions$.AnyClass(Definitions.scala:176)
        at scala.tools.nsc.symtab.Definitions$definitions$.init(Definitions.scala:814)
        at scala.tools.nsc.Global$Run.init(Global.scala:697)
        at sbt.compiler.Eval$$anon$1.init(Eval.scala:53)
        at sbt.compiler.Eval.run$1(Eval.scala:53)
        at sbt.compiler.Eval.unlinkAll$1(Eval.scala:56)
        at sbt.compiler.Eval.eval(Eval.scala:62)
        at sbt.EvaluateConfigurations$.evaluateSetting(Build.scala:104)
        at sbt.BuiltinCommands$$anonfun$set$1.apply(Main.scala:212)
        at sbt.BuiltinCommands$$anonfun$set$1.apply(Main.scala:209)
        at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:60)
        at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:60)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:62)
        at sbt.Command$.process(Command.scala:90)
        at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
        at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(MainLoop.scala:71)
        at sbt.State$$anon$2.process(State.scala:171)
        at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
        at sbt.MainLoop$$anonfun$next$1.apply(MainLoop.scala:71)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
        at sbt.MainLoop$.next(MainLoop.scala:71)
        at sbt.MainLoop$.run(MainLoop.scala:64)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:53)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:50)
        at sbt.Using.apply(Using.scala:25)
        at sbt.MainLoop$.runWithNewLog(MainLoop.scala:50)
        at sbt.MainLoop$.runAndClearLast(MainLoop.scala:33)
        at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:17)
        at sbt.MainLoop$.runLogged(MainLoop.scala:13)
        at sbt.xMain.run(Main.scala:26)
        at xsbt.boot.Launch$.run(Launch.scala:55)
        at xsbt.boot.Launch$$anonfun$explicit$1.apply(Launch.scala:45)
        at xsbt.boot.Launch$.launch(Launch.scala:60)
        at xsbt.boot.Launch$.apply(Launch.scala:16)
        at xsbt.boot.Boot$.runImpl(Boot.scala:31)
        at xsbt.boot.Boot$.main(Boot.scala:20)
        at xsbt.boot.Boot.main(Boot.scala)
    [error] scala.tools.nsc.MissingRequirementError: object scala not found.
    [error] Use 'last' for the full log.

And my sbt file is like below (my sbt launcher is sbt-launch-0.12.4.jar in
the same folder):

    #!/bin/bash
    #
    # Licensed to the Apache Software Foundation (ASF) under one or more
    # contributor license agreements. See the NOTICE file distributed with
    # this work for additional information regarding copyright ownership.
    # The ASF licenses this file to You under the Apache License, Version 2.0
    # (the "License"); you may not use this file except in compliance with
    # the License. You may obtain a copy of the License at
    #
    #    http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.
    #

    # This script launches sbt for this project. If present it uses the system
    # version of sbt. If there is no system version of sbt it attempts to download
    # sbt locally.
    SBT_VERSION=`awk -F "=" '/sbt\.version/ {print $2}' ./project/build.properties`
    URL1=http://typesafe.artifactoryonline.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/${SBT_VERSION}/sbt-launch.jar
    URL2=http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/${SBT_VERSION}/sbt-launch.jar
    JAR=sbt/sbt-launch-${SBT_VERSION}.jar

    # Download sbt launch jar if it hasn't been downloaded yet
    if [ ! -f ${JAR} ]; then
      # Download
      printf "Attempting to fetch sbt\n"
      if hash curl 2>/dev/null; then
        curl --progress-bar ${URL1} > ${JAR} || curl --progress-bar ${URL2} > ${JAR}
      elif hash wget 2>/dev/null; then
        wget --progress=bar ${URL1} -O ${JAR} || wget --progress=bar ${URL2} -O ${JAR}
      else
        printf "You do not have curl or wget installed, please install sbt manually from http://www.scala-sbt.org/\n"
        exit -1
      fi
    fi
    if [ ! -f ${JAR} ]; then
      # We failed to download
      printf "Our attempt to
Re: sbt/sbt run command returns a JVM problem
The total memory of your machine is 2G, right? Then how much memory is left
free? Wouldn't Ubuntu take up quite a big portion of the 2G? Just a guess!

On Sat, May 3, 2014 at 8:15 PM, Carter gyz...@hotmail.com wrote:
> Hi, thanks for all your help. I tried your setting in the sbt file, but the
> problem is still there. The Java setting in my sbt file is:
>
>     java \
>       -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m \
>       -jar ${JAR} \
>       "$@"
>
> I have tried to set these 3 parameters bigger and smaller, but nothing
> works. Did I change the right thing? Thank you very much.
Re: sbt/sbt run command returns a JVM problem
Hi, thanks for all your help. I tried your setting in the sbt file, but the
problem is still there. The Java setting in my sbt file is:

    java \
      -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m \
      -jar ${JAR} \
      "$@"

I have tried to set these 3 parameters bigger and smaller, but nothing works.
Did I change the right thing? Thank you very much.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157p5267.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: sbt/sbt run command returns a JVM problem
The problem is probably not with the JVM running sbt but with the one that
sbt is forking to run your program. See here for the relevant option:
https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L186

You might try starting sbt with no arguments (to bring up the sbt console).
You can then `set javaOptions += "-Xmx1G"` and afterwards try `run`.

Michael

On Sat, May 3, 2014 at 5:15 AM, Carter gyz...@hotmail.com wrote:
> Hi, thanks for all your help. I tried your setting in the sbt file, but the
> problem is still there. The Java setting in my sbt file is:
>
>     java \
>       -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m \
>       -jar ${JAR} \
>       "$@"
>
> I have tried to set these 3 parameters bigger and smaller, but nothing
> works. Did I change the right thing? Thank you very much.
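[Editor's note: the forked-JVM options Michael describes would normally live in the project's build definition rather than the sbt console. A minimal sketch of that (sbt 0.12-era setting syntax; the 1g value is a placeholder to adjust, not a confirmed fix from this thread):

    // build.sbt -- sketch only
    fork in run := true               // run the program in a separate JVM
    javaOptions in run += "-Xmx1g"    // heap limit for that forked JVM only

With `fork in run := false` (the default in plain sbt projects), `javaOptions` has no effect, because the program runs inside sbt's own JVM.]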
Re: sbt/sbt run command returns a JVM problem
Hi Michael,

Thank you very much for your reply. Sorry, I am not very familiar with sbt.
Could you tell me where to set the Java option for the sbt fork for my
program? I brought up the sbt console and ran `set javaOptions += "-Xmx1G"`
in it, but it returned an error:

    [error] scala.tools.nsc.MissingRequirementError: object scala not found.
    [error] Use 'last' for the full log.

Is this the right way to set the Java option? Thank you very much.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157p5294.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
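[Editor's note: the archive appears to have stripped quote characters throughout this thread, so it is possible (an assumption, not stated in the thread) that the command was typed without quotes. `set` expressions in the sbt console are parsed as Scala, so the option string must be quoted; the intended session would look something like:

    > set javaOptions += "-Xmx1g"
    > run

Without the quotes, `-Xmx1g` is not a valid Scala expression and `set` fails.]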
Re: sbt/sbt run command returns a JVM problem
Here's how I configure SBT, which I think is the usual way:

    export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m -Xmx1g"

See if that takes. But your error is that you're already asking for too much
memory for your machine. So maybe you are setting the value successfully, but
it's not valid. How big?

On Thu, May 1, 2014 at 2:57 PM, Chester Chen chesterxgc...@yahoo.com wrote:
> You might want to check the memory settings in sbt itself, which is a shell
> script that runs a java command. I don't have a computer at hand, but if
> you vim or cat sbt/sbt, you should see the memory settings, which you can
> change to fit your needs. You might also be able to override the settings
> by changing .sbtopts without changing the script, but google it to be sure.
>
> Chester
>
> Sent from my iPhone
>
> On May 1, 2014, at 6:47 AM, Carter gyz...@hotmail.com wrote:
> > Hi, I have a very simple Spark program written in Scala:
> >
> >     /*** testApp.scala ***/
> >     object testApp {
> >       def main(args: Array[String]) {
> >         println("Hello! World!")
> >       }
> >     }
> >
> > Then I use the following command to compile it:
> >
> >     $ sbt/sbt package
> >
> > The compilation finished successfully and I got a JAR file. But when I
> > use this command to run it:
> >
> >     $ sbt/sbt run
> >
> > it returned an error with the JVM:
> >
> >     [info] Error occurred during initialization of VM
> >     [info] Could not reserve enough space for object heap
> >     [error] Error: Could not create the Java Virtual Machine.
> >     [error] Error: A fatal exception has occurred. Program will exit.
> >     java.lang.RuntimeException: Nonzero exit code returned from runner: 1
> >         at scala.sys.package$.error(package.scala:27)
> >
> > My machine has 2G of memory and runs Ubuntu 11.04. I also tried to change
> > the Java parameters (e.g., -Xmx, -Xms, -XX:MaxPermSize,
> > -XX:ReservedCodeCacheSize) in the file sbt/sbt, but it looks like none of
> > the changes work. Can anyone help me out with this problem? Thank you
> > very much.
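[Editor's note: to make Sean's point concrete, the flags in Carter's sbt file add up to more than a 2G machine can usually spare once the OS has taken its share. A throwaway shell sketch of that arithmetic; the `pick_heap` helper and the half-of-RAM heuristic are illustrative, not from the thread:

```shell
#!/bin/sh
# Sum the memory regions the JVM flags ask it to reserve.
# Values are the ones from Carter's sbt file, in megabytes.
heap=1200        # -Xmx1200m
permgen=350      # -XX:MaxPermSize=350m
codecache=256    # -XX:ReservedCodeCacheSize=256m
requested=$((heap + permgen + codecache))
echo "JVM may try to reserve up to ${requested}m"   # 1806m on a 2G machine

# A conservative rule of thumb: cap the heap at half of physical RAM.
pick_heap() {
  total_mb=$1
  echo "-Xmx$((total_mb / 2))m"
}
pick_heap 2048   # prints -Xmx1024m for a 2G machine
```

So even before Ubuntu's own footprint is counted, the three flags together approach the machine's entire 2G, which is consistent with "Could not reserve enough space for object heap".]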
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.