[ https://issues.apache.org/jira/browse/SPARK-7672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-7672:
-----------------------------
    Assignee: Nishkam Ravi

> Number format exception with spark.kryoserializer.buffer.mb
> -----------------------------------------------------------
>
>                 Key: SPARK-7672
>                 URL: https://issues.apache.org/jira/browse/SPARK-7672
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Nishkam Ravi
>            Assignee: Nishkam Ravi
>            Priority: Critical
>             Fix For: 1.4.0
>
> With spark.kryoserializer.buffer.mb 1000:
> Exception in thread "main" java.lang.NumberFormatException: Size must be specified as bytes (b), kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). E.g. 50b, 100k, or 250m. Fractional values are not supported. Input was: 1000000.0
> 	at org.apache.spark.network.util.JavaUtils.parseByteString(JavaUtils.java:238)
> 	at org.apache.spark.network.util.JavaUtils.byteStringAsKb(JavaUtils.java:259)
> 	at org.apache.spark.util.Utils$.byteStringAsKb(Utils.scala:1037)
> 	at org.apache.spark.SparkConf.getSizeAsKb(SparkConf.scala:245)
> 	at org.apache.spark.serializer.KryoSerializer.<init>(KryoSerializer.scala:53)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.spark.SparkEnv$.instantiateClass$1(SparkEnv.scala:269)
> 	at org.apache.spark.SparkEnv$.instantiateClassFromConf$1(SparkEnv.scala:280)
> 	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:283)
> 	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
> 	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
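The exception above comes from Spark's size-string parser, which accepts only whole numbers with an optional unit suffix, so the fractional "1000000.0" produced by converting the legacy `.mb` key is rejected. A simplified, self-contained Scala sketch of that parsing behavior (illustrative only; not the actual `JavaUtils.parseByteString` implementation):

```scala
// Simplified sketch of size-string parsing in the style of
// JavaUtils.parseByteString. Illustrative only -- the real Spark
// implementation differs in details.
object SizeParser {
  // Whole number followed by an optional single-letter unit suffix.
  private val SizePattern = "([0-9]+)([bkmgtp]?)".r

  private val multipliers: Map[String, Long] = Map(
    ""  -> 1L,
    "b" -> 1L,
    "k" -> 1024L,
    "m" -> 1024L * 1024,
    "g" -> 1024L * 1024 * 1024,
    "t" -> 1024L * 1024 * 1024 * 1024,
    "p" -> 1024L * 1024 * 1024 * 1024 * 1024
  )

  /** Parse a size string such as "50b", "100k", or "250m" into bytes. */
  def parseBytes(str: String): Long = str.trim.toLowerCase match {
    case SizePattern(num, suffix) => num.toLong * multipliers(suffix)
    case _ =>
      // Fractional input such as "1000000.0" falls through to here,
      // which is the failure mode reported in this issue.
      throw new NumberFormatException(
        s"Size must be specified as bytes (b), kibibytes (k), mebibytes (m), " +
        s"gibibytes (g), tebibytes (t), or pebibytes (p). " +
        s"Fractional values are not supported. Input was: $str")
  }
}
```

Under this scheme a value like "1000m" parses cleanly, so a plausible workaround before the 1.4.0 fix would have been to set the unit-suffixed key (e.g. `spark.kryoserializer.buffer=1000m`) rather than the legacy `spark.kryoserializer.buffer.mb` key, though the report itself does not spell this out.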