Jeff Zhang created SPARK-17125:
----------------------------------

             Summary: Allow specifying Spark config using non-string types in SparkR
                 Key: SPARK-17125
                 URL: https://issues.apache.org/jira/browse/SPARK-17125
             Project: Spark
          Issue Type: Improvement
          Components: SparkR
    Affects Versions: 2.0.0
            Reporter: Jeff Zhang
            Priority: Minor


I tried to specify the Spark conf spark.executor.instances as follows in SparkR, but it fails. Since an R list supports any data type, it is natural for a user to pass an int value for a configuration like spark.executor.instances.

{code}
sparkR.session(master = "yarn-client", sparkConfig = list(spark.executor.instances = 1))
{code}
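
The root cause is that a bare numeric literal in R is a double, not an integer, so the value apparently reaches the JVM side rendered as "1.0" (matching the error below). A quick check in plain R:

{code}
# In R, a bare numeric literal is a double; the L suffix makes an integer:
typeof(1)    # "double"
typeof(1L)   # "integer"
{code}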

{noformat}
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
  java.lang.IllegalArgumentException: spark.executor.instances should be int, but was 1.0
        at org.apache.spark.internal.config.ConfigHelpers$.toNumber(ConfigBuilder.scala:31)
        at org.apache.spark.internal.config.ConfigBuilder$$anonfun$intConf$1.apply(ConfigBuilder.scala:178)
        at org.apache.spark.internal.config.ConfigBuilder$$anonfun$intConf$1.apply(ConfigBuilder.scala:178)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.internal.config.OptionalConfigEntry.readFrom(ConfigEntry.scala:150)
        at org.apache.spark.internal.config.OptionalConfigEntry.readFrom(ConfigEntry.scala:138)
        at org.apache.spark.SparkConf.get(SparkConf.scala:251)
        at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.getInitialTargetExecutorNumber(YarnSparkHadoopUtil.scala:313)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:54)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:154)
        at org.apache.spark.SparkContext
{noformat}
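
Until non-string types are handled, a workaround sketch (an assumption based on the error above, which suggests string values are forwarded to the JVM verbatim): pass the value as a character string so no double-to-string coercion happens.

{code}
# Workaround sketch: pass the config value as a string so it reaches the
# JVM as "1" rather than "1.0" (assumes string values are passed through as-is)
sparkR.session(master = "yarn-client",
               sparkConfig = list(spark.executor.instances = "1"))
{code}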




