Thomas Graves created SPARK-44134:
-------------------------------------

             Summary: Can't set resources (GPU/FPGA) to 0 when they are set to a positive value in spark-defaults.conf
                 Key: SPARK-44134
                 URL: https://issues.apache.org/jira/browse/SPARK-44134
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.0
            Reporter: Thomas Graves
With resource aware scheduling, if you specify a default value in spark-defaults.conf, a user can't override it to 0. Meaning spark-defaults.conf has something like:

{{spark.executor.resource.\{resourceName}.amount=1}}
{{spark.task.resource.\{resourceName}.amount=1}}

If the user tries to override these when submitting an application with {{spark.executor.resource.\{resourceName}.amount=0}} and {{spark.task.resource.\{resourceName}.amount=0}}, it gives the user an error:

{code:java}
23/06/21 09:12:57 ERROR Main: Failed to initialize Spark session.
org.apache.spark.SparkException: No executor resource configs were not specified for the following task configs: gpu
	at org.apache.spark.resource.ResourceProfile.calculateTasksAndLimitingResource(ResourceProfile.scala:206)
	at org.apache.spark.resource.ResourceProfile.$anonfun$limitingResource$1(ResourceProfile.scala:139)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.resource.ResourceProfile.limitingResource(ResourceProfile.scala:138)
	at org.apache.spark.resource.ResourceProfileManager.addResourceProfile(ResourceProfileManager.scala:95)
	at org.apache.spark.resource.ResourceProfileManager.<init>(ResourceProfileManager.scala:49)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:455)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
{code}

This used to work; my guess is it may have been broken by the stage level scheduling feature.
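For reference, a minimal sketch of one way to reproduce the override scenario, assuming the resource name is {{gpu}} and that spark-defaults.conf already contains the two amount=1 settings above (the overrides could equally be passed via {{--conf}} on spark-submit); setting both amounts to 0 in the application hits the ResourceProfileManager check in the stack trace:

{code:scala}
import org.apache.spark.sql.SparkSession

// Assumes spark-defaults.conf already contains:
//   spark.executor.resource.gpu.amount=1
//   spark.task.resource.gpu.amount=1
// The application then tries to turn the GPU requirement off by
// overriding both amounts to 0 at submit time.
val spark = SparkSession.builder()
  .appName("override-gpu-amount-to-zero")
  .config("spark.executor.resource.gpu.amount", "0")
  .config("spark.task.resource.gpu.amount", "0")
  .getOrCreate() // fails while building the default ResourceProfile
{code}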