Github user zzcclp commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1455#discussion_r149575324
  
    --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/LoadTableCommand.scala ---
    @@ -84,6 +84,44 @@ case class LoadTableCommand(
     
         val carbonProperty: CarbonProperties = CarbonProperties.getInstance()
         carbonProperty.addProperty("zookeeper.enable.lock", "false")
    +
    +    val numCoresLoading =
    +      try {
    +        Integer.parseInt(CarbonProperties.getInstance()
    +            .getProperty(CarbonCommonConstants.NUM_CORES_LOADING,
    +                CarbonCommonConstants.NUM_CORES_MAX_VAL.toString()))
    +      } catch {
    +        case exc: NumberFormatException =>
    +          LOGGER.error("Configured value for property " + CarbonCommonConstants.NUM_CORES_LOADING
    +              + " is invalid.")
    +          CarbonCommonConstants.NUM_CORES_MAX_VAL
    +      }
    +
    +    val newNumCoresLoading =
    +      if (sparkSession.sparkContext.conf.contains("spark.executor.cores")) {
    +        // If running on YARN,
    +        // take the minimum of 'spark.executor.cores' and NUM_CORES_LOADING:
    +        // if the user sets NUM_CORES_LOADING, it can't exceed 'spark.executor.cores';
    +        // if the user doesn't set NUM_CORES_LOADING, 'spark.executor.cores' is used,
    +        // capped at NUM_CORES_MAX_VAL
    +        // (NUM_CORES_LOADING's default value is NUM_CORES_MAX_VAL).
    +        Math.min(
    +          sparkSession.sparkContext.conf.getInt("spark.executor.cores", 1),
    +          numCoresLoading
    +        )
    +      } else {
    +        // If running in local mode,
    +        // take the minimum of NUM_CORES_DEFAULT_VAL and NUM_CORES_LOADING.
    +        Math.min(
    +          Integer.parseInt(CarbonCommonConstants.NUM_CORES_DEFAULT_VAL),
    +          numCoresLoading
    +        )
    +      }
    +
    +    // update the property with the new value
    +    carbonProperty.addProperty(CarbonCommonConstants.NUM_CORES_LOADING,
    --- End diff --
    
    I find that the value of 'spark.task.cpus' can't be modified dynamically; it is read from SparkConf once at initialization.
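    
    For reference, the capping logic in the diff above can be sketched as a standalone function. The constant values below are assumed for illustration only; CarbonData's actual `NUM_CORES_MAX_VAL` and `NUM_CORES_DEFAULT_VAL` may differ, and `effectiveLoadingCores` is a hypothetical helper, not part of the PR:
    
    ```scala
    object NumCoresExample {
      // Assumed stand-ins for CarbonCommonConstants.NUM_CORES_MAX_VAL
      // and CarbonCommonConstants.NUM_CORES_DEFAULT_VAL.
      val NumCoresMaxVal = 32
      val NumCoresDefaultVal = 2
    
      // Cap the configured loading cores: on YARN by 'spark.executor.cores',
      // in local mode by the default core count. A malformed or missing
      // NUM_CORES_LOADING setting falls back to NumCoresMaxVal, mirroring
      // the catch branch in the diff.
      def effectiveLoadingCores(configured: Option[String],
                                executorCores: Option[Int]): Int = {
        val requested =
          try {
            configured.map(s => Integer.parseInt(s)).getOrElse(NumCoresMaxVal)
          } catch {
            case _: NumberFormatException => NumCoresMaxVal
          }
        executorCores match {
          case Some(cores) => Math.min(cores, requested)            // YARN
          case None        => Math.min(NumCoresDefaultVal, requested) // local mode
        }
      }
    }
    ```
    
    With these assumed constants, a user-requested value of 8 on a 4-core executor yields 4, while an unparseable setting in local mode falls back to the default of 2.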

