Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18948#discussion_r133610439
  
    --- Diff: 
streaming/src/main/scala/org/apache/spark/streaming/StreamingContext.scala ---
    @@ -144,6 +144,13 @@ class StreamingContext private[streaming] (
         }
       }
     
    +  if (sc.conf.contains("spark.cores.max")) {
    +    val totalCores = sc.conf.getInt("spark.cores.max", 1)
    --- End diff --
    
    @jiangxb1987 "spark.cores.max" is a per-application configuration that 
limits the number of cores that can be requested for the whole application; 
it is not a per-executor limitation. 
     
    > The config spark.cores.max is used to limit the max number of cores that 
a single executor can require
    
    So if we have 2 receivers in one streaming application, the minimum 
number of cores must be greater than 2; comparing against "1" here is still 
not sufficient.
    
    Since the number of receivers is only known at runtime, checking the 
configuration here will not work as expected.
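    
    To make the point concrete, here is a minimal sketch (not the actual 
Spark implementation; `hasEnoughCores`, `coresMax`, and `numReceivers` are 
illustrative names) of the kind of check that would be needed: each receiver 
occupies one core for its lifetime, so an application capped by 
"spark.cores.max" needs strictly more cores than receivers to leave at least 
one core for processing the received data.
    
    ```scala
    // Sketch only: "spark.cores.max" caps cores for the WHOLE application,
    // not per executor, so the cap must exceed the receiver count.
    def hasEnoughCores(coresMax: Option[Int], numReceivers: Int): Boolean =
      coresMax match {
        // Each receiver pins one core; at least one extra core is needed
        // to actually process the received data.
        case Some(totalCores) => totalCores > numReceivers
        case None             => true // no cap configured, nothing to validate
      }
    ```
    
    And since `numReceivers` is only known after the DStream graph is built 
at runtime, a static comparison against "1" at construction time cannot 
express this condition.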


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
