[ https://issues.apache.org/jira/browse/SPARK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16764803#comment-16764803 ]

shahid commented on SPARK-26760:
--------------------------------

Yes. Writing to the store too frequently can be a costly operation, so task 
info is written to the store only after a particular time interval. As a 
result, for a running job, the task info may not be updated immediately in the 
UI. 
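
If fresher counts are needed, the flush interval can be lowered at the cost of 
more frequent store writes. A minimal sketch, assuming 
spark.ui.liveUpdate.period (default 100ms) is the interval in question:

{code}
# Sketch: lower the live-update flush interval so task info reaches the
# status store (and hence the UI) more often. Smaller values add overhead.
bin/spark-shell --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.ui.liveUpdate.period=50ms
{code}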

 !Screenshot from 2019-02-11 15-09-09.png! 

Regarding how many tasks are actually running for a running job, we need to 
check the logs or console, as the UI may show slightly more or fewer depending 
on when the store was last updated.
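
For a cross-check of the live numbers, the driver's status REST API is one 
option; a minimal sketch, assuming the default UI port 4040, with 
<driver-host> and <app-id> as placeholders:

{code}
# Sketch: compare "activeTasks" against "totalCores" for each executor.
curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors
# Each executor entry in the JSON response includes "totalCores" and
# "activeTasks"; a brief mismatch may only reflect update timing.
{code}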



> [Incorrect display in Spark UI Executor tab: Active Tasks shows 5 when the 
> number of cores is 4]
> -------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26760
>                 URL: https://issues.apache.org/jira/browse/SPARK-26760
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.0
>         Environment: Spark 2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>         Attachments: SPARK-26760.png, Screenshot from 2019-02-11 15-09-09.png
>
>
> Steps:
>  # Launch Spark Shell 
>  # bin/spark-shell --master yarn  --conf spark.dynamicAllocation.enabled=true 
> --conf spark.dynamicAllocation.initialExecutors=3 --conf 
> spark.dynamicAllocation.minExecutors=1 --conf 
> spark.dynamicAllocation.executorIdleTimeout=60s --conf 
> spark.dynamicAllocation.maxExecutors=5
>  # Submit a job: sc.parallelize(1 to 10000,116000).count()
>  # Check the YARN UI Executor Tab for the RUNNING application
> UI displays Number of Cores as 4 while the Active Tasks column shows 5
> Expected:
> The number of Active Tasks should be the same as the number of cores.
>  


