Github user squito commented on the pull request:

    https://github.com/apache/spark/pull/6935#issuecomment-115742256
  
    @steveloughran @XuTingjun sorry for the delay; I finally took a closer look here. I tried running a manual test on this patch and on https://github.com/apache/spark/pull/6545, and it seems to me that *neither* one solves the problem. (Or perhaps I am misunderstanding the issue.)
    
    Here are the steps I took:
    
    1) for this patch, modify the refresh time to 1 second just for testing
    2) run the history server within an sbt session
    ```
    build/sbt
    project core
    re-start
    <choose option for HistoryServer>
    ```
    3) In another window, fire up a spark-shell with event logging enabled, and 
run a few jobs (and leave the shell open)
    ```
    bin/spark-shell --conf "spark.eventLog.enabled=true"
    val d = sc.parallelize(1 to 100)
    d.count()
    d.count()
    ```
    4) View the app UI in the history server at localhost:18080, and also view the app's own UI at localhost:4040
    5) go back to the shell, run another `d.count()` job
    6) take another look at the UI, both for the app itself (which updates) and via the history server (which does not update, no matter how long I wait). A quick check that the event log itself keeps growing is sketched right after this list.
    
    @XuTingjun it sounded like you had done some manual testing on https://github.com/apache/spark/pull/6545 and believe that it should work. Is there something wrong with how I'm running my test?
    
    Unless I'm testing the wrong thing, I think this demonstrates the need for a unit test that does essentially the same thing, since it's hard to verify complete correctness here by hand. I'm writing that test case now.

