Github user suyanNone commented on the pull request:

    https://github.com/apache/spark/pull/3582#issuecomment-65891778
  
     Sorry for my poor comments and English.
    
    In summary:
    1. We run the put threads one by one until one of them succeeds.
    2. The (possibly many) doGetLocal threads and the single dropFromMemory thread each wait only one time, until a put attempt succeeds or fails. If that put failed, doGetLocal returns None, and dropFromMemory returns None as well.
    
    There are 3 places that call info.waitForReady():
    1. doGetLocal
    2. dropFromMemory
    3. doPut
    
    If many threads try to put the same block:
    For 1 (doGetLocal), I think the thread should just wait one time (Wait1Condition, now renamed OtherCondition) until the put succeeds or fails.
    For 2 (dropFromMemory), we should actually never end up calling dropFromMemory while the block is not ready. But the current code does have an info.waitForReady() call in dropFromMemory, so for compatibility let's wait only one time there as well (Wait1Condition) for the put to succeed or fail. Also, if we find one thread doing dropFromMemory, we should cancel all pending put threads.
    For 3 (doPut), run the put threads one by one until either one succeeds or some thread wants to drop the block from memory as described in 2. A put attempt can fail many times, hence WaitNCondition (now named PutCondition). A rough sketch of the whole scheme follows below.
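    To make the intended behavior concrete, here is a minimal, hypothetical sketch of the scheme above. It is not the actual BlockInfo code in this PR; the names (PendingBlockSketch, awaitPutTurn, awaitOnce, markFinished, requestDrop) are made up for illustration only:

    ```scala
    // Put threads take turns until one succeeds (or a drop cancels them);
    // get/drop threads wait a single time and give up if the put failed.
    class PendingBlockSketch {
      private var ready = false          // some put attempt has finished
      private var succeeded = false      // ... and it succeeded
      private var putInProgress = false  // a put attempt is currently running
      private var dropRequested = false  // dropFromMemory wants this block gone

      // doPut side (PutCondition): may wake up and wait again many times.
      // Returns true if it is now this thread's turn to attempt the put.
      def awaitPutTurn(): Boolean = synchronized {
        while (putInProgress && !succeeded && !dropRequested) wait()
        if (succeeded || dropRequested) false  // someone else succeeded, or a drop cancels us
        else { putInProgress = true; true }    // our turn to try the put
      }

      // doGetLocal / dropFromMemory side (OtherCondition): wait at most once.
      // Returns false when the attempt failed, so the caller returns None.
      def awaitOnce(): Boolean = synchronized {
        if (!ready) wait()
        succeeded
      }

      // Called by the putting thread when its attempt completes.
      def markFinished(ok: Boolean): Unit = synchronized {
        ready = true
        succeeded = ok
        putInProgress = false
        notifyAll()                            // wake both kinds of waiters
      }

      // Called by dropFromMemory to cancel all remaining put threads.
      def requestDrop(): Unit = synchronized {
        dropRequested = true
        notifyAll()
      }
    }
    ```

    The key point is that awaitPutTurn may block repeatedly (the "wait N times" PutCondition), while awaitOnce blocks at most once (OtherCondition) and lets the caller return None on failure.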
    
    All I want from WaitType (now renamed BlockWaitCondition) is to reuse the convenience of an enum for dispatching the wait methods, plus a variable that records the number of threads waiting for that block's put to finish. Each block object has its own wait count, which is why I extend Enumeration.
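    For illustration, here is how the "extend Enumeration so each value carries its own waiter count" idea could look. Again this is just a hypothetical sketch with made-up names (BlockWaitConditionSketch, ConditionVal, incWaiters, etc.), not the PR's actual fields:

    ```scala
    class BlockWaitConditionSketch extends Enumeration {
      // Each condition value tracks how many threads are currently waiting on it.
      protected case class ConditionVal(label: String) extends Val(label) {
        private var waiters = 0
        def incWaiters(): Unit = synchronized { waiters += 1 }
        def decWaiters(): Unit = synchronized { waiters -= 1 }
        def numWaiters: Int = synchronized { waiters }
      }

      // Put threads may wait many times; get/drop threads wait a single time.
      val PutCondition: ConditionVal = ConditionVal("PutCondition")
      val OtherCondition: ConditionVal = ConditionVal("OtherCondition")
    }
    ```

    Because this is a class rather than an object, each block can hold its own instance, so the waiter counts are kept per block, as described above.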