Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/21533
  
    Just took another look at this issue. I think the fix only makes it work, but not work correctly.
    
    Both the fix here and the original code treat the "local" scheme as if it were "file", but the two are actually different in Spark.
    
    In Spark "local" scheme means resources are already on the driver/executor 
nodes, which means Spark doesn't need to ship resources from driver to 
executors via fileserver. But here it treats as "file" which will be shipped 
via fileserver to executors. This is semantically not correct.
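    
    To make the distinction concrete, here is a rough sketch of the two schemes as seen from user code (the paths are made up for illustration):
    
    ```scala
    // "file" scheme: the file exists on the driver only and is served to
    // executors through the driver's file server.
    sc.addFile("file:///path/on/driver/data.txt")
    
    // "local" scheme: the file is assumed to already exist at this path on
    // every driver/executor node, so it should not go through the file server.
    sc.addFile("local:///path/on/every/node/data.txt")
    
    // In both cases, tasks should be able to resolve it with:
    val resolved = org.apache.spark.SparkFiles.get("data.txt")
    ```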
    
    I think for "local" scheme, the fix should:
    
    1. Make it accessible both from driver and executors via `SparkFiles#get`. 
By copying resource to the folders.
    2. It should not be added into fileServer.  
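    
    A minimal sketch of what point 1 could look like on the driver side (the helper name `addLocalFile` and the exact integration point in `SparkContext#addFile` are assumptions on my part, not the actual patch; the executors would need an equivalent local copy as well, since nothing is fetched from the file server):
    
    ```scala
    import java.io.File
    import java.net.URI
    import org.apache.commons.io.FileUtils
    import org.apache.spark.SparkFiles
    
    // Hypothetical helper for the "local" scheme: copy the resource into the
    // SparkFiles root so SparkFiles.get(fileName) resolves on this node, and
    // deliberately skip registering it with the driver's file server.
    def addLocalFile(path: String): Unit = {
      val uri = new URI(path)
      require(uri.getScheme == "local", s"Expected a local: URI, got: $path")
    
      val source = new File(uri.getPath)
      val target = new File(SparkFiles.getRootDirectory(), source.getName)
      FileUtils.copyFile(source, target)
    
      // No call to env.rpcEnv.fileServer.addFile(...) here: a "local" resource
      // must not be shipped from the driver to the executors.
    }
    ```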
    
    
    


