Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/22192
  
    > It seems like the line
    > Thread.currentThread().setContextClassLoader(replClassLoader)
    > is causing the pyspark failures
    
    What if you restore the original class loader after initializing the
    plugins? I was a little worried about this call, but was waiting for
    tests... so if it's causing problems, it's better not to change the way
    things work there.

