stczwd commented on a change in pull request #28048: [SPARK-31142][PYSPARK] Remove useless conf set in pyspark context
URL: https://github.com/apache/spark/pull/28048#discussion_r400585844
##########
File path: python/pyspark/context.py
##########
@@ -181,11 +181,6 @@ def _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize,
         self.appName = self._conf.get("spark.app.name")
         self.sparkHome = self._conf.get("spark.home", None)

-        for (k, v) in self._conf.getAll():

Review comment:
   The main change is in YARN and K8s, not in Spark itself, so we should adapt to it. What I am claiming is that, with this code, the environment configuration seen by the JVM and by the Python worker won't be the same, and the configuration passed into the Python worker is a mess. Try `--conf spark.executorEnv.LD_LIBRARY_PATH=$PYTHONHOME/lib/python2.7/site-packages:$LD_LIBRARY_PATH` and you will see what I mean.
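   A minimal sketch of what the removed block did, assuming the loop body continued as in earlier versions of `python/pyspark/context.py` (only the first removed line is visible in the hunk above, so the rest is reconstructed and may not match the PR exactly): every `spark.executorEnv.*` value is copied verbatim, unexpanded, into the environment dict that the driver later ships to Python workers, while YARN/K8s set the JVM-side executor environment independently.

```python
# Sketch only: the loop body is reconstructed from older context.py versions
# and is an assumption, not the exact code removed by the PR.
from pyspark import SparkConf

# Simulate what arrives after spark-submit with
#   --conf spark.executorEnv.LD_LIBRARY_PATH=$PYTHONHOME/lib/python2.7/site-packages:$LD_LIBRARY_PATH
# If the shell does not expand the variables, the literal string reaches Spark.
conf = SparkConf()
conf.set("spark.executorEnv.LD_LIBRARY_PATH",
         "$PYTHONHOME/lib/python2.7/site-packages:$LD_LIBRARY_PATH")

environment = {}  # stand-in for SparkContext.environment

# The removed loop: copy every spark.executorEnv.* value verbatim into the
# environment dict that is later passed to Python workers.
for (k, v) in conf.getAll():
    if k.startswith("spark.executorEnv."):
        varName = k[len("spark.executorEnv."):]
        environment[varName] = v

# The Python worker would get the unexpanded string, while YARN/K8s resolve
# the JVM-side executor environment separately, so the two can diverge.
print(environment["LD_LIBRARY_PATH"])
# -> $PYTHONHOME/lib/python2.7/site-packages:$LD_LIBRARY_PATH
```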