[ https://issues.apache.org/jira/browse/SPARK-31231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-31231.
----------------------------------
    Assignee: Hyukjin Kwon
  Resolution: Fixed

> Support setuptools 46.1.0+ in PySpark packaging
> -----------------------------------------------
>
>                 Key: SPARK-31231
>                 URL: https://issues.apache.org/jira/browse/SPARK-31231
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.4.5, 3.0.0, 3.1.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Blocker
>
> The PIP packaging test started to fail (see
> https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/120218/testReport/)
> as of the setuptools 46.1.0 release.
> In https://github.com/pypa/setuptools/issues/1424, setuptools decided not to
> preserve file modes in {{package_data}}. The PySpark pip installation keeps its
> executable scripts in {{package_data}}
> (https://github.com/apache/spark/blob/master/python/setup.py#L199-L200) and
> exposes symbolic links to them as executable scripts.
> As a result, the symbolic links (or copied scripts) execute the scripts copied
> from {{package_data}}, whose modes were not preserved:
> {code}
> /tmp/tmp.UmkEGNFdKF/3.6/bin/spark-submit: line 27:
> /tmp/tmp.UmkEGNFdKF/3.6/lib/python3.6/site-packages/pyspark/bin/spark-class:
> Permission denied
> /tmp/tmp.UmkEGNFdKF/3.6/bin/spark-submit: line 27: exec:
> /tmp/tmp.UmkEGNFdKF/3.6/lib/python3.6/site-packages/pyspark/bin/spark-class:
> cannot execute: Permission denied
> {code}
> The setuptools side of the issue is being tracked at
> https://github.com/pypa/setuptools/issues/2041

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
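For illustration only (this is not the actual fix that closed this ticket): one common way to work around setuptools no longer preserving file modes for {{package_data}} entries is to restore the executable bits yourself after the files are copied, e.g. from a custom {{install}} command in {{setup.py}}. The sketch below assumes a hypothetical package layout with scripts under {{pyspark/bin}}; the class and function names are made up for this example.

```python
import os
import stat

from setuptools.command.install import install


def restore_executable_bits(script_dir):
    """Re-add the execute bits (like chmod +x) to every regular file in
    script_dir, compensating for setuptools 46.1.0+ not preserving the
    modes of files shipped via package_data."""
    for name in os.listdir(script_dir):
        path = os.path.join(script_dir, name)
        if os.path.isfile(path):
            mode = os.stat(path).st_mode
            os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)


class InstallWithExecutableBits(install):
    """Hypothetical install command: run the normal install, then fix the
    modes of the scripts that were copied from package_data."""

    def run(self):
        install.run(self)
        # Hypothetical location of the installed scripts.
        script_dir = os.path.join(self.install_lib, "pyspark", "bin")
        if os.path.isdir(script_dir):
            restore_executable_bits(script_dir)
```

To wire this in, such a project would pass `cmdclass={"install": InstallWithExecutableBits}` to `setup()`; whether that is an acceptable approach depends on how the package is installed (it does not help for builds that bypass the `install` command, such as wheels built once and reused).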