Repository: spark
Updated Branches:
  refs/heads/master 3b4f35f56 -> 5330c192b


[HOTFIX] Fix PySpark pip packaging tests broken by a non-ASCII character

## What changes were proposed in this pull request?

pip installation requires packaging the bin scripts together.

https://github.com/apache/spark/blob/master/python/setup.py#L71
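
For illustration only (not Spark's actual `python/setup.py`), a minimal setuptools configuration that ships bin scripts with the package might look like the sketch below; the project and script names are hypothetical:

```python
# Minimal sketch, assuming setuptools; names are illustrative and do not
# mirror Spark's real python/setup.py.
from setuptools import setup

setup(
    name="example-pyspark-like",
    version="0.0.1",
    # Bundled shell scripts pass through setuptools' copy machinery during
    # `pip install` and the pip packaging tests.
    scripts=["bin/docker-image-tool.sh"],
)
```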

The recent fix at 
https://github.com/apache/spark/commit/ec96d34e74148803190db8dcf9fda527eeea9255 
introduced a non-ASCII character (a non-breaking space, I guess) into one of 
those scripts.

This is usually not a problem, but it looks like Jenkins's default encoding is 
`ascii`, and while copying the script there appears to be an implicit 
conversion between bytes and strings that uses that default encoding:

https://github.com/pypa/setuptools/blob/v40.4.3/setuptools/command/develop.py#L185-L189
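
A minimal sketch of the failure mode, assuming a bytes-to-str conversion that falls back to the interpreter's default `ascii` codec (the byte string below is illustrative, not the exact line from the script):

```python
# The offending comment contained U+00A0 (non-breaking space); its UTF-8
# encoding is the two bytes 0xC2 0xA0.
script_bytes = b"# ... into the image are\xc2\xa0present\n"

try:
    # What an implicit bytes -> str conversion does when the default
    # encoding is ascii, as it appears to be on the Jenkins workers.
    script_bytes.decode("ascii")
except UnicodeDecodeError as err:
    print("copy fails:", err)

# With a UTF-8 default encoding the same bytes decode fine.
print(script_bytes.decode("utf-8"))
```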

## How was this patch tested?

Jenkins

Closes #22782 from HyukjinKwon/pip-failure-fix.

Authored-by: hyukjinkwon <gurwls...@apache.org>
Signed-off-by: hyukjinkwon <gurwls...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5330c192
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5330c192
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5330c192

Branch: refs/heads/master
Commit: 5330c192bd87eb18351e72e390baf29855d99b0a
Parents: 3b4f35f
Author: hyukjinkwon <gurwls...@apache.org>
Authored: Sun Oct 21 02:04:45 2018 +0800
Committer: hyukjinkwon <gurwls...@apache.org>
Committed: Sun Oct 21 02:04:45 2018 +0800

----------------------------------------------------------------------
 bin/docker-image-tool.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/5330c192/bin/docker-image-tool.sh
----------------------------------------------------------------------
diff --git a/bin/docker-image-tool.sh b/bin/docker-image-tool.sh
index 001590a..7256355 100755
--- a/bin/docker-image-tool.sh
+++ b/bin/docker-image-tool.sh
@@ -79,7 +79,7 @@ function build {
   fi
 
   # Verify that Spark has actually been built/is a runnable distribution
-  # i.e. the Spark JARs that the Docker files will place into the image are present
+  # i.e. the Spark JARs that the Docker files will place into the image are present
   local TOTAL_JARS=$(ls $JARS/spark-* | wc -l)
   TOTAL_JARS=$(( $TOTAL_JARS ))
   if [ "${TOTAL_JARS}" -eq 0 ]; then

