Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017

@vanzin @ifilonenko can you guys try and build the dist locally on your dev laptops? Here's a little wrapper script to make it easier: you'll need to update your PATH to include some version of python3 (though I don't actually think it's necessary), as well as JAVA_HOME... might also need zinc in there as well.

```bash
#!/bin/bash

rm -f spark-*.tgz

export DATE=`date "+%Y%m%d"`
export REVISION=`git rev-parse --short HEAD`
export AMPLAB_JENKINS=1
export PATH="$PATH:/home/anaconda/envs/py3k/bin"

# Prepend JAVA_HOME/bin to fix issue where Zinc's embedded SBT incremental compiler seems to
# ignore our JAVA_HOME and use the system javac instead.
export PATH="$JAVA_HOME/bin:$PATH:/usr/local/bin"

# Generate a random port for Zinc (print() works under both Python 2 and 3)
export ZINC_PORT
ZINC_PORT=$(python -S -c "import random; print(random.randrange(3030, 4030))")

export SBT_OPTS="-Duser.home=$HOME -Dsbt.ivy.home=$HOME/.ivy2"
export SPARK_VERSIONS_SUITE_IVY_PATH="$HOME/.ivy2"

./dev/make-distribution.sh --name ${DATE}-${REVISION} --pip --tgz -DzincPort=${ZINC_PORT} \
  -Phadoop-2.7 -Pkubernetes -Pkinesis-asl -Phive -Phive-thriftserver

retcode=$?
exit $retcode
```
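Since the comment mentions putting python3 on PATH while the original script used the Python 2 `print` statement, here's a minimal stand-alone sketch of just the Zinc port-generation step that works under either interpreter (assuming a `python3` binary is on PATH; the range bounds match the script above):

```shell
#!/bin/bash
# Sketch only: generate a random Zinc port the same way the wrapper does,
# but with print() so it runs under Python 2 or 3.
ZINC_PORT=$(python3 -S -c "import random; print(random.randrange(3030, 4030))")

# randrange(3030, 4030) yields an integer in [3030, 4030).
if [ "$ZINC_PORT" -ge 3030 ] && [ "$ZINC_PORT" -lt 4030 ]; then
  echo "ok: ZINC_PORT=$ZINC_PORT"
else
  echo "unexpected port: $ZINC_PORT"
  exit 1
fi
```

This is just a sanity check you can run before the full build; the wrapper itself passes the port through to Maven via `-DzincPort=${ZINC_PORT}`.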