Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15659#discussion_r86286268
  
    --- Diff: dev/run-pip-tests-2 ---
    @@ -0,0 +1,94 @@
    +#!/usr/bin/env bash
    +
    +#
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#    http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +#
    +
    +# Stop on error
    +set -e
    +# Set nullglob for when we are checking existence based on globs
    +shopt -s nullglob
    +
    +FWDIR="$(cd "$(dirname "$0")"/..; pwd)"
    +cd "$FWDIR"
    +# Some systems don't have pip or virtualenv - in those cases our tests won't work.
    +if ! hash virtualenv 2>/dev/null; then
    +  echo "Missing virtualenv skipping pip installability tests."
    +  exit 0
    +fi
    +if ! hash pip 2>/dev/null; then
    +  echo "Missing pip, skipping pip installability tests."
    +  exit 0
    +fi
    +
    +if [ -d ~/.cache/pip/wheels/ ]; then
    +  echo "Cleaning up pip wheel cache so we install the fresh package"
    +  rm -rf ~/.cache/pip/wheels/
    +fi
    +
    +# Figure out which Python execs we should test pip installation with
    +PYTHON_EXECS=()
    +if hash python 2>/dev/null; then
    +  # We do this since we are testing with virtualenv and the default virtual env python
    +  # is in /usr/bin/python
    +  PYTHON_EXECS+=('python')
    +fi
    +if hash python3 2>/dev/null; then
    +  PYTHON_EXECS+=('python3')
    +fi
    +
    +for python in "${PYTHON_EXECS[@]}"; do
    +  echo "Testing pip installation with python $python"
    +  # Create a temp directory for us to work in and save its name to a file for cleanup
    +  echo "Constucting virtual env for testing"
    +  mktemp -d > ./virtual_env_temp_dir
    +  VIRTUALENV_BASE=`cat ./virtual_env_temp_dir`
    +  echo "Using $VIRTUALENV_BASE for virtualenv"
    +  virtualenv --python=$python $VIRTUALENV_BASE
    +  source $VIRTUALENV_BASE/bin/activate
    +  # Upgrade pip
    +  pip install --upgrade pip
    +
    +  echo "Creating pip installable source dist"
    +  cd $FWDIR/python
    +  $python setup.py sdist
    +
    +  echo "Installing dist into virtual env"
    +  cd dist
    +  # Verify that the dist directory only contains one thing to install
    +  sdists=(*.tar.gz)
    +  if [ ${#sdists[@]} -ne 1 ]; then
    +    echo "Unexpected number of targets found in dist directory - please 
cleanup existing sdists first."
    +    exit -1
    +  fi
    +  # Do the actual installation
    +  pip install --upgrade --force-reinstall *.tar.gz
    +
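    +  # Move out of the source tree so the sanity checks below exercise the
    +  # pip-installed package rather than the local python/pyspark sources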
    +  cd /
    +
    +  echo "Run basic sanity check on pip installed version with spark-submit"
    +  spark-submit $FWDIR/dev/pip-sanity-check.py
    +  echo "Run basic sanity check with import based"
    +  python $FWDIR/dev/pip-sanity-check.py
    --- End diff ---
    
    yeah. nvm. `virtualenv` will take care of that.
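    
    For reference, a minimal sketch of what activation takes care of here (the
    venv path below is illustrative, not taken from the script):
    
        $ virtualenv --python=python3 /tmp/demo-venv
        $ source /tmp/demo-venv/bin/activate
        $ command -v python pip    # both now resolve inside the venv
        /tmp/demo-venv/bin/python
        /tmp/demo-venv/bin/pip
        $ deactivate               # restores the original PATH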

