[ 
https://issues.apache.org/jira/browse/SPARK-22325?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

shane knapp updated SPARK-22325:
--------------------------------
    Description: 
In the RISELab Jenkins master config, the SPARK_TESTING environment variable is
set to 1 and applied to all workers.

see:  
https://amplab.cs.berkeley.edu/jenkins/view/RISELab%20Infra/job/testing-foo/9/console
(the 'echo 1' in the console output is actually 'echo $SPARK_TESTING')

and:  https://amplab.cs.berkeley.edu/jenkins/job/testing-foo/10/injectedEnvVars/
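
For any non-Spark build, a quick pre-build guard makes the leak obvious (a
minimal sketch; the check and its message are mine, not something the jobs
currently run):

{code:python}
import os

# if the master-level config leaks into this job, SPARK_TESTING will be "1"
leaked = os.environ.get("SPARK_TESTING")
if leaked is not None:
    raise SystemExit(
        "SPARK_TESTING=%s leaked in from the Jenkins master config" % leaked)
{code}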

This is problematic: some of our lab builds run pyspark as part of their build
process, and the hard-coded checks for SPARK_TESTING in the pyspark setup
scripts cause hard failures.

see:  
https://amplab.cs.berkeley.edu/jenkins/job/ADAM-prb/2440/HADOOP_VERSION=2.6.2,SCALAVER=2.11,SPARK_VERSION=2.2.0,label=centos/consoleFull
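
The failure mode is roughly this (a hedged sketch of the kind of hard-coded
guard involved, not Spark's literal code):

{code:python}
import os
import sys

# pyspark's setup scripts branch on SPARK_TESTING and assume they are
# running inside Spark's own test harness (e.g. a source checkout with
# test fixtures in place); a downstream job like ADAM-prb has neither,
# so the test-only path fails hard
if os.environ.get("SPARK_TESTING"):
    sys.exit("test-only setup path taken, but this is not a Spark test run")
{code}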

I strongly suggest we do the following:
* remove the global SPARK_TESTING environment variable declaration from the
Jenkins master config
* add the environment variable to each Spark build config in GitHub (see the
sketch after this list):
https://github.com/databricks/spark-jenkins-configurations/
* add the environment variable to SparkPullRequestBuilder and
NewSparkPullRequestBuilder
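
Setting the variable per build is trivial; here is a sketch of what each Spark
job would do instead of relying on the master-level setting (dev/run-tests is
Spark's real test entry point, though the builds' actual test commands may
differ):

{code:python}
import os
import subprocess

# export SPARK_TESTING only in the environment of the Spark test run
# itself, instead of injecting it into every worker via the master
env = dict(os.environ, SPARK_TESTING="1")
subprocess.check_call(["./dev/run-tests"], env=env)
{code}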




> SPARK_TESTING env variable breaking non-spark builds on amplab jenkins
> ----------------------------------------------------------------------
>
>                 Key: SPARK-22325
>                 URL: https://issues.apache.org/jira/browse/SPARK-22325
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Project Infra
>    Affects Versions: 2.2.0
>         Environment: RISELab Jenkins, all workers (Ubuntu & CentOS)
>            Reporter: shane knapp
>            Priority: Critical


