[ https://issues.apache.org/jira/browse/SPARK-8415?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14624058#comment-14624058 ]

Josh Rosen commented on SPARK-8415:
-----------------------------------

I figured out how to configure AMPLab Jenkins to use a separate Ivy cache for 
each pull request builder workspace.  In the Jenkins environment / properties 
injection, I added the following lines:

{code}
HOME=/home/sparkivy/${JOB_NAME}_${EXECUTOR_NUMBER}
SBT_OPTS=-Duser.home=/home/sparkivy/${JOB_NAME}_${EXECUTOR_NUMBER} -Dsbt.ivy.home=/home/sparkivy/${JOB_NAME}_${EXECUTOR_NUMBER}/.ivy2
{code}

Here, {{/home/sparkivy}} is a directory outside the build workspace, so it 
won't be deleted by the {{git clean -fdx}} step in our Jenkins build.  The 
variable substitutions ensure that each build gets its own independent 
directory.  I'm marking this issue as resolved since I'm switching the main 
SparkPullRequestBuilder to use this configuration change.
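For anyone who wants to reproduce the isolation locally, the same settings can be approximated in a plain shell.  This is only a sketch: {{JOB_NAME}} and {{EXECUTOR_NUMBER}} are normally injected by Jenkins, so the values below are placeholders.

```shell
# Placeholder values; Jenkins injects the real JOB_NAME / EXECUTOR_NUMBER.
JOB_NAME="SparkPullRequestBuilder"
EXECUTOR_NUMBER="3"

# Per-executor base directory, kept outside the workspace so that
# `git clean -fdx` in the build never touches it.
IVY_BASE="/home/sparkivy/${JOB_NAME}_${EXECUTOR_NUMBER}"

# Point both the JVM's user.home and sbt's Ivy home at the isolated dir.
export HOME="${IVY_BASE}"
export SBT_OPTS="-Duser.home=${IVY_BASE} -Dsbt.ivy.home=${IVY_BASE}/.ivy2"

echo "${SBT_OPTS}"
```

Because each (job, executor) pair resolves to a distinct directory, concurrent builds never contend for the same Ivy cache lock.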

> Jenkins compilation spends lots of time re-resolving dependencies and waiting 
> to acquire Ivy cache lock
> -------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8415
>                 URL: https://issues.apache.org/jira/browse/SPARK-8415
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Project Infra
>            Reporter: Josh Rosen
>
> When watching a pull request build, I noticed that the compilation + 
> packaging + test compilation phases spent huge amounts of time waiting to 
> acquire the Ivy cache lock.  We should see whether we can tell SBT to skip 
> the resolution steps for some of these commands, since this could speed up 
> the compilation process when Jenkins is heavily loaded.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
