Re: [Declarative Pipeline] Option to override --entry-point cat

2017-04-12 Thread Dan Tran
I ended up starting mongod in the background in my Jenkinsfile.
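A minimal sketch of that approach (assuming mongod is installed in the build image and /tmp is writable; the image name and test command are placeholders):

pipeline {
    agent {
        docker { image "xxx/yyy-build:1.0.0" }   // placeholder image
    }
    stages {
        stage('Test') {
            steps {
                // Fork mongod into the background inside the container, then run the tests.
                sh 'mkdir -p /tmp/mongo-data && mongod --fork --dbpath /tmp/mongo-data --logpath /tmp/mongod.log'
                sh 'mvn test'   // placeholder test command
            }
        }
    }
}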

-D



Re: Declarative pipelines per branch and reusable stages (keeping it DRY)

2017-04-12 Thread Kenneth Brooks
Oh, and thanks for the tip on readTrusted! Totally missed that one. And yes, 
load is problematic because it requires the workspace (and thus a node).

On Wednesday, April 12, 2017 at 4:53:23 PM UTC-4, Patrick Wolf wrote:
>
> Feel free to open a JIRA ticket, but I'm not a huge fan of this because it 
> is counter to the KISS principle we wanted with Declarative and breaks the 
> Blue Ocean editor.  We have discussed having multiple "stages" blocks but 
> rejected that because it quickly becomes needlessly complex without adding 
> any use-case coverage. IMO, having multiple "stages" makes much more sense 
> than having multiple "pipelines"; otherwise you will have to recreate all of 
> the agent, environment, libraries, options, parameters, etc. sections for 
> each pipeline, which leads to wanting those sections to be DRY as well, and 
> Declarative pretty much falls apart completely.
>
> BTW, it is already possible to have multiple 'pipeline' closures in a 
> single Jenkinsfile, but they will be treated as parts of a whole Pipeline 
> and this cannot be used in the editor.  Because the Jenkinsfile is treated 
> as one continuous Pipeline, anything outside of the pipeline closures is 
> interpreted as Scripted Pipeline. This means you can use 'if' blocks around 
> the separate 'pipeline' blocks instead of using 'load' if you choose, but 
> keeping them in separate files makes maintenance easier, I think.
>
> if (BRANCH_NAME.startsWith("develop")) {
> pipeline {  }
> } 
>
>
> Also, it's worth noting that 'readTrusted' probably works better than 
> 'load' because it takes the committer into account and doesn't require 
> a workspace.
>
>
> https://jenkins.io/doc/pipeline/steps/workflow-multibranch/#code-readtrusted-code-read-trusted-file-from-scm
>
> As for DRY stages there are several ways to accomplish this with Pipeline.
>
> 1. Shared Library and Resources - This is the preferred method of creating 
> DRY routines
>
> You create a global variable that has all of the steps you want (with 
> appropriate variable replacement for environment variables). You could have 
> a build.groovy global variable in the /vars directory that does all of your 
> build steps. Then the steps in your stage can be a single line.
>
> Alternatively, you can store shell scripts in the /resources of your 
> shared library and run those in your steps without having to duplicate 
> anything:
>
> https://gist.github.com/HRMPW/92231e7b2344f20d9cc9d5f2eb778a54
>
> 2. You can define your steps directly in the Jenkinsfile at the top level 
> either as strings or methods and simply call that method from within each 
> pipeline.
>
> 3. You can define your steps in a configuration file, as a properties or YAML 
> file, and load those files using the Pipeline Utility Steps plugin. 
> https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Utility+Steps+Plugin
>
> To sum up, I think having different stages is worth discussing (it is not 
> going to be implemented in the short term) but there are already many 
> existing ways to make Pipelines DRY.
>
> On Tuesday, April 11, 2017 at 8:43:49 AM UTC-7, Kenneth Brooks wrote:
>>
>> TL;DR up front:
>> *As a user, I want to have a pipeline that performs specific pipeline 
>> stages based on the branch. Recommendation: Put the when{} condition 
>> outside the pipeline{} tag.*
>> *As a user, I want to declare my stages but have the implementation be 
>> separate so that I can reuse them in multiple pipelines*. 
>>
>> Currently the Declarative syntax has the ability to perform a stage 
>> conditionally using 'when' but not a whole pipeline.
>> This makes the pipeline fairly inflexible and much harder to 
>> read through.
>>
>> Take for example:
>>
>> pipeline {
>>
>>stages {
>>  stage('Build') {
>>when { branch "develop || master || feature"} // not the real syntax, I know
>>steps { /* do some build stuff */ }
>>  }
>>
>>  stage('Scan') {
>>when { branch "master"}
>>steps { /* run static code analysis or other code scanning */}
>>  }
>>
>>  stage('Pull Request Build') {
>>when { branch "PR-*"}
>>steps { /* do a merge build stuff */ }
>>  }
>>
>>  stage('Dev Deploy') {
>>when { branch "develop || master"}
>>steps { /* deploy to dev */ }
>>  }
>>
>>  stage('Pull Request Deploy') {
>>when { branch "PR-*"}
>>steps { /* deploy to special PR sandbox */}
>>  }
>>   }
>> }
>>
>>
>> In this simple example, the following will happen, but it is extremely hard 
>> to follow.
>>
>> Feature -> Build
>> Master -> Build, Scan, Dev Deploy
>> Develop -> Build, Dev Deploy
>> Pull Request -> Pull Request Build, Pull Request Deploy
>>
>> I would suggest we allow the when to be placed at the pipeline level somehow.
>>
>> pipeline('master') { // Just for naming
>>   when { branch "master" }
>>   stages {
>> stage('Build'){
>>   steps { /* do some build stuff */ }
>> }
>> stage('Scan'){
>> 

Re: Declarative pipelines per branch and reusable stages (keeping it DRY)

2017-04-12 Thread Kenneth Brooks
Quickly trying out your suggestion of putting the pipeline directly inside the if. 
It sees the stages, but something is not right: it doesn't see/execute the 
environment section (which means the credentials('cred') aren't being loaded).

Here is what I see when the pipeline {} is on its own:

[Pipeline] withEnv
[Pipeline] {
[Pipeline] withCredentials
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Feature Build)


Here is what I see when I wrap a simple if around the pipeline:

[Pipeline] stage
[Pipeline] { (Feature Build)
[Pipeline] node


Am I missing something or should I file a bug?




On Wednesday, April 12, 2017 at 4:53:23 PM UTC-4, Patrick Wolf wrote:
>
> Feel free to open a JIRA ticket, but I'm not a huge fan of this because it 
> is counter to the KISS principle we wanted with Declarative and breaks the 
> Blue Ocean editor.  We have discussed having multiple "stages" blocks but 
> rejected that because it quickly becomes needlessly complex without adding 
> any use-case coverage. IMO, having multiple "stages" makes much more sense 
> than having multiple "pipelines"; otherwise you will have to recreate all of 
> the agent, environment, libraries, options, parameters, etc. sections for 
> each pipeline, which leads to wanting those sections to be DRY as well, and 
> Declarative pretty much falls apart completely.
>
> BTW, it is already possible to have multiple 'pipeline' closures in a 
> single Jenkinsfile, but they will be treated as parts of a whole Pipeline 
> and this cannot be used in the editor.  Because the Jenkinsfile is treated 
> as one continuous Pipeline, anything outside of the pipeline closures is 
> interpreted as Scripted Pipeline. This means you can use 'if' blocks around 
> the separate 'pipeline' blocks instead of using 'load' if you choose, but 
> keeping them in separate files makes maintenance easier, I think.
>
> if (BRANCH_NAME.startsWith("develop")) {
> pipeline {  }
> } 
>
>
> Also, it's worth noting that 'readTrusted' probably works better than 
> 'load' because it takes the committer into account and doesn't require 
> a workspace.
>
>
> https://jenkins.io/doc/pipeline/steps/workflow-multibranch/#code-readtrusted-code-read-trusted-file-from-scm
>
> As for DRY stages there are several ways to accomplish this with Pipeline.
>
> 1. Shared Library and Resources - This is the preferred method of creating 
> DRY routines
>
> You create a global variable that has all of the steps you want (with 
> appropriate variable replacement for environment variables). You could have 
> a build.groovy global variable in the /vars directory that does all of your 
> build steps. Then the steps in your stage can be a single line.
>
> Alternatively, you can store shell scripts in the /resources of your 
> shared library and run those in your steps without having to duplicate 
> anything:
>
> https://gist.github.com/HRMPW/92231e7b2344f20d9cc9d5f2eb778a54
>
> 2. You can define your steps directly in the Jenkinsfile at the top level 
> either as strings or methods and simply call that method from within each 
> pipeline.
>
> 3. You can define your steps in a configuration file, as a properties or YAML 
> file, and load those files using the Pipeline Utility Steps plugin. 
> https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Utility+Steps+Plugin
>
> To sum up, I think having different stages is worth discussing (it is not 
> going to be implemented in the short term) but there are already many 
> existing ways to make Pipelines DRY.
>
> On Tuesday, April 11, 2017 at 8:43:49 AM UTC-7, Kenneth Brooks wrote:
>>
>> TL;DR up front:
>> *As a user, I want to have a pipeline that performs specific pipeline 
>> stages based on the branch. Recommendation: Put the when{} condition 
>> outside the pipeline{} tag.*
>> *As a user, I want to declare my stages but have the implementation be 
>> separate so that I can reuse them in multiple pipelines*. 
>>
>> Currently the Declarative syntax has the ability to perform a stage 
>> conditionally using 'when' but not a whole pipeline.
>> This makes the pipeline fairly inflexible and much harder to 
>> read through.
>>
>> Take for example:
>>
>> pipeline {
>>
>>stages {
>>  stage('Build') {
>>when { branch "develop || master || feature"} // not the real syntax, I know
>>steps { /* do some build stuff */ }
>>  }
>>
>>  stage('Scan') {
>>when { branch "master"}
>>steps { /* run static code analysis or other code scanning */}
>>  }
>>
>>  stage('Pull Request Build') {
>>when { branch "PR-*"}
>>steps { /* do a merge build stuff */ }
>>  }
>>
>>  stage('Dev Deploy') {
>>when { branch "develop || master"}
>>steps { /* deploy to dev */ }
>>  }
>>
>>  stage('Pull Request Deploy') {
>>when { branch "PR-*"}
>>steps { /* deploy to special PR sandbox */}
>>  }
>>   }
>> }
>>
>>
>> In this simple example, the following will happen, but it is extremely hard 
>> to follow.
>>

Re: Declarative Syntax: ability to run multiple stages inside a single agent

2017-04-12 Thread Kenneth Brooks
Thanks for the quick reply.

Unfortunately no.
We will have lots of other stages down the line that require different 
capabilities.

Here is a slightly larger example for illustrative purposes (using scripted 
syntax so you can see where the nodes exist).
Note: each node is spun up with exactly the required binaries/capabilities, 
which is why you see stages with different node labels. That way each 
pipeline can move without being tied to a specific version of a tool on a 
specific agent.

node('java-1.7') {
    stage('Build') {
        checkout scm
        // do some build stuff
    }

    stage('Submit to Sonar') {
        // do some mvn stuff to submit to sonar
    }

    stage('Submit for OSS Scanning') {
        // do some work for uploading to OSS scanning system
    }
}

stage('Dev Deploy') {
    node('deploy-utility-1.0') {
        // do some deployment using our homegrown deployment binaries on this node
    }
}

stage('Functional Tests') {
    node('java-1.7 && apache-maven-3.2.5') {
        // run test against a selenium grid
    }

    node('legacy-test-system-1.0') {
        // run some other functional tests using older stuff
    }
}

stage('Performance Tests') {
    node('perf-test-1.0 && java-1.7 && apache-maven-3.2.5') {
        // run test using our internal perf test capabilities installed on this node
    }
}

stage('UAT Deploy') {
    node('deploy-utility-1.0') {
        // do some deployment using our homegrown deployment binaries on this node
    }
}

stage('Security Test') {
    node('java-1.8') {
        // do some security testing using something that requires java 1.8
    }
}

stage('PROD Deploy') {
    node('deploy-utility-1.0') {
        // do some deployment using our homegrown deployment binaries on this node
    }
}

-k

On Wednesday, April 12, 2017 at 4:25:51 PM UTC-4, Andrew Bayer wrote:
>
> Are you wanting to use the same node for most, if not all, of the 
> pipeline? If so, just use the top level agent directive.
>
> A.
>
> On Wed, Apr 12, 2017 at 1:18 PM Kenneth Brooks  > wrote:
>
>> As a user, I want to run multiple stages inside a single agent.
>>
>> Today we use the mesos plugin and spin up and spin down an agent on 
>> demand. I want to have 3 stages all run on that same agent before it spins 
>> down.
>> That way all stages can leverage that workspace before it is trashed.
>>
>> Current ability in script syntax:
>> node('java-1.8.0_45') {
>> stage ('Build') {
>> checkout scm
>> // do some build stuff
>> }
>>
>> stage ('Submit to Sonar') {
>> // do some mvn stuff to submit to sonar
>> }
>>
>> stage ('Submit for OSS Scanning'){
>> // do some work for uploading to OSS scanning system
>> }
>> }
>>
>> All of this happens on the node and I see it as 3 stages on my pipeline.
>>
>> With Declarative:
>> stages {
>> stage ('Build and Scan') {
>> agent { label "java-1.8.0_45" }
>> steps {
>> checkout scm
>> // do some build stuff
>>
>> // do some scan stuff
>>
>> // do some oss scan stuff
>> }
>> }
>>
>> I no longer can specify them to all happen in the same node unless I do 
>> this above, which then I can no longer define the stages.
>>
>>
>> Is there something I'm missing here?
>> Keep in mind the important part that the node is spun up and trashed on 
>> every invocation so we can't just get ahold of the previous/same agent 
>> again.
>>
>> Thanks,
>> Ken
>>



Re: Getting an exception when using Job import plugin

2017-04-12 Thread Christopher Orr
Sounds like https://issues.jenkins-ci.org/browse/JENKINS-40577


On Tue, 11 Apr 2017, at 00:21, Prasu S wrote:
> I'm trying to import jobs from Jenkins 1.658 to Jenkins 2.45. I receive 
> the error below. Can anyone please help?
> 
> java.lang.NoClassDefFoundError: 
> com/google/inject/internal/guava/base/$Preconditions
> at 
> org.jenkins.ci.plugins.jobimport.CredentialsUtils$NullSafeCredentials.(CredentialsUtils.java:50)
> at 
> org.jenkins.ci.plugins.jobimport.CredentialsUtils.getCredentials(CredentialsUtils.java:29)
> at 
> org.jenkins.ci.plugins.jobimport.JobImportAction.doQuery(JobImportAction.java:168)
> at sun.reflect.GeneratedMethodAccessor342.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at
> org.kohsuke.stapler.Function$InstanceFunction.invoke(Function.java:335)
> at org.kohsuke.stapler.Function.bindAndInvoke(Function.java:175)
> at 
> org.kohsuke.stapler.Function.bindAndInvokeAndServeResponse(Function.java:108)
> at org.kohsuke.stapler.MetaClass$1.doDispatch(MetaClass.java:124)
> at 
> org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:58)
> at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:746)
> Caused: javax.servlet.ServletException
> at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:796)
> at org.kohsuke.stapler.Stapler.invoke(Stapler.java:876)
> at org.kohsuke.stapler.MetaClass$10.dispatch(MetaClass.java:362)
> at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:746)
> at org.kohsuke.stapler.Stapler.invoke(Stapler.java:876)
> at org.kohsuke.stapler.Stapler.invoke(Stapler.java:649)
> at org.kohsuke.stapler.Stapler.service(Stapler.java:238)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
> at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
> at
> hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:135)
> at 
> com.smartcodeltd.jenkinsci.plugin.assetbundler.filters.LessCSS.doFilter(LessCSS.java:47)
> at
> hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:132)
> at jenkins.metrics.impl.MetricsFilter.doFilter(MetricsFilter.java:125)
> at
> hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:132)
> at javax.servlet.FilterChain$doFilter.call(Unknown Source)
> at 
> com.ceilfors.jenkins.plugins.jiratrigger.ExceptionLoggingFilter.doFilter(ExceptionLoggingFilter.groovy:29)
> at
> hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:132)
> at 
> hudson.plugins.audit_trail.AuditTrailFilter.doFilter(AuditTrailFilter.java:95)
> at
> hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:132)
> at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:126)
> at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
> at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:86)
> at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:84)
> at 
> hudson.security.UnwrapSecurityExceptionFilter.doFilter(UnwrapSecurityExceptionFilter.java:51)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> jenkins.security.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:117)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> org.acegisecurity.providers.anonymous.AnonymousProcessingFilter.doFilter(AnonymousProcessingFilter.java:125)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> org.acegisecurity.ui.rememberme.RememberMeProcessingFilter.doFilter(RememberMeProcessingFilter.java:142)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> org.acegisecurity.ui.AbstractProcessingFilter.doFilter(AbstractProcessingFilter.java:271)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> jenkins.security.BasicHeaderProcessor.doFilter(BasicHeaderProcessor.java:93)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(HttpSessionContextIntegrationFilter.java:249)
> at 
> hudson.security.HttpSessionContextIntegrationFilter2.doFilter(HttpSessionContextIntegrationFilter2.java:67)
> at 
> hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
> at 
> hudson.security.ChainedServletFilter.doFilter(ChainedServletFilter.java:76)
> at hudson.security.HudsonFilter.doFilter(HudsonFilter.java:171)
> at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
> at 
> 

Re: [Declarative Pipeline] Option to override --entry-point cat

2017-04-12 Thread Dan Tran
Please disregard the below incorrect statement :-)

On Wednesday, April 12, 2017 at 2:12:27 PM UTC-7, Dan Tran wrote:
>
> I meant to run /usr/bin/mongo, not mongod
>
>
>>



Re: [Declarative Pipeline] Option to override --entry-point cat

2017-04-12 Thread Dan Tran
I meant to run /usr/bin/mongo, not mongod.


>



Re: Declarative pipelines per branch and reusable stages (keeping it DRY)

2017-04-12 Thread Patrick Wolf
Feel free to open a JIRA ticket, but I'm not a huge fan of this because it 
is counter to the KISS principle we wanted with Declarative and breaks the 
Blue Ocean editor.  We have discussed having multiple "stages" blocks but 
rejected that because it quickly becomes needlessly complex without adding 
any use-case coverage. IMO, having multiple "stages" makes much more sense 
than having multiple "pipelines"; otherwise you will have to recreate all of 
the agent, environment, libraries, options, parameters, etc. sections for 
each pipeline, which leads to wanting those sections to be DRY as well, and 
Declarative pretty much falls apart completely.

BTW, it is already possible to have multiple 'pipeline' closures in a 
single Jenkinsfile, but they will be treated as parts of a whole Pipeline 
and this cannot be used in the editor.  Because the Jenkinsfile is treated 
as one continuous Pipeline, anything outside of the pipeline closures is 
interpreted as Scripted Pipeline. This means you can use 'if' blocks around 
the separate 'pipeline' blocks instead of using 'load' if you choose, but 
keeping them in separate files makes maintenance easier, I think.

if (BRANCH_NAME.startsWith("develop")) {
pipeline {  }
} 
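A slightly fuller sketch of that pattern, with two guarded 'pipeline' blocks in one Jenkinsfile; branch names, agents and steps are placeholders, and this assumes the plugin version in use accepts more than one such block as described above:

if (BRANCH_NAME.startsWith("develop")) {
    pipeline {
        agent any
        stages {
            stage('Dev Build') {
                steps { echo 'develop-only steps' }   // placeholder
            }
        }
    }
} else if (BRANCH_NAME == "master") {
    pipeline {
        agent any
        stages {
            stage('Release Build') {
                steps { echo 'master-only steps' }   // placeholder
            }
        }
    }
}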


Also, it's worth noting that 'readTrusted' probably works better than 
'load' because it takes the committer into account and doesn't require 
a workspace.

https://jenkins.io/doc/pipeline/steps/workflow-multibranch/#code-readtrusted-code-read-trusted-file-from-scm
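For instance, a rough sketch of pairing readTrusted with evaluate to reuse a trusted helper file without a workspace; the path and method name are hypothetical, and the helper file is assumed to end with 'return this':

// Fetched with the same trust checks as the Jenkinsfile itself; no node/workspace required.
def src = readTrusted('ci/helpers.groovy')   // hypothetical path in the same repository
def helpers = evaluate(src)                  // helpers.groovy must end with 'return this'
helpers.commonBuild()                        // call whatever methods the helper defines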

As for DRY stages there are several ways to accomplish this with Pipeline.

1. Shared Library and Resources - This is the preferred method of creating 
DRY routines

You create a global variable that has all of the steps you want (with 
appropriate variable replacement for environment variables). You could have 
a build.groovy global variable in the /vars directory that does all of your 
build steps. Then the steps in your stage can be a single line.
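A minimal sketch of such a global variable, assuming a shared library with the standard vars/ layout (the name buildApp is hypothetical, chosen to avoid clashing with the built-in 'build' step, and the Maven goals are placeholders):

// vars/buildApp.groovy in the shared library
def call(String mavenGoals = 'clean verify') {
    // Common build steps; environment variables such as BRANCH_NAME are
    // resolved at runtime on whatever agent the calling stage provides.
    sh "mvn -B ${mavenGoals}"
    junit 'target/surefire-reports/*.xml'
}

A stage in the Jenkinsfile can then shrink to something like steps { buildApp() }.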

Alternatively, you can store shell scripts in the /resources of your shared 
library and run those in your steps without having to duplicate anything:

https://gist.github.com/HRMPW/92231e7b2344f20d9cc9d5f2eb778a54
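A sketch of how a bundled script might be run, assuming the library ships resources/scripts/build.sh and exposes a hypothetical runBuildScript step:

// vars/runBuildScript.groovy in the shared library
def call() {
    // Load the script text bundled under resources/ and run it on the current agent.
    def script = libraryResource 'scripts/build.sh'
    writeFile file: 'build.sh', text: script
    sh 'bash build.sh'
}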

2. You can define your steps directly in the Jenkinsfile at the top level 
either as strings or methods and simply call that method from within each 
pipeline.
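A sketch of that second option, with a plain Groovy method defined above the pipeline block and invoked from a script step (method name and commands are placeholders):

def commonBuild() {
    checkout scm
    sh 'mvn -B clean verify'   // placeholder build command
}

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script { commonBuild() }
            }
        }
    }
}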

3. You can define your steps in a configuration file, as a properties or YAML 
file, and load those files using the Pipeline Utility Steps 
plugin. 
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Utility+Steps+Plugin
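A sketch of that third option using the readYaml step from that plugin (the file name and key are assumptions):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    // pipeline-config.yml might contain e.g.:  buildCommand: "mvn -B clean verify"
                    def cfg = readYaml file: 'pipeline-config.yml'
                    sh cfg.buildCommand
                }
            }
        }
    }
}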

To sum up, I think having different stages is worth discussing (it is not 
going to be implemented in the short term) but there are already many 
existing ways to make Pipelines DRY.

On Tuesday, April 11, 2017 at 8:43:49 AM UTC-7, Kenneth Brooks wrote:
>
> TL;DR up front:
> *As a user, I want to have a pipeline that performs specific pipeline 
> stages based on the branch. Recommendation: Put the when{} condition 
> outside the pipeline{} tag.*
> *As a user, I want to declare my stages but have the implementation be 
> separate so that I can reuse them in multiple pipelines*. 
>
> Currently the Declarative syntax has the ability to perform a stage 
> conditionally using 'when' but not a whole pipeline.
> This makes the pipeline fairly inflexible and much harder to read through.
>
> Take for example:
>
> pipeline {
>
>stages {
>  stage('Build') {
>when { branch "develop || master || feature"} // not the real syntax, I know
>steps { /* do some build stuff */ }
>  }
>
>  stage('Scan') {
>when { branch "master"}
>steps { /* run static code analysis or other code scanning */}
>  }
>
>  stage('Pull Request Build') {
>when { branch "PR-*"}
>steps { /* do a merge build stuff */ }
>  }
>
>  stage('Dev Deploy') {
>when { branch "develop || master"}
>steps { /* deploy to dev */ }
>  }
>
>  stage('Pull Request Deploy') {
>when { branch "PR-*"}
>steps { /* deploy to special PR sandbox */}
>  }
>   }
> }
>
>
> In this simple example, the following will happen, but it is extremely hard 
> to follow.
>
> Feature -> Build
> Master -> Build, Scan, Dev Deploy
> Develop -> Build, Dev Deploy
> Pull Request -> Pull Request Build, Pull Request Deploy
>
> I would suggest we allow the when to be placed at the pipeline level somehow.
>
> pipeline('master') { // Just for naming
>   when { branch "master" }
>   stages {
> stage('Build'){
>   steps { /* do some build stuff */ }
> }
> stage('Scan'){
>   steps { /* run static code analysis or other code scanning */}
> }
> stage('Dev Deploy'){
>   steps { /* deploy to dev */ }
> }
>   }
> }
>
> pipeline('develop') { // Just for naming
>   when { branch "develop" }
>   stages {
> stage('Build'){
>   steps { /* do some build stuff */ }
> }
> stage('Dev Deploy'){
>   steps { /* deploy to dev */ }
> }
>   }

Re: Declarative Syntax: ability to run multiple stages inside a single agent

2017-04-12 Thread Andrew Bayer
Are you wanting to use the same node for most, if not all, of the pipeline?
If so, just use the top level agent directive.
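For reference, a minimal sketch of the top-level agent approach, reusing the label and stage names from the earlier example (build commands are placeholders):

pipeline {
    agent { label 'java-1.8.0_45' }   // one node and workspace shared by all stages below
    stages {
        stage('Build') {
            steps {
                checkout scm
                sh 'mvn -B clean verify'   // placeholder
            }
        }
        stage('Submit to Sonar') {
            steps {
                sh 'mvn sonar:sonar'   // placeholder
            }
        }
        stage('Submit for OSS Scanning') {
            steps {
                echo 'upload to OSS scanning system'   // placeholder
            }
        }
    }
}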

A.

On Wed, Apr 12, 2017 at 1:18 PM Kenneth Brooks 
wrote:

> As a user, I want to run multiple stages inside a single agent.
>
> Today we use the mesos plugin and spin up and spin down an agent on
> demand. I want to have 3 stages all run on that same agent before it spins
> down.
> That way all stages can leverage that workspace before it is trashed.
>
> Current ability in script syntax:
> node('java-1.8.0_45') {
> stage ('Build') {
> checkout scm
> // do some build stuff
> }
>
> stage ('Submit to Sonar') {
> // do some mvn stuff to submit to sonar
> }
>
> stage ('Submit for OSS Scanning'){
> // do some work for uploading to OSS scanning system
> }
> }
>
> All of this happens on the node and I see it as 3 stages on my pipeline.
>
> With Declarative:
> stages {
> stage ('Build and Scan') {
> agent { label "java-1.8.0_45" }
> steps {
> checkout scm
> // do some build stuff
>
> // do some scan stuff
>
> // do some oss scan stuff
> }
> }
>
> I no longer can specify them to all happen in the same node unless I do
> this above, which then I can no longer define the stages.
>
>
> Is there something I'm missing here?
> Keep in mind the important part that the node is spun up and trashed on
> every invocation so we can't just get ahold of the previous/same agent
> again.
>
> Thanks,
> Ken
>



Declarative Syntax: ability to run multiple stages inside a single agent

2017-04-12 Thread Kenneth Brooks
As a user, I want to run multiple stages inside a single agent.

Today we use the mesos plugin and spin up and spin down an agent on demand. 
I want to have 3 stages all run on that same agent before it spins down.
That way all stages can leverage that workspace before it is trashed.

Current ability in script syntax:
node('java-1.8.0_45') {
    stage('Build') {
        checkout scm
        // do some build stuff
    }

    stage('Submit to Sonar') {
        // do some mvn stuff to submit to sonar
    }

    stage('Submit for OSS Scanning') {
        // do some work for uploading to OSS scanning system
    }
}

All of this happens on the node and I see it as 3 stages on my pipeline.

With Declarative:
stages {
    stage('Build and Scan') {
        agent { label "java-1.8.0_45" }
        steps {
            checkout scm
            // do some build stuff

            // do some scan stuff

            // do some oss scan stuff
        }
    }
}

I can no longer specify that they all happen on the same node unless I do it 
as above, but then I can no longer define the separate stages.


Is there something I'm missing here?
Keep in mind the important part: the node is spun up and trashed on 
every invocation, so we can't just get hold of the previous/same agent 
again.

Thanks,
Ken



Declarative Syntax: agent scope

2017-04-12 Thread Kenneth Brooks
When I specify an agent inside a stage, it then also applies to everything 
in the 'post' section.

I'm not sure that is always the behavior I want. 
Is there a way around this?

Specifically, we have the ability to take a checkpoint (similar to the 
enterprise edition checkpoint feature) but that should run outside the node.

Currently we can do this in script world:

node('java-1.8.0_45') {
  stage ('Master Build') {
checkout scm
sh 'do some building'
 }
}
captureSnapshot('BuildComplete')

node('java-1.8.0_45') {
  stage ('Dev Deploy') {
sh 'do some deploying'
 }
}
captureSnapshot('DeployComplete')

In Declarative-syntax land, the post actions end up inside the stage's agent:

pipeline {
    agent none
    stages {

        stage('Build') {
            agent { label "java-1.8.0_45" }
            steps {
                checkout scm
                sh 'do some building'
            }
            post {
                success {
                    captureSnapshot('BuildComplete') // this is still inside the node
                }
            }
        }

        stage('Dev Deploy') {
            agent { label "java-1.8.0_45" }
            steps {
                checkout scm
                sh 'do some deploying'
            }
            post {
                success {
                    captureSnapshot('DeployComplete') // this is still inside the node
                }
            }
        }
    }
}


Do I have to set up another stage specifically for the captureSnapshot? I 
really don't want to see that stage visually show up on my pipeline all 
over the place.



[Declarative Pipeline] Option to override --entry-point cat

2017-04-12 Thread Dan Tran

Hi,

This is my docker agent declaration

  agent {
      docker {
          image "xxx/yyy-build:1.0.0"
          args "--entrypoint /usr/bin/mongod"
      }
  }


However, the Docker Pipeline plugin ignores it. Here is the command from the log:


$ docker run -t -d -u 29307:1002 --entrypoint /usr/bin/mongod -i . 
--entrypoint cat xxx/yyy-build:1.0.0 


Is there a workaround for this?  If not, is it reasonable to file a feature 
request?


Thanks


-D



How to template HTML to create custom report?

2017-04-12 Thread Trever
I want to be able to have an HTML template of some sort that I can populate 
with values from another source (in my case, a YAML file).  Then I would 
use the HTML Publisher plugin to add the HTML to my job result.  I've googled 
around to see what kind of options I may have to do this, but nothing stands 
out.  

I'm looking for suggestions.
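One possible sketch in a pipeline context, assuming the Pipeline Utility Steps plugin (for readYaml) and the HTML Publisher plugin are installed; the file names, YAML keys, and the ${...} placeholder convention are all assumptions:

node {
    def data = readYaml file: 'report-data.yaml'        // e.g.  title: "Nightly run"  /  passed: 42
    def template = readFile 'report-template.html'      // HTML containing literal ${title} and ${passed} markers
    def html = template
            .replace('${title}', data.title.toString())
            .replace('${passed}', data.passed.toString())
    writeFile file: 'report/index.html', text: html
    publishHTML([allowMissing: false, alwaysLinkToLastBuild: true, keepAll: true,
                 reportDir: 'report', reportFiles: 'index.html', reportName: 'Custom Report'])
}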

Thanks



Re: How to start a job from another job in a declarative pipeline?

2017-04-12 Thread Idan Adar
Yep, I found "build" as well. I thought maybe it's done differently in 
declarative pipelines...

I ended up with the following; however, despite this, it did not start.

post {
    // Run end-to-end tests, if requested
    success {
        script {
            if (BRANCH_NAME == "develop") {
                result = sh(script: "git log -1 | grep '.*\\[e2e\\].*'", returnStatus: true)
                if (result == 0) {
                    build job: '', wait: false
                }
            }
        }
    }
}

The following was logged: 

+ git log -1
+ grep '.*\[e2e\].*'
Run tests [e2e][Pipeline] build (Building )Scheduling item: * 
***


But when visiting that job's Stage View, there is no build running.
Could this be because the end-to-end job is also scheduled to run at 
specific times? Its Jenkinsfile has the following specified: 

pipeline {
    triggers {
        cron('0 12,16,20 * * 1-4,7')
    }
    ...


(run every day except Friday and Saturday, at 12:00, 16:00 and 20:00).

On Wednesday, April 12, 2017 at 11:40:41 AM UTC+3, Danny Rehelis wrote:
>
> This is how I do this, should be easy enough -
>
> pipeline {
> stages {
> stage ('xxx') {
> steps {
> ...
> ...
> build job: '', parameters: [
> string(name: 'PARAM1', value: "xxx"),
> string(name: 'PARAM2', value: "yyy")
> ]
> }
>   }
> }
>
>
> On Wed, Apr 12, 2017 at 10:53 AM Idan Adar  
> wrote:
>
>> Hi,
>>
>> Let's assume there are two jobs:
>>
>> 1. a job for a micro-service repository
>> 2. a job for end-to-end tests
>>
>> I'd like, in specific cases, to start the end-to-end tests job from the 
>> micro-service job.
>> For example, after introducing a change that, even though it passed unit 
>> testing and integration testing, needs further testing that is available in 
>> the end-to-end tests job.
>>
>> I'd like for the developers to make a commit with a specific phrase, e.g. 
>> "[e2e]" and to use it as follows (just pseudo code for now).
>>
>> pipeline {
>> ...
>> ...
>> stages {
>> ...
>> }
>>
>>
>> post {
>> success {
>> if (BRANCH_NAME == "develop") {
>> result = sh (script: "git log -1 | grep '.*\\[e2e\\].*'", 
>> returnStatus: true) 
>> if (result == 0) {
>> // start the d2d job
>> }
>> }
>> }
>> ...
>> ...
>> }
>> }
>>
>>
>>



Re: How to start a job from another job in a declarative pipeline?

2017-04-12 Thread Danny Rehelis
This is how I do this, should be easy enough -

pipeline {
    stages {
        stage('xxx') {
            steps {
                ...
                ...
                build job: '', parameters: [
                    string(name: 'PARAM1', value: "xxx"),
                    string(name: 'PARAM2', value: "yyy")
                ]
            }
        }
    }
}


On Wed, Apr 12, 2017 at 10:53 AM Idan Adar  wrote:

> Hi,
>
> Let's assume there are two jobs:
>
> 1. a job for a micro-service repository
> 2. a job for end-to-end tests
>
> I'd like, in specific cases, to start the end-to-end tests job from the
> micro-service job.
> For example, after introducing a change that, even though it passed unit
> testing and integration testing, needs further testing that is available in
> the end-to-end tests job.
>
> I'd like for the developers to make a commit with a specific phrase, e.g.
> "[e2e]" and to use it as follows (just pseudo code for now).
>
> pipeline {
> ...
> ...
> stages {
> ...
> }
>
>
> post {
> success {
> if (BRANCH_NAME == "develop") {
> result = sh (script: "git log -1 | grep '.*\\[e2e\\].*'",
> returnStatus: true)
> if (result == 0) {
> // start the d2d job
> }
> }
> }
> ...
> ...
> }
> }
>
>
>



How to start a job from another job in a declarative pipeline?

2017-04-12 Thread Idan Adar
Hi,

Let's assume there are two jobs:

1. a job for a micro-service repository
2. a job for end-to-end tests

I'd like, in specific cases, to start the end-to-end tests job from the 
micro-service job.
For example, after introducing a change that, even though it passed unit 
testing and integration testing, needs further testing that is available in 
the end-to-end tests job.

I'd like for the developers to make a commit with a specific phrase, e.g. 
"[e2e]" and to use it as follows (just pseudo code for now).

pipeline {
...
...
stages {
...
}


post {
success {
if (BRANCH_NAME == "develop") {
result = sh (script: "git log -1 | grep '.*\\[e2e\\].*'", 
returnStatus: true) 
if (result == 0) {
// start the d2d job
}
}
}
...
...
}
}





trigger external pipeline from pipeline

2017-04-12 Thread Jörg Wille
I still could not find a solution, but I am thinking of a workaround:
is it possible to trigger a build of a different pipeline (from a 
different repository) from within a pipeline?

If that is possible, I could trigger the build of the root-parent pom to 
guarantee that every dependency is built.
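If both repositories are built on the same Jenkins (e.g. as multibranch jobs), the plain 'build' step may already cover that workaround — a hedged sketch, with a placeholder job path:

stage('Trigger parent build') {
    steps {
        // Full path to the other job, e.g. <folder>/<multibranch job>/<branch>
        build job: 'my-org/root-parent-pom/master', wait: false
    }
}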


Am Dienstag, 11. April 2017 18:06:49 UTC+2 schrieb Jörg Wille:
>
> I have a multi-module maven project split into different git repositories.
> I am using a Jenkinsfile with declarative syntax in each repo (in each 
> branch, in fact) to build each maven module. The build gets triggered from 
> commits via webhooks in GitLab.
> So far so good. Now when I commit changes to my project's API module, I 
> also want to trigger builds for all other modules that use the API as an 
> upstream dependency.
> So, the question is: (how) can I trigger downstream dependency builds 
> from other repositories via pipeline?
> Is there any maven project available that uses declarative pipelines, which 
> I could use as a reference?
>
