build-environment-plugin does not appear to display on pipeline jobs

2016-08-30 Thread Wayne Warren
Does anyone know how to view the build environment for a pipeline job? Is 
this built into the pipeline job somehow, and therefore needs no support 
from build-environment-plugin? I'd be happy to submit a JIRA ticket for 
this and submit a patch if the fix isn't too time-consuming.

It's pretty annoying that all I want to do is look at which environment 
variables are available for a particular build of a pipeline job, and I 
can't see them.
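
(For what it's worth, a workaround is to dump the environment from inside 
the job itself; `sh` and `echo` are standard Pipeline steps, but the stage 
layout below is just a sketch of my own, not an official recipe:)

```groovy
// Minimal scripted-pipeline sketch: print the build environment,
// since there is no build-environment-plugin view for Pipeline jobs.
node {
    stage('show-env') {
        // On a Unix agent; use `bat 'set'` on Windows agents.
        sh 'printenv | sort'
        // Pipeline also exposes its managed environment via `env`:
        echo "Workspace: ${env.WORKSPACE}"
        echo "Build number: ${env.BUILD_NUMBER}"
    }
}
```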

-- 
You received this message because you are subscribed to the Google Groups 
"Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to jenkinsci-users+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/jenkinsci-users/2a1459ac-d5fb-4d89-9ac9-5d39f78392aa%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


Re: Multiple pipelines in Jenkinsfile

2016-07-22 Thread Wayne Warren


On Thursday, July 21, 2016 at 9:42:58 AM UTC-7, Alex Kessinger wrote:
>
> I've tried the seed job-dsl method previously. Some automation was better 
> than no automation, but I think the Jenkinsfile in repo is even better. If 
> I make a change to the Jenkinsfile, that change can be isolated in each 
> environment/branch until it has been promoted to next step. If I have one 
> master seed job it's harder for me to control the promotion between 
> environments.
>
> One other thing to note: Job DSL and Pipeline are both Groovy, but they 
> are not the same DSL. That may not matter in this case because you are 
> orchestrating pipelines with Job DSL.
>

Yeah, this is important to note, and just as important is that there's a 
reason they are different DSLs: they run in different contexts. One manages 
job lifecycles and gives you the ability to configure an entire job 
head-to-toe (as well as views on the Jenkins instance); the other defines 
the behavior of those jobs. Pipeline is fine when you want to sequence a 
particular project's test/build/deploy steps in a visually appealing way 
that is easy to configure as code, but what if you want to trigger more 
than one Pipeline in parallel based on a change in some common upstream 
dependency? This is where, as you say, the Pipelines have to be 
orchestrated with Job DSL.

It's somewhat difficult to imagine unifying those contexts so that you 
could, say, have one DSL that both manages job lifecycles and multi-job 
relations and defines job behavior. In my opinion, somewhat biased by my 
desire to structure the relationships between different project pipelines, 
it would be a huge win if it were possible; but as it stands, Job DSL 
really does seem to fill in at least some of the blanks left by Pipeline.
 

>
> On Thu, Jul 21, 2016 at 10:17 AM, Wayne Warren wrote:
>
>>
>> Sorry for chiming in late here, but I have recently been evaluating 
>> Jenkins Pipeline Plugin for use at my workplace and have considered the 
>> very problem you are describing in this thread--what if a given source repo 
>> has multiple Pipeline groovy scripts it wants to use for different purposes?
>>
>> The approach I have come up with actually looks very similar to what I 
>> found described in a relatively new series of blog posts:
>>
>>
>> http://marcesher.com/2016/06/21/jenkins-as-code-registering-jobs-for-automatic-seed-job-creation/
>>
>> This blog post describes an approach that leans heavily on the use of the 
>> Job DSL Plugin to create jobs by using a "mother seed" DSL job that will 
>> create new DSL seed jobs for each registered SCM repo (the blog author 
>> describes the use of git and I use git myself but it's easy for me to 
>> imagine an approach using some alternate SCM).
>>
>> This approach allows individual SCM repositories to contain a Job DSL 
>> script that in turn defines additional jobs specific to that repository. 
>> The jobs defined in this way could be Pipeline jobs. When defining these 
>> Pipeline jobs in the Job DSL script you can specify the relative path in 
>> the repository that contains the Pipeline script you want to use.
>>
>> As far as I can tell from the description of your situation this should 
>> fit your needs perfectly. I recommend starting at the earliest blog post in 
>> that series for full context on the approach the author is describing: 
>> http://marcesher.com/2016/06/08/jenkins-as-code-creating-jenkins-jobs-with-text-not-clicks/
>>
>> The blog series appears to still be a work in progress as the author has 
>> not yet reached the point where he describes the interaction between Job 
>> DSL and Pipeline but it seems to me like this should be obvious. Job DSL 
>> defines the Pipeline jobs. Pipeline defines the behavior of those jobs.
>>
>> Good luck!
>>
>> On Tuesday, July 12, 2016 at 11:33:15 AM UTC-7, Bartłomiej Sacharski 
>> wrote:
>>>
>>>
>>>
>>> On Tuesday, July 12, 2016 at 8:16:27 PM UTC+2, Mike Rooney wrote:
>>>>
>>>> This need makes a lot of sense to us, where we have a couple related 
>>>> sub-projects (as sub directories) in a single repository. It makes sense 
>>>> that they each have their own pipeline jobs and can run on different 
>>>> schedules. I've also seen cases similar to Alex's (hi Alex!) where there 
>>>> are different tasks you want to do with a single repo that don't make 
>>>> sense 
>>>> as one pipeline job that runs together (building/testing versus a nightly 
>>>> cron-type task that runs in the

Re: Multiple pipelines in Jenkinsfile

2016-07-21 Thread Wayne Warren

Sorry for chiming in late here, but I have recently been evaluating Jenkins 
Pipeline Plugin for use at my workplace and have considered the very 
problem you are describing in this thread--what if a given source repo has 
multiple Pipeline groovy scripts it wants to use for different purposes?

The approach I have come up with actually looks very similar to what I 
found described in a relatively new series of blog posts:

http://marcesher.com/2016/06/21/jenkins-as-code-registering-jobs-for-automatic-seed-job-creation/

This blog post describes an approach that leans heavily on the use of the 
Job DSL Plugin to create jobs by using a "mother seed" DSL job that will 
create new DSL seed jobs for each registered SCM repo (the blog author 
describes the use of git and I use git myself but it's easy for me to 
imagine an approach using some alternate SCM).

This approach allows individual SCM repositories to contain a Job DSL 
script that in turn defines additional jobs specific to that repository. 
The jobs defined in this way could be Pipeline jobs. When defining these 
Pipeline jobs in the Job DSL script you can specify the relative path in 
the repository that contains the Pipeline script you want to use.
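
A rough sketch of what such a Job DSL script might look like (the job 
names, repo URL, and script paths below are made-up placeholders, not 
anything from the blog posts):

```groovy
// Hypothetical per-repo seed script: define one Pipeline job per
// purpose, each pointing at a different Pipeline script in the repo.
['build'  : 'pipelines/build.groovy',
 'nightly': 'pipelines/nightly.groovy'].each { name, path ->
    pipelineJob("myproject-${name}") {
        definition {
            cpsScm {
                scm {
                    git {
                        remote { url('git@example.com:myorg/myproject.git') }
                        branch('master')
                    }
                }
                // Relative path to the Pipeline script in the repo.
                scriptPath(path)
            }
        }
    }
}
```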

As far as I can tell from the description of your situation this should fit 
your needs perfectly. I recommend starting at the earliest blog post in 
that series for full context on the approach the author is describing: 
http://marcesher.com/2016/06/08/jenkins-as-code-creating-jenkins-jobs-with-text-not-clicks/

The blog series appears to still be a work in progress as the author has 
not yet reached the point where he describes the interaction between Job 
DSL and Pipeline but it seems to me like this should be obvious. Job DSL 
defines the Pipeline jobs. Pipeline defines the behavior of those jobs.

Good luck!

On Tuesday, July 12, 2016 at 11:33:15 AM UTC-7, Bartłomiej Sacharski wrote:
>
>
>
> On Tuesday, July 12, 2016 at 8:16:27 PM UTC+2, Mike Rooney wrote:
>>
>> This need makes a lot of sense to us, where we have a couple related 
>> sub-projects (as sub directories) in a single repository. It makes sense 
>> that they each have their own pipeline jobs and can run on different 
>> schedules. I've also seen cases similar to Alex's (hi Alex!) where there 
>> are different tasks you want to do with a single repo that don't make sense 
>> as one pipeline job that runs together (building/testing versus a nightly 
>> cron-type task that runs in the repo).
>>
>> It is reasonable that a Jenkinsfile corresponds to a single Pipeline job, 
>> because these are often associated with and run via a Pipeline job which 
>> isn't a logical "parent" of these seed jobs. However, a great place for 
>> this enhancement would be the Github Org / Bitbucket plugins that scan 
>> repositories for Jenkinsfiles and are already in the place of creating 
>> multiple Pipeline jobs. 
>>
>
>> My proposal would be: add a configuration option for the Github and 
>> Bitbucket plugins which scan organizations for Jenkinsfiles. So, "Project 
>> Recognizers -> Pipeline Jenkinsfile" would get a box for this which 
>> defaults to "Jenkinsfile". Some logical configuration examples might be, 
>> "Jenkinsfiles/*", "**/Jenkinsfile", "Jenkinsfile-*". Then the 
>> Github/Bitbucket plugins can be pointed at an org, or just one repository, 
>> and multiple Jenkinsfiles can exist which define different Pipeline jobs.
>>
>> Bartłomiej and Alex, would something like this satisfy your use cases as 
>> well?
>>
>  
>
>>
>> - Michael
>>
>> On Sunday, May 29, 2016 at 12:47:40 PM UTC-5, Bartłomiej Sacharski wrote:
>>>
>>> I'm really hyped about the Jenkinsfiles - they make it much much easier 
>>> to document and preserve project configuration.
>>> However, all the examples that I've seen seem to use single pipeline.
>>> I've tried to define different stages in separate node blocks, however 
>>> they still were seen as a single pipeline.
>>>
>>> Is it possible to define multiple pipelines in a single Jenkinsfile? Or 
>>> maybe there's undocumented functionality for .jenkinsfile extension to 
>>> handle such cases?
>>>
>>
> IMO just having an option to specify the name of Jenkinsfile would be 
> enough - and I would rather try to implement this as a standalone thing, 
> not connected to bitbucket/github plugins (we're using Jenkins with 
> standalone repository, so source-agnostic solution would be best methinks). 
> Of course that should be available in both single-branch and multi-branch 
> variants of pipeline plugins
>


Re: start jenkins sub jobs parallelly

2016-07-18 Thread Wayne Warren
There are a couple different ways I would do this but the simplest would be 
to have a single job that triggers automatically whose primary function is 
to trigger all the jobs you want running in parallel as downstream of 
itself.

Another way would be to have this "trigger" job make use of the MultiJob 
Plugin, in which case you can pretty easily stage the execution of multiple 
sets of parallel jobs. The advantage of this plugin, even if you don't want 
to separate your parallel job runs into multiple stages, is that the 
success of the triggering job can depend on the success of the triggered 
jobs, so you have a single job status to check for overall success/failure.

However, I am still learning about the Pipeline (aka Workflow) Plugin and 
there may very well be a better way to express the parallel behavior you 
are looking for using it. The misleadingly-named "parallel" keyword is 
probably not it though... Here's an example of how it works: 
https://github.com/jenkinsci/pipeline-examples/blob/master/pipeline-examples/parallel-from-list/parallelFromList.groovy#L27
 
You might be able to use this in combination with the pipeline-build-step 
step: 
https://jenkins.io/doc/pipeline/steps/pipeline-build-step/#build-build-a-job
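
A rough sketch of combining the two (the job names here are invented for 
illustration; they would be existing jobs on your Jenkins instance):

```groovy
// Sketch: a "trigger" Pipeline that launches several existing jobs
// in parallel and fails if any downstream job fails.
def jobNames = ['job-a', 'job-b', 'job-c']  // hypothetical job names

def branches = [:]
jobNames.each { name ->
    branches[name] = {
        // The `build` step waits for the downstream job to finish and
        // propagates its result by default.
        build job: name
    }
}
// Run all branches concurrently; this is the Pipeline `parallel` step,
// not a loop, so the jobs really do start together.
parallel branches
```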

On Friday, July 15, 2016 at 9:19:50 AM UTC-7, Pedda Reddy wrote:
>
> I have 'n' number of jobs which I want to start simultaneously. Is it 
> feasible in Jenkins? I tried using the DSL plugin and the workflow 
> plugin. I have used the 'parallel' method. I have my list of job names in 
> an array/list and want to run them in parallel. Please help. 
>
> Currently I'm iterating over the job names in a for loop; they start one 
> by one, but instead I want them to start in parallel. How can this be 
> achieved? 
>
>
>
>
> -- 
> View this message in context: 
> http://jenkins-ci.361315.n4.nabble.com/start-jenkins-sub-jobs-parallelly-tp4827908.html
>  
> Sent from the Jenkins users mailing list archive at Nabble.com. 
>



[pipeline-plugin] [workflow-plugin] Parallelizing Jenkins Pipelines

2016-07-01 Thread Wayne Warren


So I have been experimenting with Jenkins Pipeline Plugin recently. One of 
the requirements I am evaluating it against is the ability to express 
complex relationships between the build and test steps of different 
projects and to parallelize the execution of those steps where possible.

To visualize the kind of relationship I have in mind, and maybe make it 
more concrete for folks reading this thread, I have attached a simple 
diagram illustrating the heterogeneous nature of the pipelines we deal 
with within our engineering organization. To understand this diagram, 
think of the colored blocks for "Project 1", "Project 2", "Project 3", and 
"Uber Project" as consisting of Pipeline stages defined in Jenkinsfiles 
found in different git repos. "Project 1" can be thought of as a library 
that does not have any upstream dependencies on any other projects in our 
organization, so it does not have an "integration" test stage. "Project 2" 
and "Project 3" both depend on a library artifact produced from "Project 
1". "Project 2" and "Project 3", while not necessarily libraries, are 
components of the "Uber Project", which itself is intended to be built and 
shipped as a higher-level package, usually at the operating system level 
(rpm, deb, msi, etc). Having built that package, there are then 
"acceptance" tests that each project may define to be run against that new 
artifact. Once all these tests pass, it is ready to be "deployed", which 
you can think of as being staged either for manual validation (in cases 
where our QA team has not had a chance to automate tests) or for early 
access for customers with specific support contracts.

In the ideal world of our CI pipelining technology requirements, there are 
several triggering scenarios we want to consider:

   1. SCM Event on Project 1 repo
      1. This should trigger the Pipeline for Project 1.
      2. On successful completion, the Pipelines for both Project 2 and 
         Project 3 should be triggered.
      3. Once both the Project 2 Pipeline and the Project 3 Pipeline have 
         finished, the Uber Project Pipeline should trigger; to be clear, 
         the Uber Project Pipeline should only trigger if both the 
         Project 2 and Project 3 Pipelines finish successfully.
   2. SCM Event on Project 2 repo
      1. This should trigger the Pipeline for Project 2. The Project 1 and 
         Project 3 pipelines should not be triggered.
      2. On successful completion, the Uber Project Pipeline should 
         trigger; there is no requirement this time that the Project 3 
         pipeline also run successfully (the Uber Project Pipeline should 
         use the last successful artifact from Project 3).
   3. SCM Event on Project 3 repo
      1. This should trigger the Pipeline for Project 3. The Project 1 and 
         Project 2 pipelines should not be triggered.
      2. On successful completion, the Uber Project Pipeline should 
         trigger; there is no requirement this time that the Project 2 
         pipeline also run successfully (the Uber Project Pipeline should 
         use the last successful artifact from Project 2).
   4. SCM Event on Uber Project repo
      1. This should trigger the Pipeline for Uber Project. The Project 1, 
         Project 2, and Project 3 pipelines should not be triggered.


There are a couple of problems I am currently seeing in defining a 
pipeline that enables these scenarios:

   1. It does not seem possible to insert Pipeline stages in branches of 
      parallel blocks; parallel blocks instead seem to provide behavior 
      similar to what Matrix Project jobs provided in Jenkins 1.x. In 
      other words, there seems to be a problem of "parallel composability" 
      of Pipeline Plugin pipelines.
   2. It's not clear that any one Pipeline job can trigger off of SCM 
      events coming from multiple repos, so in order to build a pipeline 
      like this it seems like each project will need to, within its own 
      Jenkinsfile, define a distinct Pipeline containing ephemeral 
      "copies" of each set of project-specific stages shown above. This 
      assumes, by the way, that it is practical to store base pipeline 
      components/stages as groovy methods in a global workflow lib and 
      re-use them in different project contexts. (It's also possible that 
      the best way to represent this kind of relationship between 
      Pipelines is as downstream/upstream triggers of one another.)
   
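To make that re-use idea concrete, here is a rough sketch; the method 
name, the make targets, and the project layout are all hypothetical, not 
an existing API:

```groovy
// Hypothetical shared method stored in a global workflow library,
// re-used by each project's Jenkinsfile.
def buildAndTest(String project) {
    stage("${project}-build") {
        sh "make -C ${project} build"   // placeholder build command
    }
    stage("${project}-unit-test") {
        sh "make -C ${project} test"    // placeholder test command
    }
}

// In the Uber Project's Jenkinsfile, replay the upstream projects'
// stages as ephemeral "copies" ahead of its own stages:
node {
    checkout scm
    buildAndTest('project2')
    buildAndTest('project3')
    stage('uber-package') {
        sh 'make package'               // placeholder packaging step
    }
}
```
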
I am curious if anyone else has come across these problems when attempting 
to organize their Pipeline code. Is this kind of pipeline composability a 
feature that the Jenkins community is interested in working on as a goal 
for Pipeline Plugin? 

For what it's worth, I have considered this alternate approach to composing 
different projects together using Pipeline:




Re: Jenkins Job Builder vs. "Jenkins 2.0 Pipeline as Code"

2016-05-09 Thread Wayne Warren


On Friday, May 6, 2016 at 11:07:44 AM UTC-7, David Karr wrote:
>
> I work in an OpenStack-related community that uses "Jenkins Job Builder" 
> to build Jenkins jobs.  I don't know a lot about it, but I did manage, 
> with a lot of help from our infra team, to make the changes required to 
> build the new project I was adding. 
>
> I'm also pretty familiar with Groovy, although I haven't done much with 
> Groovy in Jenkins besides viewing several presentations on how it works. 
>
> I'm really interested in any factual comparisons of JJB with this 
> "Pipeline as Code" concept in Jenkins 2.0.  Seeing some of the other 
> schisms between the communities I work in, I imagine there aren't too 
> many people who intimately understand both sides of this, but I'd 
> appreciate if anyone has any detailed examination of this. 
>
> One particular thing that I wonder about, how "testable" or "verifiable" 
> are scripts using this "new" paradigm, compared to what a production JJB 
> script would look like?  I sometimes hear about JJB scripts that fail at 
> runtime because there were details that couldn't be verified on the 
> desktop.  Does "Pipeline as Code" deal with this at all?
>

I can't speak about Pipeline at all since I have not used it.

As for JJB, I can speak to that, because I both use it and am one of its 
core developers. Regarding "JJB scripts that fail at runtime", I might be 
misunderstanding, but I think you are referring to the possibility that 
running JJB in "test" mode (dumping generated job configuration XML to 
STDOUT in your local terminal) can produce different XML than what is 
produced when running in "update" mode (updating actual job configuration 
on a running Jenkins instance by POSTing to the job's HTTP API endpoint).

This can happen, for example, when you run JJB in test mode without 
telling it which Jenkins instance it would be sending its generated XML 
to. The reason is that Jenkins Job Builder has built-in support for 
checking which plugins, and which versions of those plugins, are installed 
on a particular Jenkins instance (specifically, the one specified in the 
jenkins section of your JJB configuration file).
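
For reference, the two modes look roughly like this (the config file name 
and directory paths are placeholders for whatever your project uses):

```shell
# "test" mode: render job XML locally without touching Jenkins.
# Without a config pointing at a real instance, JJB cannot query the
# installed plugin versions, so output may differ from "update" mode.
jenkins-jobs --conf jenkins_jobs.ini test jobs/ -o build/xml/

# "update" mode: POST the generated XML to the configured Jenkins
# instance's job configuration endpoints.
jenkins-jobs --conf jenkins_jobs.ini update jobs/
```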
