I researched this years ago and couldn't find a viable solution. Rather than pulling artifacts from the downstream jobs, I opted to push them by passing a parameter containing an output directory path. In my case, the jobs all build with Ant. The parent job would build, then trigger the other jobs with the parameter OUTPUT_DIR=$WORKSPACE/some_directory. The child jobs would run, passing the parameter through to Ant, and in turn would save their artifacts into the parent job's workspace. Once all the child jobs were complete, the parent would finish its processing and then archive everything under $WORKSPACE/some_directory.
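
On the Ant side, the child jobs only need a publish step that copies whatever they built into the directory they were handed. Here is a minimal sketch of such a build.xml fragment; the property name output.dir, the target names, and the build/dist source directory are placeholders for illustration, and the child job's Ant build step would pass the parameter through with something like -Doutput.dir=$OUTPUT_DIR.

    <project name="child-build" default="publish">
      <!-- Fallback for standalone runs; a value passed on the command line
           (-Doutput.dir=...) always wins, since Ant properties are immutable. -->
      <property name="output.dir" location="build/out"/>

      <target name="build">
        <!-- normal compile/package steps go here -->
      </target>

      <!-- Copy this job's output into the directory supplied by the parent
           job (inside the parent's workspace) instead of archiving it here. -->
      <target name="publish" depends="build">
        <mkdir dir="${output.dir}"/>
        <copy todir="${output.dir}">
          <fileset dir="build/dist" includes="**/*"/>
        </copy>
      </target>
    </project>

Assuming the trigger is the Parameterized Trigger plugin, its "Predefined parameters" box is one place to set OUTPUT_DIR=$WORKSPACE/some_directory, and the parent then archives $WORKSPACE/some_directory/** once the children have finished.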

The downside to this is that all child jobs must run on the same node as the parent. Hope this helps!

On 2014-09-09 04:23, Omer Weissman wrote:
Hello,

We are trying to copy files from multiple downstream jobs back to the
main job.
The main job triggers the builds using a parameterized trigger ("invoke
i=0...N builds").
We are doing this in order to parallelize a long task.

I could not find a way to copy the files back, as the Copy Artifact
plugin does not recognize all of the jobs, only the last one.
The copy behavior I am trying to achieve is similar to the way the
Parallel Test Executor plugin copies the specified files back to the
main job.

Is there a way to do this? Did I miss something in the Copy Artifact
plugin?

Thanks,
Omer
