We've used GoCD for many years and it's a great product.
We've been having an occasional issue that's becoming more frequent as we
increase deployment frequency.
Sometimes pipelines briefly get "stuck" on old DLLs, meaning a downstream
pipeline will sometimes fail to run because it was packaged with older
DLLs. It only happens between upstream and downstream pipelines, never
within the same pipeline. It occurs infrequently, but perhaps as often as
1 out of every 5 builds.
The workaround is to run a script on all the agents that periodically
refreshes and rebuilds all the pipelines manually. I'm not sure why this
works, but it always does.
I haven't been able to figure out the cause. I'm wondering if it's a
misunderstanding about artifacts, or otherwise misconfigured artifacts?
Here's the situation:
- We have a pipeline template that backs 8 or 9 pipelines.
- The template (and thus every pipeline) has 4 stages: prep, build, test,
and package.
- Prep stage: doesn't do much, mostly just analysis.
- Build stage: pulls code from the repo and builds it, then uploads all
the built binaries as artifacts to GoCD at #{project-name}/build.
- Test stage: fetches the artifacts stored at #{project-name}/build into
the local build directory, runs the tests, and saves a test artifact (not
used in build/packaging).
- Package stage: fetches the artifacts stored at #{project-name}/build
into the local build directory and packages them up.
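For context, the intra-pipeline fetching looks roughly like this (a sketch in cruise-config.xml syntax, not our actual config; the stage/job names and paths are placeholders):

```xml
<!-- Sketch only: "package-job" and "build-job" are placeholder names.
     With no pipeline= attribute, fetchartifact pulls from a stage of
     the CURRENT pipeline, which is what our template does today. -->
<stage name="package">
  <jobs>
    <job name="package-job">
      <tasks>
        <fetchartifact stage="build" job="build-job"
                       srcdir="#{project-name}/build" dest="build">
          <runif status="passed" />
        </fetchartifact>
      </tasks>
    </job>
  </jobs>
</stage>
```

The test stage does the same fetch before running its tests.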
As I say, most of the time it works great, but occasionally binaries from
a previously built upstream pipeline (an older version) get mixed into a
newer pipeline's build. The result still compiles, but running something
with the mismatched versions throws a runtime exception.
As I'm describing this, I suspect the cause is that since we have multiple
agents, a given agent isn't always scheduled to build every pipeline
stage.
So, e.g., if project2 is downstream from project1:

agentA builds project1.verX
agentB builds project2.verX
[project2 changes]
agentA builds project2.verY

agentA still has project1.verX binaries locally, so project2.verY gets
built against those. Then when the binaries get packaged up, you get the
version mismatch.
It seems like what should maybe happen is that pipelines also fetch
artifacts from all their upstream dependency pipelines (vs. just fetching
from their own upstream stages, as described above).
However, I'm not certain how to do this with pipeline templates, since a
pipeline could have multiple upstream pipelines to fetch from.
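Concretely, I think that would mean each pipeline declaring its upstreams as dependency materials and the template fetching from them explicitly, something like the sketch below (made-up names throughout; I believe fetchartifact accepts a pipeline attribute for cross-pipeline fetches, and that a parameter like a hypothetical #{upstream-pipeline} could fill it for exactly one upstream):

```xml
<!-- Sketch: project2 declares project1 as a dependency material,
     so project1's artifacts become fetchable downstream.
     (Git material and other details omitted.) -->
<pipeline name="project2" template="our-template">
  <materials>
    <pipeline pipelineName="project1" stageName="package" />
  </materials>
  <params>
    <param name="upstream-pipeline">project1</param>
  </params>
</pipeline>

<!-- ...and in the template's build stage, an explicit cross-pipeline
     fetch instead of trusting whatever is on the agent's disk: -->
<fetchartifact pipeline="#{upstream-pipeline}" stage="build"
               job="build-job" srcdir="#{upstream-pipeline}/build"
               dest="deps">
  <runif status="passed" />
</fetchartifact>
```

But that only handles exactly one upstream per pipeline; the zero/many cases are the part I can't see how to express in a template.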
So I wanted to add an arbitrary number of 'fetch artifact' tasks to a
pipeline's build stage, and pass in all of its upstream pipelines as
parameters... How can I make the pipeline properly fetch all of:
- zero upstream pipelines
- one upstream pipeline
- multiple upstream pipelines
?
Hopefully this makes sense.
My idea:
- Is there a way I can somehow create an 'upstream-pipeline-list'
parameter, have each pipeline list its upstreams in CSV fashion, and then
have GoCD fetch EACH of those upstream pipeline builds before actually
building the stage?
To me, putting #{upstream-pipeline-list} in a single 'fetch artifact' task
doesn't seem right, since the task appears to take only one source
location, not multiple.
But I misunderstood this before regarding resources, so I figured it was
worth asking.
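If a single fetch task really can't take a list, the only alternative I can picture is one script task that expands the CSV itself. A pure sketch (GoCD would substitute #{upstream-pipeline-list} before the script runs; the echo is a placeholder for whatever the real per-upstream fetch would be, e.g. a call to the artifacts API):

```shell
#!/bin/sh
# UPSTREAMS stands in for the substituted #{upstream-pipeline-list} value,
# e.g. "" (no upstreams), "project1", or "project1,project3".
UPSTREAMS="project1,project3"

# Splitting on commas gives zero loop iterations for an empty list,
# so zero/one/many upstreams are all handled by the same single task.
IFS=','
for p in $UPSTREAMS; do
  # placeholder: the real task would fetch $p's build artifacts here
  echo "would fetch artifacts from upstream pipeline: $p"
done
unset IFS
```

Is something like that considered reasonable, or does it fight against how GoCD wants artifacts to flow?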
Or maybe there's some other, even more obvious thing I'm missing (outside
of a monorepo; we can't use a monorepo here, at least not presently). What
is the 'GoCD way' to handle this properly?
Appreciate any assistance.
-j