I gain some confidence that my DSL jobs are not catastrophic by building
jobs in test folders first. Of course, this does not help when the job
builds Java artifacts that try to overwrite something real in my
Artifactory. And it truly doesn't help when the test job overwrites the
LATEST tag of a Docker image.
It seems to me that the only reasonable way to "integration test" pipeline
changes before deployment is simply to deploy the changes to a job that is
not used for regular work, one that you can manually send events to, to
make sure it does what it needs to do. This is sort of like a "canary
deployment".
Hey, thanks for the suggestions! I know about the testing framework and
while I may wind up using it out of necessity I think dry-run is more
valuable than a unit test here. Let me ask you this: could I have a unit
test that I could run locally, but that Jenkins could also run first
before applying the changes?
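For what it's worth, something along these lines should be possible with the classes that ship in `job-dsl-core`. The sketch below (adapted from the pattern in the official job-dsl-gradle-example; the `jobs/` directory name is an assumption) runs every DSL script against an in-memory `JobManagement`, so it fails on compile errors and unknown DSL methods without touching a real Jenkins. The same Gradle `test` task can run locally and as an early stage on Jenkins:

```groovy
// Sketch only: assumes DSL scripts live under jobs/ and that the
// job-dsl-core and Spock dependencies are on the test classpath.
import javaposse.jobdsl.dsl.DslScriptLoader
import javaposse.jobdsl.dsl.MemoryJobManagement
import spock.lang.Specification

class JobScriptsSpec extends Specification {

    def 'all DSL scripts run without errors'() {
        given:
        // MemoryJobManagement records generated items instead of
        // creating anything on a live Jenkins instance.
        def jobManagement = new MemoryJobManagement()
        def loader = new DslScriptLoader(jobManagement)

        expect:
        new File('jobs').eachFileRecurse { file ->
            if (file.name.endsWith('.groovy')) {
                loader.runScript(file.text)
            }
        }
    }
}
```

This is not a dry run against the live instance, but it does catch a whole class of "seed job blew up" failures before anything is applied.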
I've just seen your comment in the open PR regarding
https://issues.jenkins-ci.org/browse/JENKINS-27182
> What I want is for my seed job to first output what it's going to change
and wait for user input, then if the user confirms make the changes
IIUC, you would like to mimic a kind of code review step for job changes.
I opened an SO post (that no one seems to be interested in)
https://stackoverflow.com/questions/59314501/jenkins-job-dsl-some-way-to-do-a-dry-run
I use the Jenkins Job DSL to create my multibranch pipeline jobs. It works
great, with one glaring issue: safely confirming changes _before_ applying
them.
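For context, the kind of seed script I mean is roughly this (a minimal sketch; the job name and repository URL are placeholders, not from my real setup):

```groovy
// Hypothetical seed script: creates one multibranch pipeline job.
// Every run of the seed job re-applies this definition, which is
// exactly why I want a dry run before it touches the real jobs.
multibranchPipelineJob('example-service') {
    branchSources {
        git {
            id('example-service') // unique id for this branch source
            remote('https://git.example.com/team/example-service.git')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(20) // prune jobs for deleted branches
        }
    }
}
```

Harmless enough on its own, but a typo in a shared helper can silently rewrite dozens of jobs like this in one seed run.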