Hi there,

I'm fairly new to Jenkins and CI; my company has dipped a very pragmatic 
toe into the CI water so far. We're looking at investing in updates to a 
very old and creaky build system: a mixture of C#, legacy C++, and other 
bits and pieces (some Ruby and Perl, etc.). I'm trying to plan out a new 
build structure that will allow our very large code base to be built much 
more incrementally. We already have Jenkins in place, but only for 
running a minimal build on commit to catch build failures early.

For a number of reasons, we'd like at least the top-level build scripts to 
be MSBuild-based, with a small number of modules each generating a NuGet 
package that can be referenced by their dependent modules. We'd like 
Jenkins to be responsible for publishing the latest NuGet package for each 
module as it gets built. The build scripts on each dev box can then grab 
the latest package off the network as required. So far so good.

I'm trying to work out how developing a change (on a dev's box, obviously) 
that spans several modules can work nicely in this environment. The issue 
is similar to what is discussed here:

https://groups.google.com/d/msg/jenkinsci-users/wElbo9SINLw/reFtjwwtuTIJ

Our current tree is in a single SVN repo, and we'd like to keep it that 
way. A local checkout could have anywhere between one and all of our top-
level modules. If you want to change one module, you check out just that 
one, but if you need to make a change across several, you'll want them 
checked out next to each other:

src/
    Module1
    Module2 (depends on 1)
    Module3 (depends on 1 and 2)

I can see it being tricky to deal with those NuGet packages on the network 
in this scenario. As soon as Module1 has a local change, its binaries need 
to have been built locally; but if you have no local changes and you 
update Module1 to HEAD, you have the option of pulling the package down 
instead, to avoid rebuilding it.

I guess my question is: is it even worth trying to get those optimisations? 
In my scenario, if you only check out Module3 locally, you obviously want 
its package from the network. But if you have pulled down any other 
modules alongside it, you *might* still want to prepopulate each module 
with the built binaries via its package. I could imagine a top-level 
script with a chunk of logic that checks whether a module has local 
changes and, based on that, either builds it locally or pulls the binaries.
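As a rough illustration (not a working build script), that per-module 
decision could look something like this in Python. The helper names are 
mine, and I'm assuming `svn status` output is a good-enough proxy for 
"has local changes":

```python
import subprocess

def has_local_changes(module_path):
    """True if `svn status` reports any changed items in the module's
    working copy (hypothetical helper; -q hides unversioned files)."""
    out = subprocess.run(
        ["svn", "status", "-q", module_path],
        capture_output=True, text=True, check=True,
    ).stdout
    return bool(out.strip())

def choose_action(dirty):
    """The core decision: build a dirty module from source, otherwise
    pull its prebuilt NuGet package off the network."""
    return "build" if dirty else "pull"
```

The top-level script would then loop over the checked-out modules in 
dependency order, calling `choose_action(has_local_changes(path))` for 
each one.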

Maybe a compromise is that when you check out a module initially, it pulls 
the binaries off the network, but a subsequent svn up in that module would 
only pull binaries if there are no local changes.
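That compromise amounts to a tiny decision table; a sketch (event names 
are my own invention) might be:

```python
def binaries_action(event, dirty):
    """Compromise rule: a fresh checkout always seeds the module with
    the prebuilt package; after an svn up, refresh the package only
    when the working copy is clean."""
    if event == "checkout":
        return "pull"                      # always seed on first checkout
    if event == "update":
        return "skip" if dirty else "pull" # don't clobber local builds
    raise ValueError(f"unknown event: {event}")
```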

I realise that this is only partially related to Jenkins, but this seems 
like the logical place to find people with experience dealing with these 
kinds of issues. I hope my explanation makes sense; it feels like it 
should be a common problem, but I haven't been able to track down any 
discussion of the pros, cons, and tools in this space.

Cheers,

Eddie

-- 
You received this message because you are subscribed to the Google Groups 
"Jenkins Users" group.
