Thanks so much for the thoughts, Chris.

My first stab at Appveyor is here: https://github.com/OpenImageIO/oiio/pull/1399
It only partially works: it installs a bunch of dependencies and compiles most (all?) 
of the .cpp files, but fails at the link step. Still, it's probably 90% of the way to 
being functional; hopefully somebody else can look at it and suggest fixes.


> On Apr 8, 2016, at 8:33 PM, Chris Foster <[email protected]> wrote:
> 
> Hi Larry,
> 
> On Thu, Apr 7, 2016 at 3:43 AM, Larry Gritz <[email protected]> wrote:
>> Thanks, Chris!
>> 
>> As kind of a tinkering background project, I have been trying to cobble 
>> together an appveyor.yml for OIIO, taking small steps when I get the chance. 
>> I'm really inexperienced with Windows dev, so I've done it by trial and 
>> error and a lot of google searching for other projects' appveyor files. So I 
>> appreciate one more clean example!
> 
> This is more or less the way I put mine together.  I've done a little
> windows dev, but by no means have a lot of experience.  I've found it
> surprisingly difficult to configure Travis and Appveyor, given what
> initially looks like extensive docs!  Personally I'd be happy if they
> had definitive example configurations showing all the yml variables in a
> prototypical way, and skipped the rest of the docs entirely.
> 
>> The end goal, of course, is to add Windows to our CI regime so that we never 
>> again check in a change that breaks the build on any of the platforms.
> 
> This is my goal too, with a side goal of providing installers for
> anyone who wants to track the bleeding edge on windows but doesn't
> want to build things.
> 
>> While I have you... I'm curious on your (and others') opinions on another 
>> related matter. I want to make it super easy for people to build the 
>> software and not have the dependencies be a hassle. Setting up the TravisCI 
>> builds (which need to download and build most of the dependencies on the 
>> fly) takes a couple big steps in the direction of a "make dependencies" 
>> one-step process. But I'm not sure how the final form of it should be 
>> organized.
> 
> Good question.  I definitely think there's value in a separate "meta
> build system" to build dependencies.  Especially for projects which
> want to support windows, and for projects with unusual or bleeding
> edge dependencies.
> 
>> Should we have separate options for "build nothing, just use what's on the 
>> system", and "see what's on the system, use what's available, download and 
>> build the rest"? Should the latter be further bifurcated into "download and 
>> build missing essentials" versus "download and build everything missing?" 
>> Should there be an option to download and build dependencies that *are* on 
>> the system (maybe you want to force a build of a more modern version, or 
>> experiment with a different set of compiler flags?), and if so, should you 
>> be able to decide that on a package-by-package basis? Should it (for all, or 
>> for packages individually) offer a choice of building from source or 
>> installing a compiled package (e.g., homebrew or apt-get or whatever)?
> 
> IMHO there should be a meta build system which knows how to build all
> of the dependencies, except perhaps for ones which are very large and
> have standardized installers on all platforms (qt, boost?).  I've had
> fairly good success using cmake for this via ExternalProject.cmake.
> The only open example I've got of this is again in displaz, but for
> some closed-source software I've used it on systems with thirty or so
> dependencies.  Here's the displaz example:
> 
> https://github.com/c42f/displaz/blob/master/thirdparty/external/CMakeLists.txt
> 
> With ExternalProject you should be able to have per-dependency flags
> to turn each dependency on and off as desired.
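
If I'm reading the displaz file right, the basic pattern looks roughly
like this -- just a minimal sketch with a placeholder URL/version, not
OIIO's actual dependency list:

    # Superbuild-style CMakeLists.txt that only drives ExternalProject.
    # The URL/version below are placeholders.
    cmake_minimum_required (VERSION 2.8.12)
    project (dependencies NONE)
    include (ExternalProject)

    set (INSTALL_PREFIX "${CMAKE_BINARY_DIR}/dist" CACHE PATH
         "Where the third-party libs get installed")

    # One option per dependency, so each can be switched on or off.
    option (BUILD_ZLIB "Download and build zlib" ON)
    if (BUILD_ZLIB)
        ExternalProject_Add (zlib
            URL http://zlib.net/zlib-1.2.8.tar.gz
            CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${INSTALL_PREFIX}
        )
    endif ()
    # ...repeat the option + ExternalProject_Add pair for each dependency.
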
> 
>> Should the dependency setup be invoked as a separate script (bash or 
>> whatever is appropriate for each major system)? Or should it happen as part 
>> of the CMake build or as part of the make wrapper that some of us use?
> 
> I've tried making it part of the main CMake.  Unfortunately it doesn't
> work well - at least with ExternalProject_add() - because the meta
> build system has no knowledge of the targets in the third party build
> systems.  To make matters worse, they're not built or installed at
> cmake time, so cmake can't use the usual find_package() infrastructure
> to figure out where the libraries and headers will end up.
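
The usual way around that, as far as I can tell, is the "superbuild"
pattern: make the main project itself just another ExternalProject that
depends on the third-party ones, so its own cmake run (and its
find_package() calls) happens only after everything else is installed.
A sketch, continuing the hypothetical file above:

    # The real project is configured and built last, pointed at the
    # freshly installed dependencies.  SOURCE_DIR and the DEPENDS list
    # here are placeholders.
    ExternalProject_Add (oiio
        SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/oiio
        CMAKE_ARGS -DCMAKE_PREFIX_PATH:PATH=${INSTALL_PREFIX}
                   -DCMAKE_INSTALL_PREFIX:PATH=${INSTALL_PREFIX}
        DEPENDS zlib
    )
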
> 
>> Is there any advantage to having the dependent projects be set up as git 
>> submodules, or is that pointless and they should just be 'svn co' or 'git 
>> clone' on demand into a temp directory, if they are needed?
> 
> Not sure about the advantages of this; I doubt it helps much unless
> you also intend to hack on the third-party libs.
> ExternalProject.cmake can pull from repositories if required.
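
For what it's worth, pulling straight from a repository instead of a
tarball looks like it's just a different set of ExternalProject_Add
arguments -- the repository URL and tag here are made up:

    ExternalProject_Add (somedep
        GIT_REPOSITORY https://github.com/example/somedep.git  # placeholder
        GIT_TAG v1.2.3                                         # placeholder
        CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${INSTALL_PREFIX}
    )
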
> 
>> Is there a project with lots of dependencies that people feel have done all 
>> this in "the right way" that we should use as the model for our approach?
> 
> None that I know of.  I'd say the approach above is the least clunky
> thing I've found so far, but it's not perfect.  Tracking dependencies
> between third-party libs can be a little fragile this way, but
> ExternalProject has the infrastructure to make it work if
> you're careful.  If set up right, you can also have a properly
> parallel build of all the third party libs using make as the backend.
> This is really helpful if you've got a large list of dependencies.
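
If I follow, the inter-dependency tracking is just the DEPENDS argument,
and with a Makefile generator the whole set of external projects then
builds with ordinary 'make -jN' parallelism.  Another hypothetical
fragment:

    # libfoo is configured and built only after zlib has been installed;
    # unrelated dependencies still build in parallel under 'make -j8'.
    ExternalProject_Add (libfoo
        URL http://example.com/libfoo-1.0.tar.gz   # placeholder
        CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${INSTALL_PREFIX}
                   -DZLIB_ROOT:PATH=${INSTALL_PREFIX}
        DEPENDS zlib
    )
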
> 
> Cheers,
> ~Chris

--
Larry Gritz
[email protected]

