Tom S wrote:
Walter Bright wrote:
Tom S wrote:
Walter Bright wrote:
I don't really understand why the -lib approach is not working for your needs.

I'm not sure what you mean by "the -lib approach". How exactly do you apply it to incremental compilation? If my project has a few hundred modules and I change just one line in one function, I don't want to rebuild all of it with -lib again. I thought you were referring to the proof-of-concept incremental build tool I posted yesterday, which used -multiobj; it should be possible to optimize that one using -lib... I just haven't tried it yet.

You only have to build one source file with -lib, not all of them.

So you mean compiling each file separately?

Yes. Or a subset of the files.

That's only an option if we turn to the C/C++ way of doing projects - using .di files just like C headers - *everywhere*. Only then can changes in .d files be localized to just one module and compiled quickly. Java and C# do without header files because (to my knowledge) they have no means of changing what's compiled based on the contents of an imported module (basically they lack metaprogramming).
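
Something along these lines (module and symbol names made up) shows what I mean - a change in one module can change what gets compiled in another:

    // config.d (hypothetical)
    module config;
    enum bool useFastPath = true;

    // algo.d - branches at compile time on the contents of config
    module algo;
    import config;

    static if (useFastPath)
        int compute(int x) { return x << 1; }
    else
        int compute(int x) { return x * 2; }

Flip useFastPath in config.d and a different compute() ends up in algo's object file, so algo.d has to be rebuilt even though its own source didn't change.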

So we can either give up and do it the C/C++ way, with lots of duplicated code in headers (C++ is actually better off here, since it lets you declare a class in the header and implement its methods in the .cpp file, whereas the .d/.di approach forces you to restate the complete class and fill in the member functions), or we can have an incremental build tool that doesn't suck.
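
To make the contrast concrete, a hand-maintained interface file would look roughly like this (names made up); the bodies can be stripped, but the whole class still has to be restated, and D has no equivalent of C++'s out-of-class Widget::update() definitions:

    // widget.di (hypothetical)
    module widget;

    class Widget
    {
        // full class declaration duplicated from widget.d and
        // kept in sync by hand; only the bodies are dropped
        int count();
        void update();
    }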

This is the picture as I see it:

* I need to rebuild all modules that import the changed modules, because some code in them might evaluate differently (static ifs that depend on the imported modules, for instance, like the config.d/algo.d sketch above - I explained that in my first post in this topic).

* I need to compile them all at once, because running the compiler on each of them separately yields massively longer total compile times.

* With your suggestion of using -lib, I assumed that you were suggesting building all these modules at once into a lib and then figuring out what to do with their object files one by one.

* Some object files need to be extracted from the lib and passed to the linker explicitly, because otherwise module ctors won't be linked into the executable - the linker only pulls an object out of a lib when it resolves an otherwise-undefined symbol, so an object that nothing happens to reference never gets in, module ctors included.

* As this is incremental compilation, there will also be object files left over from the previous build, some of which must not be linked again, because that would cause multiple definition errors.

* The obsoleted object files can't simply be removed, since they might contain comdat symbols that objects outside of the newly compiled set still need (I gave an example in my first post; the sketch right after this list shows the kind of D code that triggers it). Thus they have to be moved into a lib and only pulled into linking on demand.
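
A rough sketch of the comdat situation (names made up):

    // tmpl.d
    module tmpl;
    T twice(T)(T x) { return x + x; }

    // a.d and b.d both instantiate twice!int
    module a;
    import tmpl;
    int fa() { return twice(1); }

    module b;
    import tmpl;
    int fb() { return twice(2); }

When a.d and b.d are compiled in one command, the instantiation of twice!int may be emitted into only one of the object files, say a's, with b's object merely referencing it. If a.d is later recompiled and its old object simply deleted, b's object - which was not recompiled - is left with an unresolved symbol. Keeping the old object in a lib lets the linker still pull that comdat out of it when nothing else provides it.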

That's how my experimental build tool maps to the "-lib approach".

What you can try is creating a database that is basically a lib (call it A.lib) of all the modules compiled with -lib. Then recompile all modules that depend on changed modules in one command, also with -lib, call it B.lib. Then for all the obj's in B, replace the corresponding ones in A.
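
If I read that right, on Linux it would look roughly like this (module and file names are placeholders, and ar stands in for whatever librarian the platform uses - on Windows it would be the DigitalMars lib tool):

    # full build: one object per module, archived into A.lib
    dmd -lib -ofA.lib a.d b.d c.d

    # b.d changed: recompile it together with everything that imports it
    dmd -lib -ofB.lib b.d c.d

    # overwrite the stale members of A.lib with the fresh ones from B.lib
    mkdir tmp && cd tmp
    ar x ../B.lib
    ar r ../A.lib *.o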
