BCS wrote:
In C# you almost never compile each source file separately; rather, you
compile a bunch of sources into an assembly all at once, and you provide
the list of other assemblies your code depends on. So the dependency is
at the package level rather than the file level. This makes much more
sense, since each assembly is a self-contained unit of functionality.
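The package-level dependency model described above can be sketched in a few lines: each "assembly" lists the assemblies it references, and a build order falls out of a topological sort. This is only an illustration; the assembly names (Core, Gui, App) are made up, and real tools like MSBuild do far more.

```python
# Hypothetical sketch: package-level dependencies between assemblies.
# Each assembly lists the assemblies it references; a depth-first
# topological sort yields an order in which every assembly's
# references are built before the assembly itself.

def build_order(references):
    """Return assemblies ordered so dependencies come first."""
    order, seen = [], set()

    def visit(asm):
        if asm in seen:
            return
        seen.add(asm)
        for dep in references.get(asm, []):
            visit(dep)
        order.append(asm)

    for asm in references:
        visit(asm)
    return order

# Made-up project: App references Gui and Core; Gui references Core.
references = {
    "App.exe":  ["Gui.dll", "Core.dll"],
    "Gui.dll":  ["Core.dll"],
    "Core.dll": [],
}

print(build_order(references))  # Core.dll, then Gui.dll, then App.exe
```

The point is that the build graph has one node per assembly, not one per source file; changing a file only dirties its own assembly and the assemblies downstream of it.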

That is more or less what I thought it was. It also indicates that the design of C# assumes a build model that I think is a bad idea: the "big dumb all-or-nothing build," where a sub-part of a program is either up to date or rebuilt by recompiling everything in it.

C# has a different compilation model, which is what I was saying all along. However, I disagree with your assertion that this model is bad; it makes much more sense than the C++/D model. The idea is that each self-contained sub-component is compiled by itself. That self-contained component might as well be a single file; nothing in the above prevents this. Consider a project with 100 files where one specific feature is implemented by four tightly coupled classes, each in its own file. Each of those files depends on the rest. What's the best compilation strategy here? If you compile each file separately, then you parse all four files for each object file, which is completely redundant and makes little sense, since you'll need to recompile all of them anyway because of their dependencies.
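The redundancy in the four-file example above is easy to quantify with back-of-the-envelope arithmetic (a simplified cost model that counts only parses, ignoring caching and header-like tricks):

```python
# Four tightly coupled files, each depending on the other three.
n = 4

# Per-file compilation: producing each of the n object files means
# parsing that file plus its n-1 dependencies, so n parses per object.
parses_per_file_model = n * n

# Per-component compilation: one invocation parses each file once.
parses_per_component = n

print(parses_per_file_model)  # 16
print(parses_per_component)   # 4
```

Under this (admittedly crude) model, per-file compilation does n times the parsing work, and any change to one file dirties all four object files anyway, so the finer granularity buys nothing for this component.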



Last I heard, Re-Sharper is a VS plugin, not an IDE in its own right, and even if that has changed, it's still an IDE. Even so, my point is any-IDE vs. no-IDE, so it doesn't address my point.


My use of the term IDE here is a loose one. Let me rephrase:
Yes, Re-Sharper is a plugin for VS. Without it, VS provides just text-editing features, and I don't consider it an IDE the way Eclipse is. Re-Sharper provides all the features of a real IDE for VS. So, while it's "just" a plugin, it's more important than VS itself.

So DWT depends on DSSS's meta data. That's a design choice of DWT, not of D. What I'm asserting is that C# projects' dependence on meta data is a design choice of C#, not of the project. A D project can (even if some don't) be practically designed so that it doesn't need that meta data, whereas, I will assert, C# projects, for practical purposes, can't do away with it.

--------------

What I was saying was not specific to DWT, but rather that _any_ reasonably big project will use such a system; it's simply not practical to do otherwise. How would you handle a project with a hundred files that takes 30 minutes to compile without any tool whatsoever except the compiler itself?
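For context, the core service such a build tool provides is dependency-driven rebuilding: skip any target that is already newer than everything it depends on. A minimal sketch of that check, using plain integers in a dict as stand-in file mtimes (no real filesystem involved):

```python
# Minimal make-style staleness check. `mtime` maps file names to
# fake modification times; real tools would use os.stat() instead.

def needs_rebuild(target, sources, mtime):
    """A target must be rebuilt if it is missing or older than any source."""
    if target not in mtime:
        return True
    return any(mtime[s] > mtime[target] for s in sources)

# Hypothetical state: a.o is up to date; b.d was edited after b.o was built.
mtime = {"a.d": 100, "b.d": 250, "a.o": 200, "b.o": 200}

print(needs_rebuild("a.o", ["a.d"], mtime))  # False
print(needs_rebuild("b.o", ["b.d"], mtime))  # True
```

On a hundred-file, 30-minute build, automating exactly this check is what turns most edits into seconds of recompilation instead of a full rebuild.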


I'm fine with any build system you want to have implemented, as long as a tool stack can still be built that works like the current one. That is, one that can practically:

- support projects that need no external meta data
- produce monolithic OS native binary executables
- work with the only language aware tool being the compiler

I don't expect it to require that projects be done that way, and I wouldn't take any issue with a tool stack that didn't fit that list. What I /would/ take issue with is if the language (okay, or DMD in particular) were altered to the point that one or more of those *couldn't* be done.


Your points are skewed, IMO.
> - support projects that need no external meta data
This is only practical for small projects, and it works the same way in both languages.

> - produce monolithic OS native binary executables
That is unrelated to our topic. Yes, .NET uses byte-code rather than native executables, but I never said I want that aspect brought to D.
> - work with the only language aware tool being the compiler
Again, this is only practical for small-to-mid-size projects in both languages.


Just to clarify: you _can_ compile C# files one at a time, just as you would with C or D, and there is an output format for that which is not an assembly (a .NET module, produced with the compiler's /target:module option).
