On Monday, 19 August 2013 at 11:01:54 UTC, Jacob Carlborg wrote:
The compiler will start compiling the files passed on the command line. It will read the files asynchronously, then lex and parse them, build an AST, and do semantic analysis.

When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.

The reason for waiting until semantic analysis is done is that you can have code looking like this:

mixin("import foo;");

The expansion of the mixin and other similar features is done in the semantic analysis phase.
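To make the point above concrete, here is a minimal sketch (the module name `foo` and the function `bar` are hypothetical, chosen just for illustration): the import only becomes visible once the mixin string is expanded during semantic analysis, which is why the compiler cannot know the full set of imports after parsing alone.

```d
// foo.d -- hypothetical module, assumed for this example
module foo;
int bar() { return 42; }

// main.d -- the string mixin expands to a regular
// import declaration during semantic analysis
void main()
{
    mixin("import foo;"); // equivalent to: import foo;
    assert(bar() == 42);  // foo's symbols are now in scope
}
```

After expansion the compiler treats this exactly like a written-out `import foo;`, so it can then kick off the same lex/parse/analyze process for the imported module.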

So everything is parsed once and kept in memory until the compiler finishes every source file? Are there any RAM problems when compiling large codebases? My experience with D is limited. Are libraries the same as C libraries? From my understanding the linker figures that part out, and the compiler needs a separate file for the declarations. If I build a library in D, is it the same as a C library, or something different which includes function definitions?

Sorry if I'm confused; I know almost nothing about D. I stick to .NET, Java and C++.
