On Monday, 19 August 2013 at 17:35:39 UTC, John Colvin wrote:
On Monday, 19 August 2013 at 17:15:35 UTC, ProgrammingGhost wrote:
On Monday, 19 August 2013 at 11:01:54 UTC, Jacob Carlborg wrote:
The compiler will start compiling the files passed on the command line. It will read the files asynchronously and then lex, parse, build an AST and do semantic analysis.

When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.

The reason for waiting until semantic analysis is done is that you can have code looking like this:

mixin("import foo;");

The expansion of the mixin and other similar features is done in the semantic analysis phase.
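
For example, a module could pull in its dependency only through a string mixin (the use of std.stdio here is just an illustration), so the import is not visible from parsing alone:

module example;

// The import statement only exists after the string mixin is expanded,
// which happens during semantic analysis, not during lexing/parsing.
mixin("import std.stdio;");

void main()
{
    writeln("hello"); // only resolvable once the mixed-in import is in place
}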

So everything is parsed once and kept in memory until the compiler finishes every source file? Are there any RAM problems when compiling large codebases?

Unfortunately, yes, if you give dmd a very large number of files all at once, it will chew through all your free RAM. But dmd does support separate compilation:

$ dmd file1.d -c
$ dmd file2.d -c
$ dmd file1.o file2.o

which alleviates the problem.
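
(On Windows the object files would be .obj instead of .o. You can also name the final executable with the -of flag, e.g. $ dmd file1.o file2.o -ofmyapp.)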

My experience with D is limited. Are libraries the same as C libraries? From my understanding the linker figures that part out, and the compiler needs a separate file for the declarations. If I build a library in D, is it the same as a C library, or something different that also includes the function definitions?

Sorry if I'm confused; I know almost nothing about D. I stick to .NET, Java and C++.

Libraries in D use the same formats as C/C++ libraries.
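
For example (file names here are just for illustration), you can build a static library with dmd and link against it just as you would a C library:

$ dmd -lib foo.d          # produces foo.a on POSIX (foo.lib on Windows)
$ dmd main.d foo.a        # link it like any C static library

The difference from C is only in how the API is made visible: D has no header files, so the consumer either imports the original foo.d or a stripped-down interface file (.di) generated with dmd -H.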

Is it possible that if I just try to compile one file, it could import enough libraries that in turn need the definitions of additional large libraries, which in turn also import everything, causing RAM issues? I'm sure in practice this will almost never happen, but I don't doubt there are major libraries that use other large libraries, where everything imports/uses everything.
