Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce
I wanted to work on this a little more before announcing it, but 
it seems I'm going to be busy trying to get unit-threaded into 
std.experimental, so here it is:


http://code.dlang.org/packages/reggae

If you're wondering about the name, it's because it's supposed to 
build on dub.


You might wonder at some of the design decisions. Some of them 
are solutions to weird problems caused by writing build 
descriptions in a compiled language; others I'm not so sure about. 
Should compiler flags be an array of strings or a single string? 
I got tired of typing square brackets, so it's a string for now.


Please let me know if the API is suitable or not, preferably by 
trying to actually use it to build your software.


Existing dub projects might work by just doing this from a build 
directory of your choice: "reggae -b make /path/to/project". That 
should generate a Makefile (or the equivalent Ninja files if `-b 
ninja` is used) to do what `dub build` usually does. It _should_ 
work for all dub projects, but it doesn't right now. For at least 
a few projects that's due to bugs in `dub describe`. For others it 
might be bugs in reggae, I don't know yet. Any dub.json file 
that uses dub configurations extensively is likely not to work.


Features:

. Make and Ninja backends (tup will be the next one)
. Automatically imports dub projects and writes the reggae build 
configuration
. Access to all objects to be built with dub (including 
dependencies) when writing custom builds (reggae does this itself)

. Out-of-tree builds, like CMake
. Arbitrary build rules, plus pre-built, easy-to-use higher-level 
targets
. Separate compilation. One file changes, only one file gets 
rebuilt

. Automatic dependency detection for D, C, and C++ source files
. Can build itself (but includes too many object files, another 
`dub describe` bug)


There are several runnable examples in the features directory, in 
the form of Cucumber tests. They include linking D code to C++.
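
To give a flavour of the low-level API, a build description looks 
roughly like this (a simplified sketch: the import line and the 
exact compiler flags are illustrative rather than definitive):

import reggae;  // assumed top-level module name

// A Target is output name + shell command ($in/$out placeholders) + inputs.
const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out",
                        Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out",
                        Target("src/maths.d"));
// The final binary depends on both object files.
const app = Target("myapp", "dmd -of$out $in", [mainObj, mathsObj]);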


I submitted a proposal to talk about this at DConf but I'll be 
talking about testing instead. Maybe next year? Anyway, destroy!


Atila


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately caught my eye as a huge "no" in the description. 
We must ban C-style separate compilation; there is simply no way 
to move forward otherwise. At the very least, we should not 
endorse it in any way.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce
Also I don't see any point in yet another meta build system. The 
very point of the initial discussion was about getting a D-only 
cross-platform solution that won't require installing any 
additional software beyond a working D compiler.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of initial discussion was about getting D only 
cross-platform solution that won't require installing any 
additional software but working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does but 
for builds), and also something that just builds it on the spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that already 
do that (and do it well) was the obvious choice for me.


Also, Ninja is _really_ fast.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there is 
simply no way to move forward otherwise. At the very least not 
endorse it in any way.


I understand that. But:

1. One of D's advantages is fast compilation. I don't think that 
means we should compile everything all the time just because we 
can (it's fast anyway!)
2. There are measurable differences in compile time. While 
working on reggae I got much faster edit-compile-unittest cycles 
because of separate compilation
3. This is valuable feedback. I was wondering what everybody else 
would think. It could be configurable, your "not endorse it in 
any way" notwithstanding. I for one would rather have it compile 
separately
4. CTFE and memory consumption can go through the roof 
(anecdotally anyway; it's never been a problem for me) when 
compiling everything at once.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Ben Boeckel via Digitalmars-d-announce
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via Digitalmars-d-announce 
wrote:
> On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> > . Separate compilation. One file changes, only one file gets 
> > rebuilt
> 
> This immediately has caught my eye as huge "no" in the 
> description. We must ban C style separate compilation, there is 
> simply no way to move forward otherwise. At the very least not 
> endorse it in any way.

Why? Other than the -fversion=... stuff, what is really blocking this? I
personally find unity builds to not be worth it, but I don't see
anything blocking separate compilation for D if dependencies are set up
properly.

--Ben


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of initial discussion was about getting D only 
cross-platform solution that won't require installing any 
additional software but working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does 
but for builds), and also something that just builds it on the 
spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that 
already do that (and well) first was the obvious (for me) 
choice.


Also, Ninja is _really_ fast.


The thing is, it may actually affect the API. The way I had 
originally envisioned it, any legal D code would be allowed for 
build commands instead of a pure DSL approach. So instead of 
providing a high-level abstraction like this:


const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out", Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out", Target("src/maths.d"));
const app      = Target("myapp",   "dmd -of$out $in", [mainObj, mathsObj]);


.. you instead define dependency building blocks in D domain:

struct App
{
    enum  path = "./myapp";
    alias deps = Depends!(mainObj, mathsObj);

    static void generate()
    {
        import std.exception : enforce;
        import std.process : execute;
        // run the compiler over the dependencies' outputs;
        // execute returns a tuple whose status is 0 on success
        enforce(execute(["dmd", "-ofmyapp",
                         deps[0].path, deps[1].path]).status == 0);
    }
}

And provide higher-level helper abstractions on top of that, 
tuned for D projects. This is just random syntax I've invented as 
an example, of course. It is already possible to write decent 
cross-platform scripts in D - only a dependency-tracking library 
is missing. But of course that would make using other build 
systems as backends impossible.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> . Separate compilation. One file changes, only one file gets 
> rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there 
is simply no way to move forward otherwise. At the very least 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

It complicates whole-program optimization. Old-school object 
files are simply not good enough to preserve the information 
necessary to produce optimized builds, and we are not in a 
position to create our own metadata + linker combo to circumvent 
that. This also applies to attribute inference, which has become 
a really important development direction for handling the growing 
attribute hell.


During the last D Berlin Meetup we had an interesting conversation 
about attribute inference with Martin Nowak, and dropping legacy 
C-style separate compilation seemed to be recognized as 
unavoidable for implementing anything decent in that domain.


2)

Ironically, it is just very slow. Those who come from the C world 
are used to using separate compilation to speed up rebuilds, but 
it doesn't work that way in D. It may look better if you change 
only 1 or 2 modules, but as the number of modified modules grows, 
an incremental rebuild quickly becomes _slower_ than a full 
program build with all files processed in one go. It can sometimes 
result in an order-of-magnitude slowdown (personal experience).


The difference from C is that repeated imports are very cheap in D 
(you don't copy-paste module content again and again like with 
headers), but at the same time semantic analysis of an imported 
module is more expensive (because D semantics are more 
complicated). When you do separate compilation you discard the 
already-processed imports and repeat that work from the very 
beginning for each newly compiled file, accumulating a huge 
slowdown for the application in total.


To get the best compilation speed in D you want to process as many 
modules with shared imports in one go as possible. At the same 
time, for really big projects that stops being feasible at some 
point, especially if CTFE is heavily used and memory consumption 
explodes. In that case the best approach is partial separate 
compilation - decoupling parts of a program into static libraries 
and compiling the libraries in parallel, but still compiling each 
library in one go. That gives you parallelization without redoing 
the same costly work again and again.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:40:42 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
Also I don't see any point in yet another meta build system. 
The very point of initial discussion was about getting D only 
cross-platform solution that won't require installing any 
additional software but working D compiler.


I was also thinking of a binary backend (producing a binary 
executable that does the build, kinda like what ctRegex does 
but for builds), and also something that just builds it on the 
spot.


The thing is, I want to get feedback on the API first and 
foremost, and delegating the whole 
do-I-or-do-I-not-need-to-build-it logic to programs that 
already do that (and well) first was the obvious (for me) 
choice.


Also, Ninja is _really_ fast.


The thing is, it may actually affect API. The way I have 
originally expected it, any legal D code would be allowed for 
build commands instead of pure DSL approach. So instead of 
providing high level abstraction like this:


const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out", Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out", Target("src/maths.d"));
const app      = Target("myapp",   "dmd -of$out $in", [mainObj, mathsObj]);


.. you instead define dependency building blocks in D domain:

struct App
{
    enum  path = "./myapp";
    alias deps = Depends!(mainObj, mathsObj);

    static void generate()
    {
        import std.exception : enforce;
        import std.process : execute;
        // run the compiler over the dependencies' outputs;
        // execute returns a tuple whose status is 0 on success
        enforce(execute(["dmd", "-ofmyapp",
                         deps[0].path, deps[1].path]).status == 0);
    }
}

And provide higher level helper abstractions on top of that, 
tuned for D projects. This is just random syntax I have just 
invented for example of course. It is already possible to write 
decent cross-platform scripts in D - only dependency tracking 
library is missing. But of course that would make using other 
build systems as backends impossible.


Well, I took your advice (and one of my acceptance tests is based 
off of your simplified real-work example) and started with the 
low-level any-command-will-do API first. I built the high-level 
ones on top of that. It doesn't seem crazy to me that certain 
builds can only be done by certain backends. The fact that the 
make backend can track C/C++/D dependencies wasn't a given and 
the implementation is quite ugly.


In any case, the Target structs aren't high-level abstractions, 
they're just data. Data that can be generated by any code. Your 
example is basically how the `dExe` rule works: run dmd at 
run-time, collect dependencies and build all the `Target` 
instances. You could have a D backend that outputs (then compiles 
and runs) your example. The "only" problem I can see is execution 
speed.


Maybe I didn't include enough examples.
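
For instance, something along these lines - ordinary run-time D 
code computing the object targets, where the helper name, module 
name and flags are all made up for illustration:

import reggae;                          // assumed module name
import std.algorithm : map;
import std.array : array;
import std.path : baseName, setExtension;

// Hypothetical helper: plain D code that turns a list of D sources
// into object-file targets - no DSL involved.
Target[] objectTargets(string[] sources)
{
    return sources
        .map!(s => Target(s.baseName.setExtension("o"),
                          "dmd -c $in -of$out", Target(s)))
        .array;
}

// The final binary depends on whatever targets the code above produced.
Target appTarget(string[] sources)
{
    return Target("myapp", "dmd -of$out $in", objectTargets(sources));
}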

I also need to think of your example a bit more.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:22:42 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there 
is simply no way to move forward otherwise. At the very least 
not endorse it in any way.


I understand that. But:

1. One of D's advantages is fast compilation. I don't think 
that means we should compile everything all the time just 
because we can (it's fast anyway!)
2. There are measurable differences in compile time. While 
working on reggae I got much faster edit-compile-unittest 
cycles because of separate compilation
3. This is valuable feedback. I was wondering what everybody 
else would think. It could be configurable, your "not endorse 
it in any way" notwithstanding. I for one would rather have it 
compile separately
4. CTFE and memory consumption can go through the roof 
(anecdotally anyway; it's never been a problem for me) when 
compiling everything at once.


See 
http://forum.dlang.org/post/nhaoahnqucqkjgdwt...@forum.dlang.org


tl;dr: separate compilation support is necessary, but not at the 
single-module level.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:59:22 UTC, Atila Neves wrote:
Well, I took your advice (and one of my acceptance tests is 
based off of your simplified real-work example) and started 
with the low-level any-command-will-do API first. I built the 
high-level ones on top of that. It doesn't seem crazy to me 
that certain builds can only be done by certain backends. The 
fact that the make backend can track C/C++/D dependencies 
wasn't a given and the implementation is quite ugly.


In any case, the Target structs aren't high-level abstractions, 
they're just data. Data that can be generated by any code. Your 
example is basically how the `dExe` rule works: run dmd at 
run-time, collect dependencies and build all the `Target` 
instances. You could have a D backend that outputs (then 
compiles and runs) your example. The "only" problem I can see 
is execution speed.


Maybe I didn't include enough examples.

I also need to think of your example a bit more.


I may have misunderstood how it works, judging only by the 
provided examples. Give me a bit more time to investigate the 
actual sources and I may reconsider :)


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> . Separate compilation. One file changes, only one file 
> gets rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there 
is simply no way to move forward otherwise. At the very least 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

Complicates whole-program optimization possibilities. Old 
school object files are simply not good enough to preserve 
information necessary to produce optimized builds and we are 
not in position to create own metadata + linker combo to 
circumvent that. This also applies to attribute inference which 
has become a really important development direction to handle 
growing attribute hell.


During last D Berlin Meetup we had an interesting conversation 
on attribute inference topic with Martin Nowak and dropping 
legacy C-style separate compilation seemed to be recognized as 
unavoidable to implement anything decent in that domain.


2)

Ironically, it is just very slow. Those who come from C world 
got used to using separate compilation to speed up rebuilds but 
it doesn't work that way in D. It may look better if you change 
only 1 or 2 module but as amount of modified modules grows, 
incremental rebuild quickly becomes _slower_ than full program 
build with all files processed in one go. It can sometimes 
result in order of magnitude slowdown (personal experience).


Difference from C is that repeated imports are very cheap in D 
(you don't copy-paste module content again and again like with 
headers) but at the same time semantic analysis of imported 
module is more expensive (because D semantics are more 
complicated). When you do separate compilation you discard 
already processed imports and repeat it again and again from 
the very beginning for each new compiled file, accumulating 
huge slowdown for application in total.


To get best compilation speed in D you want to process as many 
modules with shared imports at one time as possible. At the 
same time for really big projects it becomes not feasible at 
some point, especially if CTFE is heavily used and memory 
consumption explodes. In that case best approach is partial 
separate compilation - decoupling parts of a program as static 
libraries and doing parallel compilation of each separate 
library - but still compiling each library in one go. That 
allows to get parallelization without doing the same costly 
work again and again.


Interesting.

It's true that it's not always faster to compile each module 
separately; I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if it's 10x slower, the linker will take longer 
anyway, because it'll all still be under a second. That's been my 
experience anyway, i.e. it's either faster or it doesn't make 
much of a difference.


All I know is I've seen a definite improvement in my 
edit-compile-unittest cycle by compiling modules separately.


How would the decoupling happen? Is the user supposed to 
partition the binary into suitable static libraries? Or is the 
system supposed to be smart enough to figure that out?


Atila




Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread weaselcat via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
On Fri, Apr 03, 2015 at 17:10:31 +, Dicebot via 
Digitalmars-d-announce wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> . Separate compilation. One file changes, only one file 
> gets rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there 
is simply no way to move forward otherwise. At the very least 
not endorse it in any way.


Why? Other than the -fversion=... stuff, what is really blocking 
this? I personally find unity builds to not be worth it, but I 
don't see anything blocking separate compilation for D if 
dependencies are set up properly.

--Ben


There are 2 big problems with C-style separate compilation:

1)

Complicates whole-program optimization possibilities. Old 
school object files are simply not good enough to preserve 
information necessary to produce optimized builds and we are 
not in position to create own metadata + linker combo to 
circumvent that. This also applies to attribute inference which 
has become a really important development direction to handle 
growing attribute hell.


Not sure about other people, but I do not care about whole 
program optimization during an edit-compile-run cycle. I just 
want it to compile as fast as possible, and if I change one or 
two files I don't want to have to recompile an entire codebase.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Jacob Carlborg via Digitalmars-d-announce

On 2015-04-03 19:03, Atila Neves wrote:

I wanted to work on this a little more before announcing it, but it
seems I'm going to be busy working on trying to get unit-threaded into
std.experimental so here it is:

http://code.dlang.org/packages/reggae


One thing I noticed immediately (unless I'm mistaken) is that compiling a D 
project without dependencies is too complicated. It should just be:


$ cd my_d_project
$ reggae

--
/Jacob Carlborg


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread weaselcat via Digitalmars-d-announce

On Friday, 3 April 2015 at 19:07:09 UTC, Jacob Carlborg wrote:

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module 
separately, I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if 10x slower, the linker will take longer 
anyway. Because it'll all still be under a second. That's been my 
experience anyway. i.e. It's either faster or it doesn't make 
much of a difference.


I just tried compiling one of my project. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The makefile 
takes 5.3 seconds, does not including linking since it builds a 
library. The shell script takes 1.3 seconds which include 
compiling unit tests and linking as well.


change one file and see which one is faster with an incremental 
build.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Jacob Carlborg via Digitalmars-d-announce

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module separately,
I already knew that. It seems to me, however, that when that's actually
the case, the practical difference is negligible. Even if 10x slower,
the linker will take longer anyway. Because it'll all still be under a
second. That's been my experience anyway. i.e. It's either faster or it
doesn't make much of a difference.


I just tried compiling one of my projects. It has a makefile that does 
separate compilation and a shell script I use for unit testing which 
compiles everything in one go. The makefile takes 5.3 seconds, not 
including linking since it builds a library. The shell script takes 1.3 
seconds, which includes compiling the unit tests and linking as well.


--
/Jacob Carlborg


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 10:10 AM, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:

. Separate compilation. One file changes, only one file gets rebuilt


This immediately has caught my eye as huge "no" in the description. We
must ban C style separate compilation, there is simply no way to move
forward otherwise. At the very least not endorse it in any way.


Agreed. D build style should be one invocation per package. -- Andrei


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 11:06 AM, Atila Neves wrote:


It's true that it's not always faster to compile each module separately,
I already knew that. It seems to me, however, that when that's actually
the case, the practical difference is negligible. Even if 10x slower,
the linker will take longer anyway. Because it'll all still be under a
second. That's been my experience anyway. i.e. It's either faster or it
doesn't make much of a difference.


Whoa. The difference is much larger (= day and night) on at least a 
couple of projects at work.



All I know is I've seen a definite improvement in my
edit-compile-unittest cycle by compiling modules separately.

How would the decoupling happen? Is the user supposed to partition the
binary into suitable static libraries? Or is the system supposed to be
smart enough to figure that out?


Smarts would be nice, but in first approximation one package = one 
compilation unit is a great policy.



Andrei



Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 18:06:42 UTC, Atila Neves wrote:
All I know is I've seen a definite improvement in my 
edit-compile-unittest cycle by compiling modules separately.


How would the decoupling happen? Is the user supposed to 
partition the binary into suitable static libraries? Or is the 
system supposed to be smart enough to figure that out?


Ideally both. The build system should be smart enough to group 
modules into static libraries automatically if the user doesn't 
care (Andrei's suggestion of one package per library makes sense), 
but an option for explicit definition of compilation units is 
still necessary, of course.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Dicebot via Digitalmars-d-announce

On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my project. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The makefile 
takes 5.3 seconds, does not including linking since it builds 
a library. The shell script takes 1.3 seconds which include 
compiling unit tests and linking as well.


change one file and see which one is faster with an incremental 
build.


I don't care if an incremental build is 10x faster if a full build 
still stays at ~1 second. However, I do care (and consider it 
unacceptable) if support for incremental builds makes a full build 
10 seconds long.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-03 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/3/15 12:07 PM, Jacob Carlborg wrote:

On 2015-04-03 20:06, Atila Neves wrote:


Interesting.

It's true that it's not always faster to compile each module separately,
I already knew that. It seems to me, however, that when that's actually
the case, the practical difference is negligible. Even if 10x slower,
the linker will take longer anyway. Because it'll all still be under a
second. That's been my experience anyway. i.e. It's either faster or it
doesn't make much of a difference.


I just tried compiling one of my project. It has a makefile that does
separate compilation and a shell script I use for unit testing which
compiles everything in one go. The makefile takes 5.3 seconds, does not
including linking since it builds a library. The shell script takes 1.3
seconds which include compiling unit tests and linking as well.


Truth be told that's 5.3 seconds for an entire build so the comparison 
is only partially relevant. -- Andrei




Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Atila Neves via Digitalmars-d-announce

On Friday, 3 April 2015 at 19:54:09 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my project. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The 
makefile takes 5.3 seconds, does not including linking since 
it builds a library. The shell script takes 1.3 seconds which 
include compiling unit tests and linking as well.


change one file and see which one is faster with an 
incremental build.


I don't care if incremental build is 10x faster if full build 
still stays at ~1 second. However I do care (and consider 
unacceptable) if support for incremental builds makes full 
build 10 seconds long.


I'm of the opposite opinion. I don't care if full builds take 1h 
as long as incremental builds are as fast as possible. Why would 
I keep doing full builds? That's like git cloning multiple times. 
What for?


What's clear is that I need to try Andrei's per-package idea, at 
least as an option, if not the default. Having a large D codebase 
to test it on would be nice as well, but I don't know of anything 
bigger than Phobos.


Atila


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Jacob Carlborg via Digitalmars-d-announce

On 2015-04-03 19:54, Dicebot wrote:


2)

Ironically, it is just very slow. Those who come from C world got used
to using separate compilation to speed up rebuilds but it doesn't work
that way in D. It may look better if you change only 1 or 2 module but
as amount of modified modules grows, incremental rebuild quickly becomes
_slower_ than full program build with all files processed in one go. It
can sometimes result in order of magnitude slowdown (personal experience).


BTW, are all the issues with incremental rebuilds solved? I.e. templates 
not being output to all object files, and other problems I can't remember 
right now.


--
/Jacob Carlborg


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Atila Neves via Digitalmars-d-announce
On Friday, 3 April 2015 at 19:45:38 UTC, Andrei Alexandrescu 
wrote:

On 4/3/15 10:10 AM, Dicebot wrote:

On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
. Separate compilation. One file changes, only one file gets 
rebuilt


This immediately has caught my eye as huge "no" in the 
description. We must ban C style separate compilation, there is 
simply no way to move forward otherwise. At the very least not 
endorse it in any way.


Agreed. D build style should be one invocation per package. -- 
Andrei


Just to clarify, reggae has:

1. Low-level building blocks that can be used for pretty much 
anything

2. High-level convenience rules

There's nothing about #1 that forces per-module compilation. It 
doesn't force anything; it's just data definition.


The current implementations of #2, namely dExe and the dub 
integration, spit out build systems that compile per module, but 
that can easily be changed or even configured.


Even now it's perfectly possible to define a build system for a D 
project with per-package compilation; it'll just take more typing, 
roughly as in the sketch below.
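
Something like this, with one compiler command per package instead 
of per module (the package layout and flags here are made up for 
illustration):

// Hypothetical layout: two packages, foo and bar, each compiled in one go.
const fooObj = Target("foo.o", "dmd -c $in -of$out",
                      [Target("src/foo/a.d"), Target("src/foo/b.d")]);
const barObj = Target("bar.o", "dmd -c $in -of$out",
                      [Target("src/bar/c.d"), Target("src/bar/d.d")]);
// Link the two per-package object files into the final binary.
const app = Target("myapp", "dmd -of$out $in", [fooObj, barObj]);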


Atila


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Atila Neves via Digitalmars-d-announce
On Friday, 3 April 2015 at 19:49:04 UTC, Andrei Alexandrescu 
wrote:

On 4/3/15 11:06 AM, Atila Neves wrote:


It's true that it's not always faster to compile each module 
separately, I already knew that. It seems to me, however, that 
when that's actually the case, the practical difference is 
negligible. Even if 10x slower, the linker will take longer 
anyway. Because it'll all still be under a second. That's been my 
experience anyway. i.e. It's either faster or it doesn't make 
much of a difference.


Whoa. The difference is much larger (= day and night) on at 
least a couple of projects at work.


Even when only one file has changed?

Atila


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/4/15 1:30 AM, Atila Neves wrote:

On Friday, 3 April 2015 at 19:49:04 UTC, Andrei Alexandrescu wrote:

On 4/3/15 11:06 AM, Atila Neves wrote:


It's true that it's not always faster to compile each module separately,
I already knew that. It seems to me, however, that when that's actually
the case, the practical difference is negligible. Even if 10x slower,
the linker will take longer anyway. Because it'll all still be under a
second. That's been my experience anyway. i.e. It's either faster or it
doesn't make much of a difference.


Whoa. The difference is much larger (= day and night) on at least a
couple of projects at work.


Even when only one file has changed?


Yes; due to interdependencies, it's rare that only one file gets 
compiled. -- Andrei




Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Kagamin via Digitalmars-d-announce

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:
Complicates whole-program optimization possibilities. Old 
school object files are simply not good enough to preserve 
information necessary to produce optimized builds and we are 
not in position to create own metadata + linker combo to 
circumvent that.


Development builds are usually not whole-program optimized. And 
proper optimizers work with IR and see no problem with separate 
compilation; it's all transparent. Separate compilation is nice 
for RAM too - good in a virtualized environment like a CI service.


This also applies to attribute inference which has become a 
really important development direction to handle growing 
attribute hell.


Depends on code style.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Kagamin via Digitalmars-d-announce

On Saturday, 4 April 2015 at 07:44:12 UTC, Atila Neves wrote:
I'm of the opposite opinion. I don't care if full builds take 
1h as long as incremental builds are as fast as possible. Why 
would I keep doing full builds? That's like git cloning 
multiple times. What for?


A full build is important when you do it only once, e.g. if you 
want to try a new version of a program and it's not precompiled, 
you'll need to compile it from source and never recompile.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Dicebot via Digitalmars-d-announce

On Saturday, 4 April 2015 at 16:58:23 UTC, Kagamin wrote:

On Friday, 3 April 2015 at 17:55:00 UTC, Dicebot wrote:
Complicates whole-program optimization possibilities. Old 
school object files are simply not good enough to preserve 
information necessary to produce optimized builds and we are 
not in position to create own metadata + linker combo to 
circumvent that.


Development builds are usually not whole-program optimized. And 
proper optimizers work with IR and see no problem in separate 
compilation, it's all transparent. Separate compilation is nice 
for RAM too - good in virtualized environment like a CI service.


We need solutions that can be reasonably implemented with 
existing resources, not perfect solutions. Storing IR in object 
files and using a custom linker is the "correct" approach for WPO, 
but it is currently unaffordable. Add the compilation-time 
problems and there seems to be no compelling reason to go that 
route for now.


This also applies to attribute inference which has become a 
really important development direction to handle growing 
attribute hell.


Depends on code style.


I am not aware of any solutions based on coding style. Can you 
elaborate?


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Dicebot via Digitalmars-d-announce

On Saturday, 4 April 2015 at 07:44:12 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 19:54:09 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my project. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The 
makefile takes 5.3 seconds, does not including linking since 
it builds a library. The shell script takes 1.3 seconds 
which include compiling unit tests and linking as well.


change one file and see which one is faster with an 
incremental build.


I don't care if incremental build is 10x faster if full build 
still stays at ~1 second. However I do care (and consider 
unacceptable) if support for incremental builds makes full 
build 10 seconds long.


I'm of the opposite opinion. I don't care if full builds take 
1h as long as incremental builds are as fast as possible. Why 
would I keep doing full builds? That's like git cloning 
multiple times. What for?


What's clear is that I need to try Andrei's per-package idea, 
at least as an option, if not the default. Having a large D 
codebase to test it on would be nice as well, but I don't know 
of anything bigger than Phobos.


At work I often switch between a dozen different projects a day 
with a small chunk of changes for each. That means that incremental 
builds are never of any value.


Even if you consistently work with the same project, it is 
incredibly rare to have a changeset contained in a single module. 
And once there are at least 5 changed modules (including 
inter-dependencies), the build becomes long enough already.


As for a test codebase - I know that Martin has been testing his GC 
improvements on Higgs (https://github.com/higgsjs/Higgs); it could 
be a suitable test subject for you too.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-04 Thread Atila Neves via Digitalmars-d-announce

On Saturday, 4 April 2015 at 19:56:28 UTC, Dicebot wrote:

On Saturday, 4 April 2015 at 07:44:12 UTC, Atila Neves wrote:

On Friday, 3 April 2015 at 19:54:09 UTC, Dicebot wrote:

On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my project. It has a makefile 
that does separate compilation and a shell script I use for 
unit testing which compiles everything in one go. The 
makefile takes 5.3 seconds, does not including linking 
since it builds a library. The shell script takes 1.3 
seconds which include compiling unit tests and linking as 
well.


change one file and see which one is faster with an 
incremental build.


I don't care if incremental build is 10x faster if full build 
still stays at ~1 second. However I do care (and consider 
unacceptable) if support for incremental builds makes full 
build 10 seconds long.


I'm of the opposite opinion. I don't care if full builds take 
1h as long as incremental builds are as fast as possible. Why 
would I keep doing full builds? That's like git cloning 
multiple times. What for?


What's clear is that I need to try Andrei's per-package idea, 
at least as an option, if not the default. Having a large D 
codebase to test it on would be nice as well, but I don't know 
of anything bigger than Phobos.


At work I often switch between dozen of different projects a 
day with small chunk of changes for each. That means that 
incremental builds are never of any value.


Even if you consistently work with the same project it is 
incredibly rare to have a changeset contained in a single 
module. And if there are at least 5 changed modules (including 
inter-dependencies) it becomes long enough already.


As for test codebase - I know that Martin has been testing his 
GC improvements on Higgs (https://github.com/higgsjs/Higgs), 
could be a suitable test subject for you too.


It seems our workflows are very different. Half of the time I 
make changes to a file that only contains unit tests. That's 
always self-contained, and doing anything other than recompiling 
that one file and relinking is going to be slower.


It seems to me that different projects might benefit from 
different compilation strategies. It might just be a case of unit 
tests alongside production code vs. in separate files. As 
mentioned before, in my experience per-module compilation was 
usually faster, but I'm going to change the default to be per 
package.


Another cool thing about using reggae to build itself was 
building the unit test and production binaries at the same time. 
I couldn't really do that with dub alone.
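
Roughly like this - the production binary and the unit test binary 
are just two top-level targets in the same build description (file 
names made up for illustration):

// Both binaries are top-level targets in the same build description,
// so the generated Makefile/Ninja build can produce them in one go.
const app = Target("myapp", "dmd -of$out $in",
                   [Target("src/main.d"), Target("src/maths.d")]);
const ut  = Target("ut", "dmd -unittest -of$out $in",
                   [Target("tests/ut_main.d"), Target("src/maths.d")]);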


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-05 Thread Kagamin via Digitalmars-d-announce

On Saturday, 4 April 2015 at 19:59:46 UTC, Dicebot wrote:
We need solutions that can be reasonably implemented with 
existing resources, not perfect solutions. Storing IR in object 
files and using custom linker is "correct" approach for WPO but 
it is currently unaffordable.


Works for me with llvm toolchain.

Add compilation time problems and there seems to be no 
compelling reasons to go that route for now.


A compelling reason is memory consumption and exhaustion.


I am not aware of any solutions based on coding style.


Not sure what you mean; reliance on attribute hell is a coding 
style. You can look at any language which has no such problem.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-05 Thread Dicebot via Digitalmars-d-announce

On Sunday, 5 April 2015 at 12:17:09 UTC, Kagamin wrote:

On Saturday, 4 April 2015 at 19:59:46 UTC, Dicebot wrote:
We need solutions that can be reasonably implemented with 
existing resources, not perfect solutions. Storing IR in 
object files and using custom linker is "correct" approach for 
WPO but it is currently unaffordable.


Works for me with llvm toolchain.


Unless LDC does some D-specific WPO magic I am not aware of, this 
is not what your original statement was about.



I am not aware of any solutions based on coding style.


Not sure what you mean, reliance on attribute hell is a coding 
style. You can look at any language, which has no such problem.


Erm. Either it is a coding style issue or a language issue. Pick 
one. The only coding style for D I am aware of that deals with 
attribute hell is "ignore most attributes", which is hardly a 
solution. Please give any specific example to back your point.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-05 Thread Kagamin via Digitalmars-d-announce

On Sunday, 5 April 2015 at 12:22:15 UTC, Dicebot wrote:
Unless LDC does some D specific WPO magic I am not aware of 
this is not what your original statement was about.


LLVM does normal WPO in the sense that the compiled code is not opaque.

Erm. Either it is coding style issue or a language issue. Pick 
one. Only coding style for D I am aware of that deals with 
attribute hell is "ignore most attributes" which is hardly 
solution.


The problem can't be solved for coding styles which rely on 
attribute hell; I only said that the problem depends on coding style.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-05 Thread Andrei Alexandrescu via Digitalmars-d-announce

On 4/4/15 12:56 PM, Dicebot wrote:


Even if you consistently work with the same project it is incredibly
rare to have a changeset contained in a single module. And if there are
at least 5 changed modules (including inter-dependencies) it becomes
long enough already.


That's my experience as well. -- Andrei


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-06 Thread Sergei Nosov via Digitalmars-d-announce

On Sunday, 5 April 2015 at 00:22:35 UTC, Atila Neves wrote:
It seems to me that different projects might benefit from 
different compilation strategies. It might just be a case of 
unit tests alongside production code vs in separate files. As 
mentioned before, my experience with per-module compilation was 
usually faster, but I'm going to change the default to be per 
package.


I want to also share my experience in that regard.

When I was writing a vibe.d-based application, I used dub as a 
build system, which passes everything to the compiler in one go. 
My application was just a couple of files, so I was practically 
just building vibe.d every time.


I was developing the application on a desktop with 4 Gb RAM and 
everything was fine (albeit I was missing the "progress bar" of 
files in progress provided by ninja/make).


But then it was time to deploy the app, and I bought a 1 GB RAM 
virtual node from Linode. After executing dub it told me "Out of 
memory" and exited. And there was nothing I could do.


So I took the only option I saw - I switched to CMake (modified 
to work with D) to give me a separate-compilation, ninja-based 
build, and swore never again.


I understand the reasoning behind both separate and "throw in 
everything" compilation strategies. And I also understand the 
pros of a middle-ground solution (like, per-package one), which 
is probably the way D will go. But this area seems kind of gray 
to me (like, in my case the "per-package" solution wouldn't work 
either, if I understand it correctly).


So, personally, I will probably stick to separate compilation, 
until I see that:


- The pros of "batch" compilation are clear and, desirably, 
obvious. At the moment it seems to me (seems to me) that faster 
compilation and attribute inference just don't have a significant 
impact.
- There's a way to fine-tune between "separate" and "throw in 
everything" compilation if necessary.


Thanks!



Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-07 Thread Dicebot via Digitalmars-d-announce

On Sunday, 5 April 2015 at 12:50:52 UTC, Kagamin wrote:

On Sunday, 5 April 2015 at 12:22:15 UTC, Dicebot wrote:
Unless LDC does some D specific WPO magic I am not aware of 
this is not what your original statement was about.


llvm does normal WPO in a sense that compiled code is not 
opaque.


And I have never been speaking about "normal WPO", only about one 
specific to D semantics.


Erm. Either it is coding style issue or a language issue. Pick 
one. Only coding style for D I am aware of that deals with 
attribute hell is "ignore most attributes" which is hardly 
solution.


The problem can't be solved for coding styles, which rely on 
attribute hell, I only said the problem depends on coding style.


This sentence probably means something, but I was not able to 
figure it out even after re-reading it several times. "A coding 
style which relies on attribute hell" - what kind of weird beast 
is that?


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-07 Thread Dicebot via Digitalmars-d-announce

On Monday, 6 April 2015 at 11:29:20 UTC, Sergei Nosov wrote:

On Sunday, 5 April 2015 at 00:22:35 UTC, Atila Neves wrote:
It seems to me that different projects might benefit from 
different compilation strategies. It might just be a case of 
unit tests alongside production code vs in separate files. As 
mentioned before, my experience with per-module compilation 
was usually faster, but I'm going to change the default to be 
per package.


I want to also share my experience in that regard.

...


See, the problem with this approach is that you can trivially run 
out of 1GB of memory with DMD even when compiling a single module; 
all you need is enough compile-time magic. Separate compilation 
here delays the issue but does not actually solve it.


If any effort is to be put into supporting this scenario 
(on-server compilation), it is better spent on reducing the 
compiler's actual memory consumption, not on supporting another 
workaround.


Also, you can still achieve a similar profile by splitting your 
project into small enough static libraries, so it is not completely 
out of the question.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-07 Thread Kagamin via Digitalmars-d-announce

On Tuesday, 7 April 2015 at 08:28:08 UTC, Dicebot wrote:
And I have never been speaking about "normal WPO", only about 
one specific to D semantics.


AFAIK, the hypothetical D-specific optimizations were never 
implemented (like elision of pure calls and optimization of 
immutable data). But they work at the signature level, so they 
shouldn't be affected by separate compilation in any way.


This sentence probably means something but I were not able to 
figure it out even after re-reading it several times. "coding 
style which relies on attribute hell", what kind of weird beast 
that is?


I suppose your coding style could be an example; you wouldn't be 
interested in attribute hell otherwise.


Re: Reggae v0.0.5 super alpha: A build system in D

2015-04-07 Thread Sergei Nosov via Digitalmars-d-announce

On Tuesday, 7 April 2015 at 08:25:02 UTC, Dicebot wrote:
See, the problem with this approach is that you can trivially 
get out of 1GB of memory with DMD even when compiling single 
module, all you need is to do enough compile-time magic. 
Separate compilation here delays the issue but does not 
actually solve it.


Yeah, I absolutely agree. But at the moment separate compilation is 
the most "forgiving" approach: if it doesn't work, nothing else 
will work either. And given that I personally don't see the 
(possibly) increased compilation time as an issue, it's the 
solution that works for me.


If any effort is to be put into supporting this scenario 
(on-server compilation), it is better to be put in reducing 
actual memory hog of compiler, not supporting another 
workaround.


Agreed, too. The whole "forget about frees" approach sounds a 
little too controversial to me, especially after having faced 
the dark side of it. So I'm all for improving things in that 
regard, but it seems like it's not recognized as a (high-priority) 
issue at the moment. So we (the users) have to live with that.


Also you can still achieve the similar profile by splitting 
your project in small enough static libraries, so it is not 
completely out of question.


As I described, my project was just a couple of files. Building 
vibe.d was the actual problem. I don't think it is feasible to 
expect that a user of a library will start splitting it into 
"small enough libraries", when faced with this problem. A more 
structured approach is needed.