Re: A brief survey of build tools, focused on D

2018-12-17 Thread Olivier FAURE via Digitalmars-d-announce

On Sunday, 16 December 2018 at 00:17:55 UTC, Paul Backus wrote:
There's something important you're glossing over here, which is 
that, in the general case, there's no single obvious or natural 
way to compose two DAGs together.


For example: suppose project A's DAG has two "output" vertices 
(i.e., they have no outgoing edges), one corresponding to a 
"debug" build and one corresponding to a "release" build. Now 
suppose project B would like to depend on project A. For this 
to happen, our hypothetical DAG import function needs to add 
one or more edges that connect A's DAG to B's DAG. The question 
is, how many edges, and which vertices should these edges 
connect?


That doesn't seem right.

Surely you could write

externalDependencies = [ someSubmodule.release ]

in your language-specific build tool, and have it convert to an 
equivalent import edge targeting the correct vertex in the 
standardized dependency graph?
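
For illustration, in a hypothetical standardized graph format (the 
format and every field name below are invented for this sketch; no 
such standard exists), that directive could resolve to a single 
explicit import edge:

    {
      "nodes": [
        { "id": "someSubmodule.release", "outputs": ["libsub.a"] },
        { "id": "app.link",              "outputs": ["app"] }
      ],
      "edges": [
        { "from": "someSubmodule.release", "to": "app.link" }
      ]
    }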


It might be inconvenient in some cases (e.g. you have to manually 
tell your tool to import someSubmodule.release in release mode 
and someSubmodule.debug in debug mode), but it would still be 
infinitely more convenient and elegant than the current "multiple 
incompatible build tools per language, and shell/make scripts to 
link them together" paradigm.


Especially with the increasing usability of WebAsm, it would be 
nice to have a standard build model to link multiple modules in 
different languages together into a single wasm binary.


A standardized DAG model would also help in creating a project-wide 
equivalent to the Language Server Protocol. (I'm not convinced by 
BSP.)


Re: A brief survey of build tools, focused on D

2018-12-15 Thread Neia Neutuladh via Digitalmars-d-announce
On Sun, 16 Dec 2018 00:17:55 +0000, Paul Backus wrote:
> On Wednesday, 12 December 2018 at 22:41:50 UTC, H. S. Teoh wrote:
>> It's time we came back to the essentials.  Current monolithic build
>> systems ought to be split into two parts: [...]
> You're missing (0) the package manager, which is probably the biggest
> advantage "monolothic" build tools like dub, cargo, and npm have
> compared to language-agnostic ones like make.

If I were to make a new build tool and wanted package manager integration, 
I'd choose Maven as the backend. This would no doubt be more frustrating 
than just making my own, but there would hopefully be fewer bugs on the 
repository side.

(I might separately make my own Maven-compatible backend.)

> There's something important you're glossing over here, which is that, in
> the general case, there's no single obvious or natural way to compose
> two DAGs together.

You do it like Bazel.

In Bazel, you have a WORKSPACE file at the root of your project. It 
describes, among other things, what dependencies you have. This might, for 
instance, be a git URL and revision. All this does is expose that 
package's build rules to you.

Separately, you have build rules. Each build rule can express a set of 
dependencies on other build rules. There's no difference between depending 
on a rule that your own code defines and depending on one from an external 
dependency.
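
As a minimal sketch of those two layers (the repository URL and 
commit are placeholders; `git_repository` and `cc_library` are 
standard Bazel rules):

    # WORKSPACE -- expose project A's build rules to this workspace
    load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

    git_repository(
        name = "project_a",
        remote = "https://example.com/project_a.git",
        commit = "0123456789abcdef0123456789abcdef01234567",
    )

    # BUILD -- the external dependency looks just like a local one
    cc_library(
        name = "b_lib",
        srcs = ["b.c"],
        deps = ["@project_a//:release"],
    )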

It might be appropriate to have a hint on DAG nodes saying that this is 
the default thing that you should probably depend on if you're depending 
on the package. A lot of projects only produce one artifact for public 
consumption.


Re: A brief survey of build tools, focused on D

2018-12-15 Thread Paul Backus via Digitalmars-d-announce

On Wednesday, 12 December 2018 at 22:41:50 UTC, H. S. Teoh wrote:
It's time we came back to the essentials.  Current monolithic 
build systems ought to be split into two parts:


(1) Dependency detector / DAG generator.  Do whatever you need 
to do here: dub-style scanning of .d imports, scan directories 
for .d files, tup-style instrumenting of the compiler, type it 
out yourself, whatever. The resulting DAG is stored in a 
standard format in a standard location in the source tree.


(2) Build executor: read in a standard DAG and employ a 
standard topological walk to transform inputs into outputs.


You're missing (0) the package manager, which is probably the 
biggest advantage "monolothic" build tools like dub, cargo, and 
npm have compared to language-agnostic ones like make.


Granted, there's no reason dub couldn't function solely as a 
package manager and DAG generator, while leaving the actual build 
execution to some other tool.


Every project should publish the DAG in a standard format in a 
standard location.


You mean a Makefile? :^)

Now of course, in a real-life implementation, there will be many 
more details that need to be taken care of.  But these are the 
essentials: standard DAG representation, and a standard DAG 
import function.


There's something important you're glossing over here, which is 
that, in the general case, there's no single obvious or natural 
way to compose two DAGs together.


For example: suppose project A's DAG has two "output" vertices 
(i.e., they have no outgoing edges), one corresponding to a 
"debug" build and one corresponding to a "release" build. Now 
suppose project B would like to depend on project A. For this to 
happen, our hypothetical DAG import function needs to add one or 
more edges that connect A's DAG to B's DAG. The question is, how 
many edges, and which vertices should these edges connect?
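
Sketched out (an editorial illustration of the situation just 
described), the import has to guess where the missing edges go:

    A's DAG:  ... --> A.debug    (output vertex)
              ... --> A.release  (output vertex)
                           ?
    B's DAG:  A.??? --> B.compile --> ... --> B.link

Nothing in the two DAGs themselves says whether B should hang off 
A.debug, A.release, or both.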


If we have out-of-band knowledge about A and B--for example, if 
we know they're both dub packages--then this is a relatively 
straightforward question to answer. (Though not completely 
trivial; see for example how dub handles the -unittest flag.) But 
if A and B can be absolutely *any* kind of project, written in 
any language, using any build tools, and able to produce any 
number of "outputs," there's no way to guarantee you've wired up 
the DAGs correctly short of doing it by hand.


One way to overcome this problem is to restrict projects to the 
subset of DAGs that have only one "output" vertex--but then, of 
course, you have to say goodbye to convenient debug builds, test 
builds, cross compiling, etc. So the cure may be worse than the 
disease.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Neia Neutuladh via Digitalmars-d-announce

On Wednesday, 12 December 2018 at 22:41:50 UTC, H. S. Teoh wrote:
And here is the crux of my rant about build systems (earlier in 
this thread).  There is no *technical reason* why build systems 
should be constricted in this way. Today's landscape of 
specific projects being inextricably tied to a specific build 
system is completely the wrong approach.


You could reduce all this language-specific stuff to a way to 
generate a description of what needs to be built and what 
programs are suggested for doing it. This is quite a layer of 
indirection, and that means more work. "I can do less work" is a 
technical reason.


Ensuring that your output is widely usable is also extra work.

There is also a psychological reason: when you're trying to solve 
a set of problems and you are good at code, it's easy to tunnel 
vision into writing all the code yourself. It can even, 
sometimes, be easier to write that new code than to figure out 
how to use something that already exists (if you think you can 
gloss over a lot of edge cases or support a lot fewer pieces, for 
instance).


This is probably why Dub has its own repository instead of using 
Maven.


Seriously, building a lousy software project is essentially 
traversing a DAG of inputs and actions in topological order.  
The algorithms have been known for decades, if not longer, and 
there is absolutely no valid reason why we cannot import 
arbitrary sub-DAGs and glue them to the main DAG, and have 
everything work with no additional effort, regardless of where 
said sub-DAGs came from.  It's just a bunch of nodes and 
labelled edges, guys!  All the rest of the complications and 
build system dependencies and walled gardens are extraneous and 
completely unnecessary baggage imposed upon a straightforward 
DAG topological walk that any CS grad could write in less than 
a day.  It's ridiculous.


If any CS grad student could write it in a day, it's easy to 
conclude that a generic DAG library isn't useful or interesting 
enough to pull out into a separate software project -- and that's 
a psychological barrier.


Re: A brief survey of build tools, focused on D

2018-12-12 Thread H. S. Teoh via Digitalmars-d-announce
On Wed, Dec 12, 2018 at 02:52:09PM -0700, Jonathan M Davis via 
Digitalmars-d-announce wrote:
[...]
> I would think that to be fully flexible, dub would need to abstract
> things a bit more, maybe effectively using a plugin system for builds
> so that it's possible to have a dub project that uses dub for pulling
> in dependencies but which can use whatever build system works best for
> your project (with the current dub build system being the default).
> But of course, even if that is made to work well, it then introduces
> the problem of random dub projects then needing 3rd party build
> systems that you may or may not have (which is one of the things that
> dub's current build system mostly avoids).

And here is the crux of my rant about build systems (earlier in this
thread).  There is no *technical reason* why build systems should be
constricted in this way. Today's landscape of specific projects being
inextricably tied to a specific build system is completely the wrong
approach.

Projects should not be tied to a specific build system.  Instead,
whatever build tool the author uses to build the project should export a
universal description of how to build it, in a standard format that can
be imported by any other build system. This description should be a
fully general DAG that specifies all inputs, all outputs (including
intermediate ones), and the actions required to get from input to
output.
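
As a sketch only -- no such standard exists, and the format here is 
invented -- the exported description might be as simple as a list of 
vertices, with edges implied by matching outputs to inputs:

    {
      "nodes": [
        { "inputs": ["gen.d"],             "outputs": ["gen"],
          "action": "dmd gen.d -of=gen" },
        { "inputs": ["gen", "data.txt"],   "outputs": ["table.d"],
          "action": "./gen data.txt > table.d" },
        { "inputs": ["main.d", "table.d"], "outputs": ["app"],
          "action": "dmd main.d table.d -of=app" }
      ]
    }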

Armed with this build description, any build system should be able to
import as a dependency any project built with any other build system,
and be able to successfully build said dependency without even knowing
what build system was originally used to build it or what build system
it is "intended" to be built with.  I should be able to import a Gradle
project, a dub project, and an SCons project as dependencies, and be
able to use make to build everything. And my downstream users ought to
be able to build my project with tup, or any other build tool they
choose, without needing to care that I used make to build my project.

Seriously, building a lousy software project is essentially traversing a
DAG of inputs and actions in topological order.  The algorithms have
been known for decades, if not longer, and there is absolutely no
valid reason why we cannot import arbitrary sub-DAGs and glue them to the
main DAG, and have everything work with no additional effort, regardless
of where said sub-DAGs came from.  It's just a bunch of nodes and
labelled edges, guys!  All the rest of the complications and build
system dependencies and walled gardens are extraneous and completely
unnecessary baggage imposed upon a straightforward DAG topological walk
that any CS grad could write in less than a day.  It's ridiculous.
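
The walk itself really is small. A minimal sketch in D (the node and 
action types are invented for illustration; a real executor would add 
cycle detection, timestamps, and parallelism):

    import std.process : spawnShell, wait;

    struct Node { string[] deps; string action; }

    // Depth-first post-order: build a node's dependencies, then the node.
    // Assumes the graph is acyclic.
    void build(Node[string] dag, string id, ref bool[string] done)
    {
        if (id in done) return;
        done[id] = true;
        foreach (dep; dag[id].deps)
            build(dag, dep, done);
        if (dag[id].action.length)
            wait(spawnShell(dag[id].action));
    }

    void main()
    {
        auto dag = [
            "tool.o": Node([], "echo compiling tool.o"),
            "app":    Node(["tool.o"], "echo linking app"),
        ];
        bool[string] done;
        build(dag, "app", done);
    }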


> On some level, dub is able to do as well as it does precisely because
> it's able to assume a bunch of stuff about D projects which is true
> the vast majority of the time, and the more it allows projects that
> don't work that way, the worse dub is going to work as a general tool,
> because it increasingly opens up problems with regards to whether you
> have the right tools or environment to build a particular project when
> using it as a dependency. However, if we don't figure out how to make
> it more flexible, then certain classes of projects really aren't going
> to work well with dub.  That's less of a problem if the project is not
> for a library (and thus does not need to be a dub package so that
> other packages can pull it in as a dependency) and if dub provides a
> good way to just make libraries available as dependencies rather than
> requiring that the ultimate target be built with dub, but even then, it
> doesn't solve the problem when the target _is_ a library (e.g. what if
> it were for wrapping a C or C++ library and needed to do a bunch of
> extra code steps for code generation and needed multiple build steps).

Well exactly, again, the monolithic approach to building software is the
wrong approach, and leads to arbitrary and needless limitations of this
sort.  DAG generation should be decoupled from build execution.  You can
use whatever tool or fancy algorithm you want to generate the lousy DAG,
but once generated, all you have to do is to export it in a standard
format, then any arbitrary number of build executors can read the
description and run it.

Again I say: projects should not be bound to this or that build system.
Instead, they should export a universal build description in a standard
format.  Whoever wants to depend on said projects can simply import the
build description and it will Just Work(tm). The build executor will
know exactly how to build the dependency independently of whatever fancy
tooling the upstream author may have used to generate the DAG.


> So, I don't know. Ultimately, what this seems to come down to is that
> all of the stuff that dub does to make things simple for the common
> case makes it terrible for complex cases, but making it work well for
> 

Re: A brief survey of build tools, focused on D

2018-12-12 Thread Jonathan M Davis via Digitalmars-d-announce
On Wednesday, December 12, 2018 1:33:39 PM MST H. S. Teoh via Digitalmars-d-
announce wrote:
> On Wed, Dec 12, 2018 at 10:38:55AM +0100, Sönke Ludwig via Digitalmars-d-
announce wrote:
> > On 11.12.2018 at 20:46, H. S. Teoh wrote:
> > > Does dub support the following scenario?
>
> [...]
>
> > This will currently realistically require invoking an external tool
> > such as make through a pre/post-build command (although it may
> > actually be possible to hack this together using sub packages, build
> > commands, and string import paths for the file dependencies). Most
> > notably, there is a directive missing to specify arbitrary files as
> > build dependencies.
>
> I see.  I think this is a basic limitation of dub's design -- it assumes
> a certain (common) compilation model of sources to (single) executable,
> and everything else is only expressible in terms of larger abstractions
> like subpackages.  It doesn't really match the way I work, which I guess
> explains my continuing frustration with using it.  I think of my build
> processes as a general graph of arbitrary input files being converted by
> arbitrary operations (not just compilation) into arbitrary output files.
> When I'm unable to express this in a simple way in my build spec, or
> when I'm forced to use tedious workarounds to express what in my mind
> ought to be something very simple, it distracts me from my focusing on
> my problem domain, and results in a lot of lost time/energy and
> frustration.

What you're describing sounds like it would require a lot of extra machinery
in comparison to how dub is designed to work. dub solves the typical use
case of building a single executable or library (which is what the vast
majority of projects do), and it removes the need to specify much of
anything to make that work, making it fantastic for the typical use case but
causing problems for any use cases that have more complicated needs. I
really don't see how doing much of anything other than building a single
executable or library from a dub project is going to result in anything
other than frustration from the tool even if you can make it work. By the
very nature of what you'd be trying to do, you'd be constantly trying to
work around how dub is designed to work. dub can do more thanks to
subprojects and some of the extra facilities it has for running stuff before
or after the build, but all of that sort of stuff has to work around dub's
core design, making it generally awkward to use, whereas to do something
more complex, at some point, what you really want is basically a build
script (albeit maybe with some extra facilities to properly detect whether
certain phases of the build can be skipped).

I would think that to be fully flexible, dub would need to abstract things a
bit more, maybe effectively using a plugin system for builds so that it's
possible to have a dub project that uses dub for pulling in dependencies but
which can use whatever build system works best for your project (with the
current dub build system being the default). But of course, even if that is
made to work well, it then introduces the problem of random dub projects
then needing 3rd party build systems that you may or may not have (which is
one of the things that dub's current build system mostly avoids).

On some level, dub is able to do as well as it does precisely because it's
able to assume a bunch of stuff about D projects which is true the vast
majority of the time, and the more it allows projects that don't work that
way, the worse dub is going to work as a general tool, because it
increasingly opens up problems with regards to whether you have the right
tools or environment to build a particular project when using it as a
dependency. However, if we don't figure out how to make it more flexible,
then certain classes of projects really aren't going to work well with dub.
That's less of a problem if the project is not for a library (and thus does
not need to be a dub package so that other packages can pull it in as a
dependency) and if dub provides a good way to just make libraries available
as dependencies rather than requiring that the ultimate target be built with
dub, but even then, it doesn't solve the problem when the target _is_ a
library (e.g. what if it were for wrapping a C or C++ library and needed to
do a bunch of extra code steps for code generation and needed multiple build
steps).

So, I don't know. Ultimately, what this seems to come down to is that all of
the stuff that dub does to make things simple for the common case makes it
terrible for complex cases, but making it work well for complex cases would
almost certainly make it _far_ worse for the common case. So, I don't know
that we really want to be drastically changing how dub works, but I do think
that we need to make it so that more is possible with it (even if it's more
painful, because it's doing something that goes against the typical use
case).

The most obvious thing that I can think of is 

Re: A brief survey of build tools, focused on D

2018-12-12 Thread H. S. Teoh via Digitalmars-d-announce
On Wed, Dec 12, 2018 at 10:38:55AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:
> On 11.12.2018 at 20:46, H. S. Teoh wrote:
> > [...]
> > Wait, what does --parallel do if it doesn't compile multiple files
> > at once?
> 
> It currently only works when building with `--build-mode=singleFile`,
> so it compiles individual files in parallel instead of compiling
> chunks of files in parallel, which would be the ideal.

Ah, I see.  But that should be relatively easy to fix, right?


[...]
> There are the three directives sourcePaths, sourceFiles and
> excludedSourceFiles (the latter two supporting wildcard expressions)
> to control the list of files. Once an explicit sourcePaths directive
> is given, the folder that is possibly detected by default
> ("source"/"src") is also skipped. They are documented in the package
> format specs ([1], [2]).

Thanks for the info.


> > Also, you refer to "the output binary". Does that mean I cannot
> > generate multiple executables? 'cos that's a showstopper for me.
> 
> Compiling multiple executables currently either requires multiple
> invocations (e.g. with different configurations or sub packages
> specified), or a targetType "none" package that has one dependency per
> executable - the same configuration/architecture applies to all of
> them in that case. If they are actually build dependencies, a possible
> approach is to invoke dub recursively inside of a preBuildCommand.

Unfortunately, that is not a practical solution for me.  Many of my
projects have source files that are generated by utilities that are
themselves D code that needs to be compiled (and run) as part of the
build.  I suppose in theory I could separate them into subpackages, and
factor out the common code shared between these utilities and the main
executable(s), but that is far too much work for something that IMO
ought to be very simple -- since most of the utilities are single-file
drivers with a small number of imports of some shared modules. Creating
entire subpackages for each of them just seems excessive, esp. during
development where the set of utilities / generated files may change a
lot.  Creating/deleting a subpackage every time is just too much work
for little benefit.

Also, does dub correctly support the case where some .d files are
generated by said utilities (which would be dub subpackages, if we
hypothetically went with that setup), but the output may change
depending on the contents of some input data/config files? I.e., if I
change a data file and run dub again, it ought to re-run the codegen
tool and then recompile the main executable that contains the changed
code.  This is a pretty important use-case for me, since it's kinda the
whole point of having a codegen tool.

Compiling the same set of sources for multiple archs (with each arch
possibly entailing a separate list of source files) is kinda a special
case for my current Android project; generally I don't really need
support for this. But solid support for codegen that properly percolates
changes from input data down to recompiling executables is a must-have for
me.  Not being able to do this in the most efficient way possible would
greatly hamper my productivity.


> But what I meant is that there is for example currently no way to
> customize the output binary base name ("targetName") and directory
> ("targetPath") depending on the build type.

But this shouldn't be difficult to support, right?  Though I don't
particularly need this feature -- for the time being.


[...]
> > Does dub support the following scenario?
[...]
> This will currently realistically require invoking an external tool
> such as make through a pre/post-build command (although it may
> actually be possible to hack this together using sub packages, build
> commands, and string import paths for the file dependencies). Most
> notably, there is a directive missing to specify arbitrary files as
> build dependencies.

I see.  I think this is a basic limitation of dub's design -- it assumes
a certain (common) compilation model of sources to (single) executable,
and everything else is only expressible in terms of larger abstractions
like subpackages.  It doesn't really match the way I work, which I guess
explains my continuing frustration with using it.  I think of my build
processes as a general graph of arbitrary input files being converted by
arbitrary operations (not just compilation) into arbitrary output files.
When I'm unable to express this in a simple way in my build spec, or
when I'm forced to use tedious workarounds to express what in my mind
ought to be something very simple, it distracts me from my focusing on
my problem domain, and results in a lot of lost time/energy and
frustration.


[...]
> BTW, my plan for the Android part of this was to add support for
> plugins (fetchable from the registry, see [3] for a draft) that handle
> the details in a centralized manner instead of having to put that
> knowledge into the build recipe of each 

Re: A brief survey of build tools, focused on D

2018-12-12 Thread Andre Pany via Digitalmars-d-announce
On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig 
wrote:
Most notably, there is a directive missing to specify arbitrary 
files as build dependencies.


I am working on a pull request:
https://github.com/andre2007/dub/commit/97161fb352dc1237411e2e7010447f8a9e817d48

The implementation itself is finished; only tests are missing.

Kind regards
André


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Sönke Ludwig via Digitalmars-d-announce

On 12.12.2018 at 15:53, Atila Neves wrote:

On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig wrote:

On 11.12.2018 at 20:46, H. S. Teoh wrote:
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:

[...]


The main open point right now AFAICS is to make --parallel work with
the multiple-files-at-once build modes for machines that have enough
RAM. This is rather simple, but someone has to do it. But apart from
that, I think that the current state is relatively fine from a
performance point of view.


Wait, what does --parallel do if it doesn't compile multiple files at
once?


It currently only works when building with `--build-mode=singleFile`, 
so it compiles individual files in parallel instead of compiling 
chunks of files in parallel, which would be the ideal.


If by "the ideal" you mean "compile the fastest", then you don't want to 
compile single files in parallel. I measured across multiple projects, 
and compiling per package (in the D sense, not the dub one) was fastest. 
Which is why it's the default with reggae.




The sentence was ambiguous, but that's what I meant!


Re: A brief survey of build tools, focused on D

2018-12-12 Thread Atila Neves via Digitalmars-d-announce
On Wednesday, 12 December 2018 at 09:38:55 UTC, Sönke Ludwig 
wrote:

On 11.12.2018 at 20:46, H. S. Teoh wrote:
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:

[...]

The main open point right now AFAICS is to make --parallel work 
with the multiple-files-at-once build modes for machines that 
have enough RAM. This is rather simple, but someone has to do 
it. But apart from that, I think that the current state is 
relatively fine from a performance point of view.


Wait, what does --parallel do if it doesn't compile multiple 
files at once?


It currently only works when building with 
`--build-mode=singleFile`, so it compiles individual files in 
parallel instead of compiling chunks of files in parallel, which 
would be the ideal.


If by "the ideal" you mean "compile the fastest", then you don't 
want to compile single files in parallel. I measured across 
multiple projects, and compiling per package (in the D sense, not 
the dub one) was fastest. Which is why it's the default with 
reggae.




Re: A brief survey of build tools, focused on D

2018-12-12 Thread Sönke Ludwig via Digitalmars-d-announce

On 11.12.2018 at 20:46, H. S. Teoh wrote:

On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:
[...]


The main open point right now AFAICS is to make --parallel work with
the multiple-files-at-once build modes for machines that have enough
RAM. This is rather simple, but someone has to do it. But apart from
that, I think that the current state is relatively fine from a
performance point of view.


Wait, what does --parallel do if it doesn't compile multiple files at
once?


It currently only works when building with `--build-mode=singleFile`, so 
it compiles individual files in parallel instead of compiling chunks of 
files in parallel, which would be the ideal.

Then it requires a specific source layout, with incomplete /
non-existent configuration options for alternatives.  Which makes it
unusable for existing code bases.  Unacceptable.


You can define arbitrary import/source directories and list (or
delist) source files individually if you want. There are restrictions
on the naming of the output binary, though, is that what you mean?


Is this documented? I couldn't find any info on it the last time I
looked.


There are the three directives sourcePaths, sourceFiles and 
excludedSourceFiles (the latter two supporting wildcard expressions) to 
control the list of files. Once an explicit sourcePaths directive is 
given, the folder that is possibly detected by default ("source"/"src") 
is also skipped. They are documented in the package format specs ([1], [2]).
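
In dub.sdl form, for example (paths made up):

    sourcePaths "engine/src" "tools/common"
    sourceFiles "generated/version_info.d"
    excludedSourceFiles "engine/src/experimental/*.d"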




Also, you refer to "the output binary". Does that mean I cannot
generate multiple executables? 'cos that's a showstopper for me.


Compiling multiple executables currently either requires multiple 
invocations (e.g. with different configurations or sub packages 
specified), or a targetType "none" package that has one dependency per 
executable - the same configuration/architecture applies to all of them 
in that case. If they are actually build dependencies, a possible 
approach is to invoke dub recursively inside of a preBuildCommand.
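
Such a targetType "none" umbrella might look like this in dub.sdl 
(package names invented):

    name "tools"
    targetType "none"
    dependency "tools:codegen" version="*"
    dependency "tools:viewer" version="*"
    subPackage "./codegen/"
    subPackage "./viewer/"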


But what I meant is that there is for example currently no way to 
customize the output binary base name ("targetName") and directory 
("targetPath") depending on the build type.



Worst of all, it does not support custom build actions, which is a
requirement for many of my projects.  It does not support polyglot
projects. It either does not support explicit control over exact
build commands, or any such support is so poorly documented it might
as well not exist.  This is not only unacceptable, it is a
show-stopper.


Do you mean modifying the compiler invocations that DUB generates or
adding custom commands (aka pre/post build/generate commands)?


Does dub support the following scenario?

- There's a bunch of .java files that have to be compiled with javac.
  - But some of the .java files are generated by an external tool, that
    must be run first, before the .java files are compiled.
- There's a bunch of .d files in two directories.
  - The second directory contains .d files that need to be compiled
    into multiple executables, and they must be compiled with a local
    (i.e., non-cross) compiler.
  - Some of the resulting executables must be run first in order to
    generate a few .d files in the first directory (in addition to
    what's already there).
  - After the .d files are generated, the first directory needs to be
    compiled TWICE: once with a cross-compiler (LDC, targeting
    Arm/Android), once with the local D compiler. The first compilation
    must link with cross-compilation Android runtime libraries, and the
    second compilation must link with local X11 libraries.
    - (And obviously, the build products must be put in separate
      subdirectories to prevent stomping over each other.)
- After the .java and .d files are compiled, a series of tools must be
  invoked to generate an .apk file, which also includes a bunch of
  non-code files in resource subdirectories.  Then, another tool must
  be run to align and sign the .apk file.

And here's a critical requirement: any time a file is changed (it can be
a .java file, a .d file, or one of the resources that they depend on),
all affected build products must be correctly updated. This must be done
as efficiently as possible, because it's part of my code-compile-test
cycle, and if it requires more than a few seconds or recompiling the
entire codebase, it's a no-go.

If dub can handle this, then I'm suitably impressed, and retract most of
my criticisms against it. ;-)


This will currently realistically require invoking an external tool such 
as make through a pre/post-build command (although it may actually be 
possible to hack this together using sub packages, build commands, and 
string import paths for the file dependencies). Most notably, there is a 
directive missing to specify arbitrary files as build dependencies.


Another feature that should be there 

Re: A brief survey of build tools, focused on D

2018-12-11 Thread H. S. Teoh via Digitalmars-d-announce
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via 
Digitalmars-d-announce wrote:
[...]
> The upgrade check has been disabled in one of the latest releases, so
> unless the dependencies haven't been resolved before, it will not
> access the network anymore. A notable exception are single-file
> packages, which don't have a dub.selections.json - we should probably
> do something about this, too, at some point.
> 
> I also rewrote the dependency resolution a while ago, and it
> usually is not noticeable anymore nowadays.
> 
> Then there was an issue where LDC was invoked far too frequently to
> determine whether it outputs COFF files or not, making it look like
> scanning the file system for changes took unacceptably long. This has
> also been fixed.

This is very encouraging to hear.  Thanks!


> The main open point right now AFAICS is to make --parallel work with
> the multiple-files-at-once build modes for machines that have enough
> RAM. This is rather simple, but someone has to do it. But apart from
> that, I think that the current state is relatively fine from a
> performance point of view.

Wait, what does --parallel do if it doesn't compile multiple files at
once?


> > Then it requires a specific source layout, with incomplete /
> > non-existent configuration options for alternatives.  Which makes it
> > unusable for existing code bases.  Unacceptable.
> 
> You can define arbitrary import/source directories and list (or
> delist) source files individually if you want. There are restrictions
> on the naming of the output binary, though, is that what you mean?

Is this documented? I couldn't find any info on it the last time I
looked.

Also, you refer to "the output binary". Does that mean I cannot
generate multiple executables? 'cos that's a showstopper for me.


> > Worst of all, it does not support custom build actions, which is a
> > requirement for many of my projects.  It does not support polyglot
> > projects. It either does not support explicit control over exact
> > build commands, or any such support is so poorly documented it might
> > as well not exist.  This is not only unacceptable, it is a
> > show-stopper.
> 
> Do you mean modifying the compiler invocations that DUB generates or
> adding custom commands (aka pre/post build/generate commands)?

Does dub support the following scenario?

- There's a bunch of .java files that have to be compiled with javac.
  - But some of the .java files are generated by an external tool, that
    must be run first, before the .java files are compiled.
- There's a bunch of .d files in two directories.
  - The second directory contains .d files that need to be compiled
    into multiple executables, and they must be compiled with a local
    (i.e., non-cross) compiler.
  - Some of the resulting executables must be run first in order to
    generate a few .d files in the first directory (in addition to
    what's already there).
  - After the .d files are generated, the first directory needs to be
    compiled TWICE: once with a cross-compiler (LDC, targeting
    Arm/Android), once with the local D compiler. The first compilation
    must link with cross-compilation Android runtime libraries, and the
    second compilation must link with local X11 libraries.
    - (And obviously, the build products must be put in separate
      subdirectories to prevent stomping over each other.)
- After the .java and .d files are compiled, a series of tools must be
  invoked to generate an .apk file, which also includes a bunch of
  non-code files in resource subdirectories.  Then, another tool must
  be run to align and sign the .apk file.

And here's a critical requirement: any time a file is changed (it can be
a .java file, a .d file, or one of the resources that they depend on),
all affected build products must be correctly updated. This must be done
as efficiently as possible, because it's part of my code-compile-test
cycle, and if it requires more than a few seconds or recompiling the
entire codebase, it's a no-go.

If dub can handle this, then I'm suitably impressed, and retract most of
my criticisms against it. ;-)


T

-- 
Study gravitation, it's a field with a lot of potential.


Re: A brief survey of build tools, focused on D

2018-12-11 Thread Steven Schveighoffer via Digitalmars-d-announce

On 12/11/18 12:39 PM, H. S. Teoh wrote:

On Tue, Dec 11, 2018 at 09:58:39AM +0000, Atila Neves via 
Digitalmars-d-announce wrote:

On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:

[...]

In typical D code, it's usually faster to compile per package than
either all-at-once or per module. Which is why it's the default in
reggae.


Yeah, for projects past a certain size, compiling per package makes the
most sense.


[...]

 From discussions on IRC about reducing compile times, though, using
Phobos is a good way to get slow compilation, and I use Phobos. That
alone means incremental builds are likely to go long.


Yes. Especially with -unittest.


We've talked about this before.  Jonathan Marler actually ran a test and
discovered that it wasn't something *directly* to do with unittests; the
performance hit was coming from some unexpected interactions with the
way the compiler instantiates templates when -unittest is enabled.  I
don't remember what the conclusion was, though.


I remember:

1. When unittests are enabled, -allinst is enabled as well.
2. This means that all templates instantiated are included as if they 
were part of the local module.
3. This means that they are semantically analyzed, and if they import 
anything, all those imports are processed as well

4. Recurse on step 2.

Note that the reason -allinst is used is that sometimes templates 
compile differently when unittests are enabled. In other words, you 
might for instance get a different struct layout when unittests are 
enabled -- this prevents that (but only for templates, of course).
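
A tiny example of that kind of divergence (illustrative only):

    struct Cache(T)
    {
        T value;
        version (unittest) size_t probeCount;  // exists only with -unittest
    }

If a library instantiates Cache!int without -unittest while the 
importing build instantiates it with -unittest, the two sides 
disagree about the struct's layout.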


The ultimate reason the PR (which removed the -allinst flag for 
unittests) failed was differences in compiler flags for different 
modules during unittests in Phobos. This caused symbol-name mangling 
changes (IIRC, mostly surrounding dip1000 problems).


I really wish we could have followed through on that PR...

-Steve


Re: A brief survey of build tools, focused on D

2018-12-11 Thread H. S. Teoh via Digitalmars-d-announce
On Tue, Dec 11, 2018 at 09:54:06AM +0000, Atila Neves via 
Digitalmars-d-announce wrote:
[...]
> No reggae? https://github.com/atilaneves/reggae/

I recently finally sat down and took a look at Button, posted here a few
years ago.  It looked pretty good.  One of these days I really need to
sit down and take a good look at reggae.


> dub is simple and has dependency management, and that's about it.
> Speed?  It's as slow as molasses and hits the network every time
> unless explicitly told not to. Never mind if there's already a
> dub.selections.json file and all of the dependencies are fetched
> (which it could check but doesn't).

According to Sönke's post elsewhere in this thread, these performance
issues have been addressed in the latest version.  I haven't tried it
out to verify that yet, though.


> Trying to do anything non-trivial in dub is an exercise in frustration.
> The problem is that it's the de facto D package manager, so as soon as
> you have dependencies you need dub whether you want to or not.

After fighting with dub for 2 days (or was it a week? it certainly felt
longer :-P) in my vibe.d project, I ended up just creating an empty
dummy project in a subdirectory that declares a dependency on vibe.d,
and run dub separately to fetch and build vibe.d, then I ignore the rest
of the dummy project and go back to the real project root and have SCons
build the real executable for me.  So far, that has worked reasonably
well, besides the occasional annoyance of having to re-run dub to update
to the latest vibe.d packages.


> dub works great if you're writing an executable with some dependencies
> and hardly any other needs. After that...

Yeah.  Being unable to handle generated source files is a showstopper
for many of my projects.  As Neia said, while D has some very nice
compile-time codegen features, sometimes you really just need to write
an external utility that generates source code.

For example, one of my current projects involves parsing GLSL source
files and generating D wrapper code as syntactic sugar for calls to
glUniform* and glAttrib* (so that I can just say `myshader.color =
Vector(1, 2, 3);` instead of manually calling glUniform* with fiddly,
error-prone byte offsets).  While in theory I could use string imports
and CTFE to do this, it's far less hairy to do this as an external step.
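
The generated sugar might look roughly like this (a sketch, not the 
project's actual output; Vector and an OpenGL binding providing 
GLuint, glGetUniformLocation, and glUniform3f are assumed):

    // hypothetical codegen output for "uniform vec3 color;" in the shader
    struct MyShader
    {
        GLuint program;
        @property void color(Vector v)
        {
            auto loc = glGetUniformLocation(program, "color");
            glUniform3f(loc, v.x, v.y, v.z);
        }
    }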

Most build systems with automatic dependency extraction would fail when
given this sort of setup, because they generally depend on scanning
directory contents, but in this case the file may not have been
generated yet (it would not be generated until the D code of the tool
that generates it is first compiled, then run). So the dependency would
be missed, resulting either in intermittent build failure or failure to
recompile dependents when the generated code changes.  It's not so
simple to just do codegen as a special preprocessing step -- such tasks
need to be treated as 1st class dependency tasks and handled natively as
part of DAG resolution, not as something tacked on as an afterthought.


T

-- 
Music critic: "That's an imitation fugue!"


Re: A brief survey of build tools, focused on D

2018-12-11 Thread H. S. Teoh via Digitalmars-d-announce
On Tue, Dec 11, 2018 at 09:58:39AM +0000, Atila Neves via 
Digitalmars-d-announce wrote:
> On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:
[...]
> In typical D code, it's usually faster to compile per package than
> either all-at-once or per module. Which is why it's the default in
> reggae.

Yeah, for projects past a certain size, compiling per package makes the
most sense.


[...]
> > From discussions on IRC about reducing compile times, though, using
> > Phobos is a good way to get slow compilation, and I use Phobos. That
> > alone means incremental builds are likely to go long.
> 
> Yes. Especially with -unittest.

We've talked about this before.  Jonathan Marler actually ran a test and
discovered that it wasn't something *directly* to do with unittests; the
performance hit was coming from some unexpected interactions with the
way the compiler instantiates templates when -unittest is enabled.  I
don't remember what the conclusion was, though.

Either way, the unittest problem needs to be addressed.  I've been
running into problems with compiling my code with -unittest, because it
causes ALL unittests of ALL packages to be compiled, including Phobos
and external libraries.  It's making it very hard to manage exactly what
is unittested -- I want to unittest my *own* code, not any 3rd party
libraries or Phobos, but right now, there's no way to control that.

Recently I ran into a roadblock with -unittest: I have a project with
rather extensive unittests, but it assumes certain things about the
current working directory and the current environment (because those
unittests are run from a special unittest driver). I have that project
as a git submodule in a different project for experimental purposes, but
now I can't compile with -unittest because the former project's
unittests will fail, not being run in the expected environment. :-(

There needs to be a more fine-grained way of controlling which unittests
get compiled.  Generally, I don't see why I should care about unittests
for external dependencies (including Phobos) when what I really want is
to test the *current* project's code.
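
One fine-grained workaround (a known idiom, not something the tooling 
does for you) is to gate a library's own tests behind a custom 
version identifier, so that a downstream build's plain -unittest 
skips them:

    version (MyLibTests) unittest  // compiled only with -version=MyLibTests
    {
        assert(frobnicate(2) == 4);  // frobnicate is a stand-in name
    }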


T

-- 
The two rules of success: 1. Don't tell everything you know. -- YHL


Re: A brief survey of build tools, focused on D

2018-12-11 Thread Sönke Ludwig via Digitalmars-d-announce

On 10.12.2018 at 22:01, H. S. Teoh wrote:

(...)

Convenience and simplicity, sure.  But speed? I'm sorry to say, I tried
dub for 2 days and gave up in frustration because it was making my
builds *several times longer* than a custom SCons script.  I find that
completely unacceptable.

It also requires network access.  On *every* invocation, unless
explicitly turned off.  And even then, it performs time-consuming
dependency resolutions on every invocation, which doubles or triples
incremental build times.  Again, unacceptable.


The upgrade check has been disabled in one of the latest releases, so 
unless the dependencies haven't been resolved before, it will not access 
the network anymore. A notable exception are single-file packages, which 
don't have a dub.selections.json - we should probably do something about 
this, too, at some point.


I also rewrote the dependency resolution a while ago, and it usually 
is not noticeable anymore nowadays.


Then there was an issue where LDC was invoked far too frequently to 
determine whether it outputs COFF files or not, making it look like 
scanning the file system for changes took unacceptably long. This has 
also been fixed.


The main open point right now AFAICS is to make --parallel work with the 
multiple-files-at-once build modes for machines that have enough RAM. 
This is rather simple, but someone has to do it. But apart from that, I 
think that the current state is relatively fine from a performance point 
of view.




Then it requires a specific source layout, with incomplete /
non-existent configuration options for alternatives.  Which makes it
unusable for existing code bases.  Unacceptable.


You can define arbitrary import/source directories and list (or delist) 
source files individually if you want. There are restrictions on the 
naming of the output binary, though, is that what you mean?



Worst of all, it does not support custom build actions, which is a
requirement for many of my projects.  It does not support polyglot
projects. It either does not support explicit control over exact build
commands, or any such support is so poorly documented it might as well
not exist.  This is not only unacceptable, it is a show-stopper.


Do you mean modifying the compiler invocations that DUB generates or 
adding custom commands (aka pre/post build/generate commands)?


Re: A brief survey of build tools, focused on D

2018-12-11 Thread Atila Neves via Digitalmars-d-announce

On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:

On Mon, 10 Dec 2018 21:53:40 +0000, GoaLitiuM wrote:
The results for touching the second file seem like an anomaly 
to me,


The generated ninja file had one rule per source file. If your 
modules tend to import each other a lot, or if they 
transitively import the code that's doing expensive stuff, then 
one rule per source file is bad. If your modules have few 
transitive dependencies and they're each fast to compile, one 
rule per source file is good.


In typical D code, it's usually faster to compile per package 
than either all-at-once or per module. Which is why it's the 
default in reggae.


My project used Pegged, and a lot of stuff referenced the 
grammar. That meant incremental builds went long and it would 
have been better to build the whole project at once.


Separating the grammar into a different build would reduce 
compile times significantly, and that might make incremental 
builds fast.


Using Pegged basically requires a dub subpackage with the grammar 
to retain one's sanity.


From discussions on IRC about reducing compile times, though, 
using Phobos is a good way to get slow compilation, and I use 
Phobos. That alone means incremental builds are likely to go 
long.


Yes. Especially with -unittest.



Re: A brief survey of build tools, focused on D

2018-12-11 Thread Atila Neves via Digitalmars-d-announce

On Monday, 10 December 2018 at 18:27:48 UTC, Neia Neutuladh wrote:
I wrote a post about language-agnostic (or, more accurately, 
cross-language) build tools, primarily using D as an example 
and Dub as a benchmark.


Spoiler: dub wins in speed, simplicity, dependency management, 
and actually working without modifying the tool's source code.


https://blog.ikeran.org/?p=339


No reggae? https://github.com/atilaneves/reggae/

dub is simple and has dependency management, and that's about it. 
Speed? It's as slow as molasses and hits the network every time 
unless explicitly told not to. Never mind if there's already a 
dub.selections.json file and all of the dependencies are fetched 
(which it could check but doesn't).


Trying to do anything non-trivial in dub is an exercise in 
frustration. The problem is that it's the de facto D package 
manager, so as soon as you have dependencies you need dub whether 
you want to or not.


dub works great if you're writing an executable with some 
dependencies and hardly any other needs. After that...


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Russel Winder via Digitalmars-d-announce
On Mon, 2018-12-10 at 13:01 -0800, H. S. Teoh via Digitalmars-d-announce
wrote:
> 
[…]
> Wow.  Thanks for the writeup that convinces me that I don't need to
> waste time looking at Meson/Ninja.
[…]

The article is a personal opinion and that is fine. For me it is wrong. No
mention of SCons, nor that Gradle builds C++ as well as JVM languages.
Some of the points about Meson are right, some wrong, but it is a personal
opinion and that is fine. 

I shall continue to use Meson and Ninja because they are way, way better than
Autotools (not mentioned but still used a lot) and better than SCons for many
use cases. But this is also a personal opinion.

-- 
Russel.
===
Dr Russel Winder  t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk





Re: A brief survey of build tools, focused on D

2018-12-10 Thread Manu via Digitalmars-d-announce
On Mon, Dec 10, 2018 at 10:30 AM Neia Neutuladh via
Digitalmars-d-announce  wrote:
>
> I wrote a post about language-agnostic (or, more accurately, cross-
> language) build tools, primarily using D as an example and Dub as a
> benchmark.
>
> Spoiler: dub wins in speed, simplicity, dependency management, and
> actually working without modifying the tool's source code.
>
> https://blog.ikeran.org/?p=339

Why isn't premake on the list? It's the only build tool that works
reasonably well with IDEs, and it has supported D well for almost
6-7 years.
It also doesn't depend on a horrible runtime language distro.


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Neia Neutuladh via Digitalmars-d-announce
On Mon, 10 Dec 2018 13:01:08 -0800, H. S. Teoh wrote:
> It also requires network access.  On *every* invocation, unless
> explicitly turned off.  And even then, it performs time-consuming
> dependency resolutions on every invocation, which doubles or triples
> incremental build times.  Again, unacceptable.

I feel like those should be configuration options at the very worst. And 
dub probably shouldn't even bother verifying your dependencies if you 
haven't changed dub.json.

> Then it requires a specific source layout, with incomplete /
> non-existent configuration options for alternatives.  Which makes it
> unusable for existing code bases.  Unacceptable.

A lot of people do find it acceptable to have a build tool that makes 
assumptions about your source code layout, but that's certainly not always 
possible or desirable.

> Worst of all, it does not support custom build actions, which is a
> requirement for many of my projects.

Yeah, there's a lot of neat metaprogramming stuff in D (like pegged) 
where, for small projects, it's awesome that it runs as part of 
compilation, but when I'm dealing with a nontrivial instance of it, I 
want to split it into a separate build step. Dub doesn't help me 
accomplish that.

> After so many decades of "advancement", we're still stuck in the
> gratuitously incompatible walled gardens, like the gratuitous browser
> incompatibilities of the pre-W3C days of the Web. And on modern CPUs
> with GHz clock speeds, RAM measured in GBs, and gigabit download speeds,
> building Hello World with a system like dub (or Gradle, for that matter)
> is still just as slow (if not slower!) as running make back in the 90's
on a 4 *MHz* processor.  It's ridiculous.

Solving an NP-complete problem every time you build is not a great start.

> Why can't modern source code come equipped with dependency information
> in a *standard format* that can be understood by *any* build system?

Kythe is an attempt to make the relevant information available in a 
language-agnostic way. Might be a reasonable basis for a standardized 
build system. No clue how well it works or what it actually supports.

https://kythe.io/

> Build systems shouldn't need to reinvent their own gratuitously
> incompatible DSL just to express what's fundamentally the same old
> decades-worn directed graph. And programmers shouldn't need to repeat
> themselves by manually enumerating individual graph edges (like Meson
> apparently does).

Meson doesn't have you enumerate individual graph edges at that level. It 
just doesn't build your project correctly. Change a struct size in one 
file, and you get a host of weird errors when another file uses it.

Maven and Gradle also don't really have a DAG like that. If any file 
changed, your whole project needs to be rebuilt, and all your dependencies 
are immutable. Bazel has a DAG across build rules, not across individual 
files.

> - Efficient: the amount of work done by the build should be proportional
>   to the size of changes made to the source code since the last build,
>   NOT proportional to the size of the entire source tree (SCons fails in
>   this regard).

Would be great if the tool could pay attention to whether incremental 
builds saved time on average and just do a full build if it's better.

> - Language-agnostic: the build system should be essentially a dependency
>   graph resolver. It should be able to compile (possibly via plugins)
>   source code of any language using any given compiler, provided such a
>   combination is at all possible. In fact, at its core, it shouldn't
>   even have the concept of "compilation" at all; it should be able to
>   generate, e.g., .png files from POVRay scene description files, run
>   image post-processing tools on them, then package them into a tarball
>   and upload it to a remote webserver -- all driven by the same
>   underlying DAG.

You could support rsync just fine, but if it's just an HTTP upload, 
there's no standard way to tell if the server's got the file already.


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Neia Neutuladh via Digitalmars-d-announce
On Tue, 11 Dec 2018 02:54:15 +0000, Mike Franklin wrote:
> Why not just write your build/tooling scripts in D?  That's what I
> prefer to do, and there's been a recent effort to do just that for the
> DMD compiler as well:
> https://github.com/dlang/dmd/blob/master/src/build.d  It still resembles
> the makefiles it was modeled from, but in time, I think it will clean up
> nicely.

That's fine for executables that don't depend on external libraries. It's 
not good for libraries that I want other people to use; dub's the easiest 
way to publish a thing. It also means I need to replicate that dependency 
graph logic in every single project, which is worse than replicating it 
once per language. We really should have a standard build tool supporting 
per-language plugins, like H. S. Teoh is recommending.


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Mike Franklin via Digitalmars-d-announce

On Monday, 10 December 2018 at 18:27:48 UTC, Neia Neutuladh wrote:
I wrote a post about language-agnostic (or, more accurately, 
cross-language) build tools, primarily using D as an example 
and Dub as a benchmark.


Spoiler: dub wins in speed, simplicity, dependency management, 
and actually working without modifying the tool's source code.


https://blog.ikeran.org/?p=339


Why not just write your build/tooling scripts in D?  That's what 
I prefer to do, and there's been a recent effort to do just that 
for the DMD compiler as well:  
https://github.com/dlang/dmd/blob/master/src/build.d  It still 
resembles the makefiles it was modeled from, but in time, I think 
it will clean up nicely.
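
The skeleton of such a script can be tiny. A minimal sketch 
(everything below is invented for illustration and far simpler than 
dmd's build.d):

    #!/usr/bin/env rdmd
    import std.file : exists, timeLastModified;
    import std.process : spawnProcess, wait;

    // Rebuild `output` only if it is missing or older than `input`.
    void needs(string output, string input, string[] cmd)
    {
        if (!exists(output) || timeLastModified(input) > timeLastModified(output))
            wait(spawnProcess(cmd));
    }

    void main()
    {
        needs("app", "app.d", ["dmd", "app.d", "-of=app"]);
    }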


Mike


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Neia Neutuladh via Digitalmars-d-announce
On Mon, 10 Dec 2018 21:53:40 +0000, GoaLitiuM wrote:
> The results for touching second file seems like an anomaly to me,

The generated ninja file had one rule per source file. If your modules 
tend to import each other a lot, or if they transitively import the code 
that's doing expensive stuff, then one rule per source file is bad. If 
your modules have few transitive dependencies and they're each fast to 
compile, one rule per source file is good.
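
Schematically, such a generated file has one build edge per module, 
something like (paths invented):

    rule dc
      command = dmd -c $in -of$out

    build obj/app.o:    dc src/app.d
    build obj/parser.o: dc src/parser.d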

My project used Pegged, and a lot of stuff referenced the grammar. That 
meant incremental builds went long and it would have been better to build 
the whole project at once.

Separating the grammar into a different build would reduce compile times 
significantly, and that might make incremental builds fast.
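
With dub, that split can be expressed as a subpackage, roughly like 
this (dub.sdl, names made up), so the grammar gets built and cached 
as its own library:

    name "myproject"
    subPackage "./grammar/"
    dependency "myproject:grammar" version="*"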

From discussions on IRC about reducing compile times, though, using Phobos 
is a good way to get slow compilation, and I use Phobos. That alone means 
incremental builds are likely to go long.

> You also have to make sure the dependencies are built with the same
> compiler, which could explain the headache #3 in your article.

I've been using dmd as my primary compiler for ages, cleared out all the 
cached dub builds I could find, ran `dub build -v` to ensure that it was 
invoking dmd, and explicitly told Meson to use dmd.

Meson was still convinced that I'd built pegged with some other compiler.

> The comparison and some of the other headaches with meson does not seem
> to be fair as you are comparing dub, which is both a build system and a
> package manager, to meson which is only a build system, you have to make
> sure all the dependencies are installed to your system beforehand.

That *would* be a reasonable objection, but Meson explicitly advertises 
that you can use dub dependencies. The two flaws are the extra work 
required and the fact that it's broken. If it had not even pretended to 
support dub dependencies, I could have avoided several of the problems and 
just used git submodules from the start.

Just like I would with Bazel.


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Dennis via Digitalmars-d-announce

On Monday, 10 December 2018 at 21:01:08 UTC, H. S. Teoh wrote:

[SNIP]


Great rant! Do you think dub's current architecture is a lost 
cause or are there some leverage points where it can greatly 
improve? Also, do you have any recommendations? Currently I'm 
using dub because it's the standard, but I wish it were a bit 
faster and had more flexibility.





Re: A brief survey of build tools, focused on D

2018-12-10 Thread GoaLitiuM via Digitalmars-d-announce
I switched away from dub to meson for my small game engine 
project, and the biggest benefit of this switch was the improved 
build times while doing small iterations to some files:


dub build --arch=x86_64 --build=debug --compiler=dmd
- full rebuild: 3960ms
- touch file1 and build: 2780ms
- touch file2 and build: 2810ms

ninja -C build
- full rebuild: 10280ms (includes some dependencies like ErupteD 
and SDL2 bindings)
- touch file1 and build: 1410ms
- touch file2 and build: 250ms

The results for touching the second file seem like an anomaly to me, 
but in practice the incremental build times are about the same as for 
the first file, so that is already a 2x improvement in incremental 
build times. If I touch multiple files, ninja can invoke multiple 
build commands at once, distributing the work across all processor 
cores, so the build time does not grow much beyond that of editing a 
single file (which does not seem to be reflected in your build timing 
results, for some reason?).



But as you mentioned in the article, there are some caveats with 
this, mainly the lack of a dependency graph, which may cause some 
weird bugs in the program if you drastically change one module or 
work with templates. You also have to make sure the dependencies 
are built with the same compiler, which could explain headache #3 
in your article.


The comparison and some of the other headaches with meson do not 
seem fair, as you are comparing dub, which is both a build system 
and a package manager, to meson, which is only a build system; you 
have to make sure all the dependencies are installed on your system 
beforehand. While I agree with headache #1, the other headaches are 
simply due to you not using meson the way it was intended to be used.


Re: A brief survey of build tools, focused on D

2018-12-10 Thread Paolo Invernizzi via Digitalmars-d-announce

On Monday, 10 December 2018 at 21:01:08 UTC, H. S. Teoh wrote:
And almost no build system handles reliable builds correctly 
when the build description is changed -- Button does, but it's 
in the extreme minority, and is still a pretty young project 
that's not widely known).


Tup [1] does, and it's pretty reliable at that: I think Button 
was inspired by it.


Anyway, you are totally right: +1 on all your points!

[1] http://gittup.org/tup/

-- Paolo


Re: A brief survey of build tools, focused on D

2018-12-10 Thread H. S. Teoh via Digitalmars-d-announce
On Mon, Dec 10, 2018 at 06:27:48PM +, Neia Neutuladh via 
Digitalmars-d-announce wrote:
> I wrote a post about language-agnostic (or, more accurately, cross-
> language) build tools, primarily using D as an example and Dub as a
> benchmark.
> 
> Spoiler: dub wins in speed, simplicity, dependency management, and
> actually working without modifying the tool's source code.
> 
> https://blog.ikeran.org/?p=339

Wow.  Thanks for the writeup that convinces me that I don't need to
waste time looking at Meson/Ninja.

I find the current landscape of build systems pretty dismal. Dub may be
simple to use, but speed, seriously?! If *that's* the generally accepted
standard of build speed out there these days, then hope is slim.

Convenience and simplicity, sure.  But speed? I'm sorry to say, I tried
dub for 2 days and gave up in frustration because it was making my
builds *several times longer* than a custom SCons script.  I find that
completely unacceptable.

It also requires network access.  On *every* invocation, unless
explicitly turned off.  And even then, it performs time-consuming
dependency resolutions on every invocation, which doubles or triples
incremental build times.  Again, unacceptable.
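
For reference, the closest I have found to turning all of that off is:

    dub build --nodeps --skip-registry=all

and you have to remember to pass those flags on every single
invocation.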

Then it requires a specific source layout, with incomplete /
non-existent configuration options for alternatives.  Which makes it
unusable for existing code bases.  Unacceptable.

Worst of all, it does not support custom build actions, which is a
requirement for many of my projects.  It does not support polyglot
projects. It either does not support explicit control over exact build
commands, or any such support is so poorly documented it might as well
not exist.  This is not only unacceptable, it is a show-stopper.
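
The closest approximation I am aware of is shelling out around the
build -- not a real build rule, since there is no dependency tracking
and it runs every time (dub.json fragment, commands illustrative):

    "preBuildCommands": [
        "povray -D +Iscene.pov +Oscene.png"
    ],
    "postBuildCommands": [
        "tar czf assets.tar.gz scene.png"
    ]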

This leaves only package management as the only thing about dub that I
could even remotely recommend (and even that is too unconfigurable for
my tastes -- basically, it's a matter of "my way or the highway" --
but I'll give it credit for at least being *usable*, if not very
pleasant).  But given its limitations, it means many of my projects
*cannot* ever be dub projects, because they require multiple language
support and/or code generation rules that are not expressible as a dub
build.  Which means the package management feature is mostly useless as
far as my projects are concerned -- if I ever have a dependency that
requires code generation and/or multiple languages, dub is out of the
question.  So I'm back to square one as far as dependency management and
build system are concerned.

This dismal state of affairs means that if my code ever depends on a dub
package (I do have a vibe.d project that does), I have to use dub as a
secondary tool -- and even there dub is so inflexible that I could not
coax it into working nicely with the rest of my build system.  In my
vibe.d project I had to resort to creating a dummy empty project in a
subdirectory, whose sole purpose is to declare a dependency on vibe.d so
that I can run dub to download and build vibe.d (and generate a dummy
executable that does nothing). Then I have to manually link the vibe.d
build products into my real build system as a separate step.
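
Concretely, the dummy project is nothing more than this dub.json
(version illustrative) next to a source/app.d containing an empty
main():

    {
        "name": "dummy",
        "dependencies": {
            "vibe-d": "~>0.8.4"
        }
    }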

//

Taking a step back, this state of affairs is completely ridiculous. The
various build systems out there are gratuitously incompatible with each
other, and having dependencies that cross build system boundaries is
completely unthinkable, even though at its core, it's exactly the same
miserable old directed acyclic graph, solved by the same old standard
graph algorithms.  Why shouldn't we be able to integrate subgraphs of
different origins into a single, unified dependency graph, with standard
solutions by standard graph algorithms?  Why should build systems be
effectively walled gardens, with artificial barriers that prevent you
from importing a Gradle dependency into a dub project, and importing
*that* into an SCons project, for example?
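
To drive home how un-mysterious this is, here is the entire "import a
foreign subgraph" operation sketched in D (node layout hypothetical) --
the hard part is agreeing on a format, not the algorithm:

    import std.algorithm : map;
    import std.array : array;

    struct Node
    {
        string id;        // e.g. "vibe-d/libvibe-http.a"
        string[] deps;    // ids of prerequisite nodes
        string command;   // how to produce this node's output
    }

    // Merging a dependee's graph is a namespaced union: prefix the
    // foreign ids so they cannot collide, then let the caller add the
    // edges tying its own targets to the imported ones.
    Node[string] importSubgraph(Node[string] mine, Node[string] theirs,
                                string ns)
    {
        auto g = mine.dup;
        foreach (n; theirs)   // iterates values; n is a copy
        {
            n.id = ns ~ ":" ~ n.id;
            n.deps = n.deps.map!(d => ns ~ ":" ~ d).array;
            g[n.id] = n;
        }
        return g;
    }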

After so many decades of "advancement", we're still stuck in the
gratuitously incompatible walled gardens, like the gratuitous browser
incompatibilities of the pre-W3C days of the Web. And on modern CPUs
with GHz clock speeds, RAM measured in GBs, and gigabit download speeds,
building Hello World with a system like dub (or Gradle, for that matter)
is still just as slow (if not slower!) as running make back in the 90's
on a 4 *MHz* processor.  It's ridiculous.

Why can't modern source code come equipped with dependency information
in a *standard format* that can be understood by *any* build system?
Build systems shouldn't need to reinvent their own gratuitously
incompatible DSL just to express what's fundamentally the same old
decades-worn directed graph. And programmers shouldn't need to repeat
themselves by manually enumerating individual graph edges (as Meson
apparently requires). It should be the compilers that generate this
information -- RELIABLY -- in a standard format that can be processed by
any tool that understands the common format.  You should be able to

Re: A brief survey of build tools, focused on D

2018-12-10 Thread Andre Pany via Digitalmars-d-announce

On Monday, 10 December 2018 at 18:27:48 UTC, Neia Neutuladh wrote:
I wrote a post about language-agnostic (or, more accurately, 
cross- language) build tools, primarily using D as an example 
and Dub as a benchmark.


Spoiler: dub wins in speed, simplicity, dependency management, 
and actually working without modifying the tool's source code.


https://blog.ikeran.org/?p=339


Pretty nice post, thanks for sharing it. It may be worth noting 
that dub is also able to retrieve packages from Maven repositories.
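
As far as I know, it works by pointing dub at the Maven repository 
as an additional registry, something along these lines (URL 
hypothetical):

    dub build --registry=mvn+https://example.com/artifactory/dub-packages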


Kind regards
Andre