On 14/06/2011 20:07, Andrei Alexandrescu wrote:
> On 6/14/11 1:22 PM, Robert Clipsham wrote:
>> On 14/06/2011 14:53, Andrei Alexandrescu wrote:
>>> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>>>
>>> Destroy.
>>>
>>> Andrei
>>
>> This doesn't seem like the right solution to the problem - the correct
>> solution, in my opinion, is to have a build tool/package manager handle
>> this, not the compiler.
>>
>> Problems I see:
>>
>> * Remote server gets hacked, everyone using the library now
>> executes malicious code
>
> This liability is no different from a traditional setup.
Perhaps, but with a proper package management tool this can be avoided
with SHA checksums etc.; it can't be avoided with a direct fetch.
Admittedly this line of defense falls if the intermediate server is
hacked.
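
To illustrate, a minimal sketch of the verification step such a tool
could perform - the archive name and checksum here are made up:

    // Sketch: verify a downloaded archive against a checksum published
    // by a (hypothetical) package index before using it.
    import std.digest.sha : sha256Of;
    import std.digest : toHexString;
    import std.file : read;
    import std.stdio : writeln;

    void main()
    {
        // Hypothetical value; a real tool would fetch it from the index.
        enum expectedSha256 =
            "9F86D081884C7D659A2FEAA0C55AD015A3BF4F1B2B0B822CD15D6C15B0F00A08";

        auto data = cast(ubyte[]) read("library-1.0.zip");
        string actual = toHexString(sha256Of(data)).idup;

        if (actual != expectedSha256)
        {
            writeln("checksum mismatch - refusing to build");
            return;
        }
        writeln("checksum verified");
    }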
>> * Remote source changes how it is built, your code suddenly breaks and
>> has to be updated, rather than being handled automatically
>
> This is a deployment issue affecting this approach and any other relying
> on downloading stuff.

It doesn't affect anything if a proper package management/build tool is
in use, as the remote code specifies how it is built, rather than the
local code.
>> * Adds a lot of unnecessary bloat and/or dependency on external modules
>>    + Want to compress source code? dmd now depends on decompression libs
>
> Indeed, I think compression will be commonly requested. The same has
> happened with Java - initially it relied on downloading .class files,
> but jar files were soon to follow.
>
> It's a feature that has been asked for in this forum, independently of
> downloads. A poster implemented a complete rdmd-like program that deals
> with .zip files.
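
For reference, roughly what such .zip handling looks like with Phobos'
std.zip - the archive and output directory names are invented:

    // Sketch: extract D sources from a downloaded .zip archive.
    import std.zip : ZipArchive;
    import std.file : read, write, mkdirRecurse;
    import std.path : buildPath, dirName;
    import std.algorithm.searching : endsWith;
    import std.stdio : writeln;

    void main()
    {
        auto archive = new ZipArchive(read("library-1.0.zip"));

        // archive.directory maps each stored path to its ArchiveMember.
        foreach (name, member; archive.directory)
        {
            if (name.endsWith("/"))
                continue; // skip directory entries

            auto dest = buildPath("imports", name);
            mkdirRecurse(dirName(dest));
            write(dest, archive.expand(member)); // decompress this entry
            writeln("extracted ", dest);
        }
    }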
>>    + Want to use git? dmd now depends on git
>
> Not if the server can serve files, or if you use a different tool.

But then you lose the advantages of using git to get the source at all.
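
If the package tool owns the git step instead, those advantages are
kept without the compiler knowing anything about version control. A
rough sketch, with a made-up URL and destination:

    // Sketch: a fetch step that shells out to git, keeping
    // version-control knowledge in the package tool instead of dmd.
    import std.process : execute;
    import std.file : exists;
    import std.stdio : writeln;

    void main()
    {
        enum url = "https://example.com/some/library.git";
        enum dest = "deps/library";

        // Clone on first fetch, pull afterwards - the incremental
        // updates you'd lose with a plain file download.
        auto cmd = exists(dest)
            ? ["git", "-C", dest, "pull"]
            : ["git", "clone", url, dest];

        auto result = execute(cmd);
        if (result.status != 0)
            writeln("fetch failed:\n", result.output);
    }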
>>    + Remote code uses new compression method that an older dmd doesn't
>> support
>
> If compression handling is needed, dmd can standardize on it just like
> jar files do.
>> * Remote server is down - build takes forever while waiting
>
> So does downloading or building with another tool.

Not so if you get all the source at once rather than depending on
getting it during the build.
>>    + Make dmd time out after a couple of seconds - build fails
>
> So would a build directed by any other tool.
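
A fetch step outside the compiler can at least make the timeout
explicit and fail fast instead of hanging the build. A sketch using
std.net.curl, with a hypothetical URL:

    // Sketch: bound the download step with explicit timeouts, so a
    // dead server fails the fetch quickly.
    import std.net.curl : HTTP, download;
    import core.time : seconds;
    import std.stdio : writeln;

    void main()
    {
        auto conn = HTTP();
        conn.connectTimeout = 5.seconds;     // give up connecting after 5s
        conn.operationTimeout = 30.seconds;  // cap the whole transfer

        try
        {
            download("http://example.com/library-1.0.zip",
                "library-1.0.zip", conn);
        }
        catch (Exception e)
        {
            writeln("download failed: ", e.msg);
        }
    }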
>> * Makes the assumption that the build machine has internet
>> connectivity - if it doesn't, building suddenly gets a lot more
>> complicated
>
> Fair point.

For the previous few points, where you're unable to download the
package for whatever reason, it means you have to duplicate build
instructions: "do this; otherwise, here's how to do it all manually".
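
A package tool can avoid that duplication by making the offline path
part of the normal one. A sketch, with hypothetical paths and URL:

    // Sketch: one fetch step that prefers the network but falls back
    // to a vendored copy, so offline builds follow the same path.
    import std.net.curl : download;
    import std.file : exists, copy, mkdirRecurse;
    import std.stdio : writeln;

    void fetchDependency()
    {
        enum dest = "deps/library-1.0.zip";
        if (exists(dest))
            return; // already fetched - nothing to do, online or offline

        mkdirRecurse("deps");
        try
        {
            download("http://example.com/library-1.0.zip", dest);
        }
        catch (Exception e)
        {
            // No connectivity: fall back to a copy shipped in-tree.
            writeln("download failed (", e.msg, "), using vendored copy");
            copy("vendor/library-1.0.zip", dest);
        }
    }

    void main()
    {
        fetchDependency();
    }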
>> * Source code changes location, build breaks unless a redirect is
>> possible - if it changes protocol it's useless
>
> See my answer regarding a central repo.
>
> My understanding is that you find automated download during the first
> build untenable, but manual download prior to the first build
> acceptable. I don't see such a large fracture between the two cases as
> you do.
I don't have a problem with automatically downloading source during a
first build; I do see a problem with getting the compiler to do it,
though. I don't believe the compiler should have anything to do with
getting source code, unless the compiler also becomes a package manager
and build tool.
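
To make that division of labour concrete, a sketch of the split I
mean - the tool fetches, dmd only compiles; file names are invented:

    // Sketch: a small driver fetches dependencies first, then hands
    // dmd a plain local build - the compiler never touches the network.
    import std.process : execute;
    import std.stdio : writeln;

    void main()
    {
        // Step 1: the package tool resolves and downloads dependencies
        // (as in the earlier sketches), populating ./deps.

        // Step 2: compile with an ordinary import path.
        auto result = execute(["dmd", "-Ideps", "app.d"]);
        if (result.status != 0)
            writeln("build failed:\n", result.output);
    }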
--
Robert
http://octarineparrot.com/