Re: Next focus: PROCESS

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 10:29:55 UTC, Dmitry Olshansky
wrote:

12/14/2012 3:34 AM, deadalnix wrote:

On Thursday, 13 December 2012 at 20:48:30 UTC, deadalnix wrote:
On Thursday, 13 December 2012 at 20:04:50 UTC, Dmitry 
Olshansky wrote:

I think it's good.

But personally I'd expect:

* master to be what you define as dev, because e.g. GitHub 
puts
master as default target branch when making pull requests. 
Yeah, I
know it's their quirk that it's easy to miss. Still leaving 
less
burden to check the branch label for both reviewers and pull 
authors

is a net gain.

* And what you describe as master (it's a snapshot or a 
pre-release

to me) to be named e.g. staging.

And we can as well drop the dead branch 'dev' then.


That also sounds like a good plan to me.


Updated to follow the idea, plus added a bunch of process 
description.

Feel free to comment in order to refine this.

http://wiki.dlang.org/Release_Process


I wasn't comfortable doing speculative edits to the wiki 
directly, so here are a few more comments:


I think one of the major goals is to be able to continue ongoing 
development while at the _same time_ preparing a release. To me 
the number one problem is condensed in the statement "we are 
going to release, do not merge anything but regression fixes"; 
the process should sidestep this lock-based work-flow. It could 
be useful to add something along these lines to the goals 
section (i.e. speed and smoothness are goals).


Second point is about merging master into staging - why not 
just rewrite it with the master branch altogether after each 
release?
master is the branch with the correct history (all new stuff is 
rebased onto it), thus the new staging will have that too.
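For what it's worth, the rewrite described here could be a single command per release. A minimal sketch (branch names taken from this thread; the temporary repo and commit messages are only scaffolding for the demonstration):

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git symbolic-ref HEAD refs/heads/master             # start on master
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "feature work"
git branch staging master                           # cut staging for a release
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "more dev on master" # development keeps going
# After the release ships: recreate staging from master instead of merging.
git branch -f staging master
git log -1 --format=%s staging                      # staging now matches master
```

The `git branch -f` step is the whole "rewrite staging with master" proposal; whether that is acceptable for a *public* branch is exactly what the rest of this thread debates.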


Here is what I proposed on the discussion page; what do you think?

--
I have come up with a good idea for the release schedule. Say
that we are 3 or 4 months before the next LTS release. We need
to branch the staging branch into a 2.N branch (put the contents
of the staging branch in the 2.N branch), where N is the new LTS
version. Then no other features can be included in this 2.N
branch; only bugfixes are allowed. This period will have one RC
every month (?) with the latest fixes in the 2.N branch. After
the 3- or 4-month period we'll tag the 2.N.0 release. Every 4~6
months after that, we'll release a new 2.N.x version with the
latest bugfixes in the 2.N branch, but with no additional
features (including non-breaking features here would just be
more work for the devs; I don't think it is a good idea).

While these bugfix releases are being made, every feature that
stabilizes enough in the master branch is merged into the
staging branch, where a feature shouldn't have many breaking
changes (although changes are still allowed; the branch is not
frozen). Every 3 months, dev releases are made with the features
that made their way into the staging branch. This is done while
the fixes in the 2.N branch are made. Then, 4 months prior to
the next LTS release, a new 2.N+1 branch will be created with
the staging branch contents. The cycle repeats: these 4 months
will have no additional features on the 2.N+1 branch, and
neither will the next 3 years.

This organization, into dev and LTS releases, will provide
releases for those who want a stable environment to develop
against (the LTS releases), while those who want the latest
great features from D will have a somewhat stable environment to
use (the dev releases, somewhat like the ones we have now, maybe
a little more stable). On top of that, the staging branch will
never be frozen, so development will never stop, as someone on
the forums was saying that freezing is a bad idea. And when a
new LTS release is made (2 years after the older LTS), the older
LTS will be maintained for one more year, which means that each
LTS will be maintained for 3 years. What do you think?
RenatoUtsch (talk) 14:14, 15 December 2012 (CET)
--
http://wiki.dlang.org/Talk:Release_Process#Expanding_LTS


Re: Compilation strategy

2012-12-15 Thread RenatoUtsch
On Saturday, 15 December 2012 at 17:05:59 UTC, Peter Alexander 
wrote:
On Saturday, 15 December 2012 at 16:55:39 UTC, Russel Winder 
wrote:
A quick straw poll.  Do people prefer to have all sources 
compiled in a
single compiler call, or (more like C++) separate compilation 
of each

object followed by a link call.


Single compiler call is easier for small projects, but I worry 
about compile times for larger projects...


Yes, I'm writing a build system for D (that will be pretty damn 
good, I think; it has some interesting new concepts), and 
compiling each source separately to an object and then linking 
everything will make it easy to parallelize the build, dividing 
the sources to compile among various threads. Or does the 
compiler already do that if I pass all source files in one call?
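For context, the two modes being compared look roughly like this with dmd (the file names are made up; as far as I know, the compiler does not parallelize across source files within a single invocation, so any parallelism has to come from the build tool):

```shell
# Mode 1: single compiler call -- commonly-imported modules are parsed once.
dmd -ofapp main.d util.d net.d

# Mode 2: separate compilation -- each -c step is an independent process,
# so a build tool can run them concurrently, followed by one link call.
dmd -c main.d &
dmd -c util.d &
dmd -c net.d &
wait
dmd -ofapp main.o util.o net.o
```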


-- Renato Utsch


Re: Moving towards D2 2.061 (and D1 1.076)

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 16:16:11 UTC, SomeDude wrote:
On Friday, 14 December 2012 at 01:26:35 UTC, Walter Bright 
wrote:

On 12/13/2012 5:10 PM, H. S. Teoh wrote:

Remedy adopting D


Saying that would be premature and incorrect at the moment. We 
still have to ensure that Remedy wins with D. This is an 
ongoing thing.


Yes, but what H. S. Teoh wrote about the desperate need for 
process is still true and correct. Like many others here, I 
think it's the biggest problem with D right now for its 
adoption. I for one will never consider using D for my main 
line of work without a true STABLE branch: a branch you can 
rely on. And yet I'm pretty sold on the language, but when your 
project is at stake, what you need is security. And the current 
development scheme doesn't provide that.


Yeah, but if people don't help to define a new process, no 
process will ever be defined.


We are trying to do something like that; any support or ideas 
will be helpful. The community needs to help define this, and 
Walter already said that he will agree with what the community 
defines.


See:
http://wiki.dlang.org/Release_Process
http://forum.dlang.org/thread/ka5rv5$2k60$1...@digitalmars.com


Re: Compilation strategy

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 18:00:58 UTC, H. S. Teoh wrote:

On Sat, Dec 15, 2012 at 06:31:17PM +0100, RenatoUtsch wrote:

On Saturday, 15 December 2012 at 17:05:59 UTC, Peter Alexander
wrote:
On Saturday, 15 December 2012 at 16:55:39 UTC, Russel Winder
wrote:
A quick straw poll.  Do people prefer to have all sources 
compiled
in a single compiler call, or (more like C++) separate 
compilation

of each object followed by a link call.

Single compiler call is easier for small projects, but I worry
about compile times for larger projects...

Yes, I'm writing a build system for D (that will be pretty damn
good, I think; it has some interesting new concepts), and
compiling each source separately to an object and then linking
everything will make it easy to parallelize the build, dividing
the sources to compile among various threads. Or does the
compiler already do that if I pass all source files in one call?

[...]

I find that the current front-end (common to dmd, gdc, ldc) 
tends to
work better when passed multiple source files at once. It tends 
to be
faster, presumably because it only has to parse 
commonly-imported files
once, and also produces smaller object/executable sizes -- 
maybe due to
fewer duplicated template instantiations? I'm not sure of the 
exact
reasons, but this behaviour appears consistent throughout dmd 
and gdc,
and I presume also ldc (I didn't test that). So based on this, 
I'd lean

toward compiling multiple files at once.


Yeah, I did read about this somewhere.

However, in very large projects, clearly this won't work very 
well. If it takes half an hour to build the entire system, it 
makes the code - compile - test cycle very slow, which reduces 
programmer productivity.


So perhaps one possible middle ground would be to link packages
separately, but compile all the sources within a single package 
at once.
Presumably, if the project is properly organized, recompiling a 
single
package won't take too long, and has the perk of optimizing for 
size
within packages. This will probably also map to SCons easily, 
since

SCons builds per-directory.
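Assuming a hypothetical layout with packages src/core/ and src/gui/, the middle ground above could look like this (dmd combines multiple sources into a single object file when -c and -of are given together):

```shell
# One compiler call per package, one object per package, then a link step.
dmd -c -ofcore.o src/core/*.d    # whole 'core' package in one invocation
dmd -c -ofgui.o  src/gui/*.d     # whole 'gui' package in one invocation
dmd -ofapp core.o gui.o          # final link
```

Touching a file then only recompiles its own package, while each package still gets the one-invocation parsing and template-deduplication benefits described above.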


T


Well, the idea is good. Small projects usually don't have many 
packages, so there will be just a few compiler calls. And 
compiling files concurrently will only have a meaningful effect 
if the project is large, and a large project will have a lot of 
packages.


Maybe add an option to choose between compiling all sources at 
once, per package, or per source. For example, in development 
and debug builds the compilation is per file or per package, 
but in release builds all sources are compiled at once, or 
several packages at once.


This way release builds will take advantage of this behavior of 
the frontend, but developers won't have productivity issues. 
And, of course, the behaviour will not be fixed; the devs using 
the build system will choose it.


Re: Compilation strategy

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 18:24:50 UTC, jerro wrote:
On Saturday, 15 December 2012 at 17:31:19 UTC, RenatoUtsch 
wrote:
On Saturday, 15 December 2012 at 17:05:59 UTC, Peter Alexander 
wrote:
On Saturday, 15 December 2012 at 16:55:39 UTC, Russel Winder 
wrote:
A quick straw poll.  Do people prefer to have all sources 
compiled in a
single compiler call, or (more like C++) separate 
compilation of each

object followed by a link call.


Single compiler call is easier for small projects, but I 
worry about compile times for larger projects...


Yes, I'm writing a build system for D (that will be pretty 
damn good, I think, it has some interesting new concepts)


I took a look at your github project; there isn't any code yet, 
but I like the concept. I was actually planning to do something 
similar, but since you are already doing it, I think my time 
would be better spent contributing to your project. Will there 
be some publicly available code in the near future?


I expect to release a first alpha version in about 15~30 days, 
maybe less; it depends on how much time I will have during the 
rest of this month.


Re: Compilation strategy

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 18:44:35 UTC, H. S. Teoh wrote:

On Sat, Dec 15, 2012 at 07:30:52PM +0100, RenatoUtsch wrote:
On Saturday, 15 December 2012 at 18:00:58 UTC, H. S. Teoh 
wrote:

[...]
So perhaps one possible middle ground would be to link 
packages
separately, but compile all the sources within a single 
package at
once.  Presumably, if the project is properly organized, 
recompiling
a single package won't take too long, and has the perk of 
optimizing

for size within packages. This will probably also map to SCons
easily, since SCons builds per-directory.

[...]

Well, the idea is good. Small projects usually don't have many
packages, so there will be just a few compiler calls. And
compiling files concurrently will only have a meaningful effect
if the project is large, and a large project will have a lot of
packages.


Yes, that's the idea behind it.


Maybe add an option to choose between compiling all sources at
once, per package, or per source. For example, in development
and debug builds the compilation is per file or per package,
but in release builds all sources are compiled at once, or
several packages at once.


This way release builds will take advantage of this behavior of
the frontend, but developers won't have productivity issues.
And, of course, the behaviour will not be fixed; the devs using
the build system will choose it.


I forgot to mention also that passing too many source files to 
the compiler may sometimes cause memory consumption issues, as 
the compiler has to hold everything in memory. This may not be 
practical for very large projects, where you can't fit 
everything into RAM.


T


Well, so compiling by packages seems to be the best approach. 
When I return home I will do some tests to see what I can do.


-- Renato


Re: Next focus: PROCESS

2012-12-15 Thread RenatoUtsch

On Saturday, 15 December 2012 at 20:39:22 UTC, deadalnix wrote:
On Saturday, 15 December 2012 at 20:32:42 UTC, Jesse Phillips 
wrote:
On Saturday, 15 December 2012 at 10:29:55 UTC, Dmitry 
Olshansky wrote:
Second point is about merging master into staging - why not 
just rewrite it with master branch altogether after each 
release?
master is the branch with correct history (all new stuff is 
rebased on it) thus new staging will have that too.


The reason you don't rewrite is that it is a public branch. 
Unlike feature branches, which will basically be thrown out, 
everyone on the development team will need to have staging 
updated. If we rewrite history then instead of

$ git pull staging

At random times it will be (I don't know the commands and 
won't even look it up)


It just won't be pretty.
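For completeness, the commands not spelled out above would be roughly `git fetch` followed by `git reset --hard` -- and the hard reset silently discards any local commits on the branch, which is exactly why rewriting a public branch is unpopular. A self-contained simulation (repo names and messages are scaffolding):

```shell
set -e
cd "$(mktemp -d)"
# "upstream" with a public staging branch
git init -q upstream
git -C upstream symbolic-ref HEAD refs/heads/staging
git -C upstream -c user.name=x -c user.email=x@example.com \
    commit -q --allow-empty -m "v1"
git clone -q upstream work                 # a contributor's checkout
# Upstream rewrites history (e.g. staging recreated from master):
git -C upstream -c user.name=x -c user.email=x@example.com \
    commit -q --amend --allow-empty -m "v1 (rewritten)"
# A plain `git pull` in work/ would now fail or produce a bogus merge;
# instead every contributor has to throw away their local staging:
git -C work fetch -q origin
git -C work reset -q --hard origin/staging
git -C work log -1 --format=%s             # follows the rewritten history
```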


I've made modifications to the graphic hoping to illustrate 
some thoughts.


http://i.imgur.com/rJVSg.png

This does not depict what is currently described (in terms of 
branching). But is what I've written under 
http://wiki.dlang.org/Release_Process#Release_Schedule


I see patches going into LTS-1 (if applicable); LTS-1 is then 
merged into the latest LTS, which is merged into any active 
staging, which is then merged into master.


The monthly releases don't get bug fixes (just wait for the 
next month).


I've removed some version numbering since I don't know if we 
should have distinct numbering for LTS and Monthly. I've 
already given some thoughts on this: 
http://forum.dlang.org/post/ydmgqmbqngwderfkl...@forum.dlang.org


Can we drop the LTS name? It reminds me of Ubuntu, and I 
clearly hope that people promoting that idea don't plan to 
reproduce Ubuntu's scheme:
 - it is not suitable for a programming language (as stated 3 
times now, so just read back for why; I won't repeat it).

 - Ubuntu is notoriously unstable.


Of course, let's just call it stable, then. Or do you have a 
better name?


Anyway, I do think that stable releases every 3 or more years, 
plus monthly or quarterly releases, are the best solution for 
current D users.


-- Renato


Re: Next focus: PROCESS

2012-12-13 Thread RenatoUtsch
On Thursday, 13 December 2012 at 15:44:25 UTC, Joseph Rushton 
Wakeling wrote:

On 12/13/2012 10:07 AM, Jonathan M Davis wrote:
That makes a _lot_ more sense than the unstable = testing = 
stable model.


I like the idea of having an LTS release at some interval 
(probably 1 year)
where that branch has bug fix releases more or less monthly. 
We then have a dev
release cycle where we put out a version of the master branch 
every 2 or 3
months or so, where it includes whatever new features we want 
to include but
haven't been merged into the LTS branch yet. Then with the 
next major LTS
release, we merge in all of the new stuff from the master 
branch so that it's
essentially up to date with master (possibly modulo some newer 
features or

other major changes if they weren't deemed ready yet).


I wonder if I maybe created some confusion by conflating LTS 
and stable -- I don't think I should really have done that.


The problem that I see with what you describe is that it gives 
no clear mechanism for how breaking changes should be handled 
-- I don't mean the decision-making process of "Should we make 
this change?" but the means by which users are supported in 
making the shift from one LTS to another.


So, perhaps it would help if I proposed instead:

   -- A _stable_ release is what we all understand: a release that has been
      properly tested and has no outstanding bug reports at the time of first
      release, and to which only bugfixes are subsequently made

   -- LTS should refer to a _cycle_ of stable releases within which no
      breaking changes are made

   -- Successive LTS cycles should overlap so that the previous cycle is
      still receiving bugfixes for some time after the new LTS cycle begins.
      In this way, users have a safe period in which to update their
      codebases to the new LTS.

So -- forgetting D's versioning scheme for a moment -- suppose 
you label a given LTS cycle with N.  Then N.0, N.1, N.2, ... 
would be the stable releases containing new but non-breaking 
features, which might be made annually, or every 6 months.  
N.x.0, N.x.1, N.x.2, ... would be the bugfix-only updates to 
those releases, made as often as necessary.


Once LTS cycle N+1 has begun, then you stop making N.x releases 
and only apply further bug fixes.  Likewise, bugfixes to N.x 
stop getting released once N.x+1 is released.


So, if you assume a stable release annually, and that an LTS 
cycle lasts 3 years, you might have:


1st year    N.0.0
            - N.0.1   [bugfixes]
            - N.0.2
              ...

2nd year    N.1.0   [non-breaking new features]
            - N.1.1
            - N.1.2
              ...
                                [new LTS cycle begins, maybe with
                                 breaking changes]
3rd year    N.2.0     N+1.0.0
            - N.2.1   - N+1.0.1
            - N.2.2   - N+1.0.2
              ...       ...

4th year    N+1.1.0   [entirely moved over to new LTS]
            - N+1.1.1
            - N+1.1.2
              ...

5th year    N+1.2.0   N+2.0.0   [new LTS begins]

6th year    N+2.1.0

7th year    N+2.2.0   N+3.0.0

... etc.

Of course, the timeline could be altered but the point would be 
that for every LTS cycle you'd have up to a year in which to 
switch, and _within_ every LTS cycle you would receive stable 
updates with bugfixes (regularly) and well-tested new features 
(annually, 6-monthly?).


That's surely more work for developers but is probably worth it 
in terms of security for users of D.  In the event that a new 
LTS is accompanied by no breaking changes, you might do away 
with the year's overlap to save the extra work.


If you want to translate the above into actual D version 
numbers, you could do it in the form V.N.x.y where V is the D 
language version (2 for the foreseeable future:-), N the LTS 
cycle, etc.


That leaves out dev releases, but you might want to number 
them differently anyway -- e.g. dev-mmdd or whatever.


As for the standard library and its releases, I wouldn't 
really expect them to
be decoupled from the compiler's releases but rather that they 
would continue
more or less as they have been. I think that it's too often 
the case that
changes in them need to be made at the same time for it to 
really work to

decouple them at this point.


My feeling here was that as it stands Phobos is still somewhat 
more in flux than D itself, not so much in terms of breaking 
changes as in terms of new features being added.  So, I was 
wondering if it might be worthwhile to make new-Phobos-feature 
releases on a more regular basis at least in the short term 
while the library is still being expanded.  But I won't press 
for it too much.


This approach looks best for D. As we have the problem that 
some 

Re: Moving towards D2 2.061 (and D1 1.076)

2012-12-13 Thread RenatoUtsch
On Thursday, 13 December 2012 at 21:37:07 UTC, Walter Bright 
wrote:

On 12/13/2012 12:46 PM, Jacob Carlborg wrote:

On 2012-12-13 18:27, Iain Buclaw wrote:


I am confused at this commit also.


Walter argues that people are already using it so it can't 
just be removed. I
say, they're using an unreleased version of DMD, this is to be 
expected.




They have a large code base, and are using it heavily.


If they used a development compiler to write their code, they 
should be able to continue doing that. This feature needs testing 
before being rock solid and good to release.


Please, let's not release something not thoroughly tested while 
we are in the middle of the new development process discussion; 
we are trying to avoid exactly this kind of thing. Even if the 
feature doesn't introduce backwards-incompatible changes, this 
will cause problems when we discover, in the future, that we 
made some bad decision and won't be able to change it anymore.


Re: Moving towards D2 2.061 (and D1 1.076)

2012-12-13 Thread RenatoUtsch
On Thursday, 13 December 2012 at 23:47:56 UTC, Walter Bright 
wrote:

On 12/13/2012 1:44 PM, deadalnix wrote:
You are engaging the whole community in something you dropped 
here by surprise and then claiming that some people use it. We 
don't even know who they are! How can we support your point?


It's Remedy Games. It's a big deal for them, and their use of D 
is a big deal for us, big enough that we can bend our procedure 
for them. They were also under severe time pressure. They began 
using UDAs the same day I implemented them. Remedy could very 
well be the tipping point for D, and I'm not going to allow 
them to fail.


It's also not a conflict of interest - what they want from D is 
really what we all want from it.


I understand that some of you may be frustrated by my giving 
their needs priority, but I hope that the end result will be 
better for all of us than if I didn't, and that you'll indulge 
me with this.


Maybe this is a special occasion.

With such a big enterprise starting to use D, maybe there are 
more pros than cons.


Linker error

2012-11-30 Thread RenatoUtsch

Hello,

I was trying to play with modules and import, but I couldn't 
understand why this code is giving linker errors:


/classy/widget.d

module classy.widget;

class ReturnItself
{
public ref ReturnItself returnItself()
{
return this;
}

public int returnNumber(in int number)
{
return number;
}
}


/testReturn.d

import std.stdio, std.conv, classy.widget;

void main()
{
auto returnItself = new ReturnItself;
int number = 13;

writeln(to!string(returnItself.returnItself()
.returnItself()
.returnItself()
.returnItself()
.returnNumber(number)));
}


The linker error is the following:

$ dmd testReturn.d
testReturn.o: In function `_Dmain':
testReturn.d:(.text._Dmain+0xa): undefined reference to 
`_D6classy6widget12ReturnItself7__ClassZ'

collect2: error: ld returned 1 exit status
--- errorlevel 1
$


I am using Fedora 17, x86_64.


I will be grateful for all help I can get.


Re: Linker error

2012-11-30 Thread RenatoUtsch

On Friday, 30 November 2012 at 19:09:12 UTC, Maxim Fomin wrote:

On Friday, 30 November 2012 at 18:44:31 UTC, RenatoUtsch wrote:

skipped
$ dmd testReturn.d
...


because you didn't compile the second file.


Man, I was going to say that it didn't work, that I had tested 
it before, but then I noticed that I had added classy.widget.d 
to the command line, not classy/widget.d. Now it works.


Sorry for the dumb mistake.
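Concretely, the fix is that every module the program imports (except the standard library) has to be passed to the compiler as a file path (paths from the original post):

```shell
# Compile and link both modules together; note the path, not the module name:
dmd testReturn.d classy/widget.d
./testReturn
```

Passing only testReturn.d compiles the import declarations fine, but the object code for classy.widget is never produced, hence the undefined reference at link time.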


Re: GDC is this a bug or a feature?

2012-11-11 Thread RenatoUtsch

On Sunday, 11 November 2012 at 10:39:41 UTC, Russel Winder wrote:


| gdc -o lib_helloWorld.so -shared helloWorld.os
/usr/bin/ld: 
/usr/lib/gcc/x86_64-linux-gnu/4.6/libgphobos2.a(object_.o): 
relocation R_X86_64_32S against `_D11TypeInfo_Pv6__initZ' can 
not be used when making a shared object; recompile with -fPIC
/usr/lib/gcc/x86_64-linux-gnu/4.6/libgphobos2.a: could not read 
symbols: Bad value

collect2: ld returned 1 exit status


I've had the same problem when trying to compile a shared 
library, but with dmd.


It turned out that to make a shared library you can't use 
Phobos dependencies, because it isn't ready to work with shared 
libraries. Indeed, if you remove all Phobos dependencies, the 
shared library compiles successfully.


Well, that's the noob explanation; I don't know the details of 
why Phobos doesn't work. My bet is that it is shipped as a 
static library, and (if I'm not wrong) you can't link a static 
library into a shared library...


Still, that is a big problem that D has.


Re: Make [was Re: SCons and gdc]

2012-10-27 Thread RenatoUtsch

On Saturday, 27 October 2012 at 03:33:37 UTC, H. S. Teoh wrote:

On Sat, Oct 27, 2012 at 04:11:02AM +0200, RenatoUtsch wrote:

On Friday, 26 October 2012 at 22:15:13 UTC, Rob T wrote:

[...]

At this point I'm considering looking at those old build tools
written in D, perhaps I can patch one of them up to get it to 
do

what I want.

If anyone has a suggestion as to which of the (I think) 2 or 3
build tools coded in D looked the best, that would be 
appreciated.


--rt

I am currently learning more D to design a new one that should
really end the need for the other ones. If you can wait 1 or 2
months for an alpha release...

If anyone has any suggestions, I would be thankful.


If you're going to write a new build tool, you should at least 
take a

look at some of the ideas and concepts in tup:

http://gittup.org/tup/

Do take some time to read the PDF under additional info; it 
is very

insightful and contains some possibly revolutionary algorithms.
 Don't
just reinvent the same old broken system that make was 30 years 
ago,

just with different packaging and eye-candy.

(Caveat: I've never used tup before. I just read up on it 
today, and was
quite impressed by some of the innovations. Even if you decide 
to take
another approach, you should at least be aware of what's 
cutting-edge
these days so that you aren't just rehashing the same old 
stuff.)



T


Tup has some interesting concepts; maybe I can adapt them to my 
project.


But I was thinking of making something more automated, like 
CMake or SCons, but with tons of innovations.


Re: dlang.org slow?

2012-10-26 Thread RenatoUtsch

On Friday, 26 October 2012 at 08:18:30 UTC, Jens Mueller wrote:

H. S. Teoh wrote:
Is it just me, or has dlang.org been really slow today? Is 
something
wrong with the site that needs our attention? It takes almost 
2 whole
minutes to load a page -- at first I thought my office network 
screwed

up again, but now I'm at home and it's still like that.


Same here. It is doing that for at least two days.
It's getting really annoying.

Jens


Yeah, having that too, and I'm in Brazil.


Re: dlang.org slow?

2012-10-26 Thread RenatoUtsch

On Friday, 26 October 2012 at 08:40:56 UTC, RenatoUtsch wrote:

On Friday, 26 October 2012 at 08:18:30 UTC, Jens Mueller wrote:

H. S. Teoh wrote:
Is it just me, or has dlang.org been really slow today? Is 
something
wrong with the site that needs our attention? It takes almost 
2 whole
minutes to load a page -- at first I thought my office 
network screwed

up again, but now I'm at home and it's still like that.


Same here. It is doing that for at least two days.
It's getting really annoying.

Jens


Yeah, having that too, and I'm in Brazil.


Oh, but not today, I was having that yesterday :P


Re: Make [was Re: SCons and gdc]

2012-10-26 Thread RenatoUtsch

On Friday, 26 October 2012 at 22:15:13 UTC, Rob T wrote:
I'm trying to do a very simple build, but with scons I find 
myself spending much more time on it (and getting nowhere) than 
on the actual coding, and that tells me that it's no better and 
may even be worse than Make.


As an example of the sort of nonsense scons dishes out: I 
cannot, in any reasonable way, specify a build folder above my 
root source folder. Such a thing should be extremely easy to 
specify, but it isn't.


I also found that scons is lacking basic features that any 
build tool should have. For example, when I tried to generate a 
list of my source files, I discovered that scons has no 
built-in ability to recursively scan a folder to include all 
subfolders. WTF!?


I know I can write custom code in Python to get around these 
problems, but that would defeat the entire purpose of having a 
build tool in the first place.


scons also comes with a ton of dependency baggage that I simply 
do not need, therefore I am giving up on it entirely.


At this point I'm considering looking at those old build tools 
written in D, perhaps I can patch one of them up to get it to 
do what I want.


If anyone has a suggestion as to which of the (I think) 2 or 3 
build tools coded in D looked the best, that would be 
appreciated.


--rt


I am currently learning more D to design a new one that should 
really end the need for the other ones. If you can wait 1 or 2 
months for an alpha release...


If anyone has any suggestions, I would be thankful.


Re: Is there any way to create something like this?

2012-10-22 Thread RenatoUtsch
That static version you made, Adam, was just perfect for what I 
need!


Thanks for the help!

Renato

On Sunday, 21 October 2012 at 15:19:24 UTC, Adam D. Ruppe wrote:

On Sunday, 21 October 2012 at 13:10:16 UTC, RenatoUtsch wrote:
Is there any way to make the GeneralLibrary class and the 
createLibrary() (or even better, make the GeneralLibrary 
constructor do that) to work with this kind of construction?


Yes, there are ways to do that, but they might not be what you 
had in mind, because you'll either have to use different static 
types for each version or make GeneralLibrary a dynamic type, 
which means it will error at runtime rather than compile time.


Either way, createLibrary will probably have to be a template.

Here's an example of a static wrapper:

 I just accidentally saved over the static example with the 
dynamic example. Ugh.


I guess what I'll do is link it in:

http://arsdnet.net/dcode/example_dynamic.d

Look at the function createLibrary and pretend the 
dynamicFunctionCall method wasn't there. That's what the static 
example looked like - just opDispatch.


In main, to use the static checks, you must say auto library11 
instead of GeneralLibrary. The interface is only useful for 
dynamic calls. auto gives you the wrapper class with static 
checks.


This isn't very interesting because the wrapper adds nothing; 
you might as well just construct the original object. But you 
could change the function in the wrapper to do something else.


But anyway, each wrapper class created inherits from a 
GeneralLibrary interface, so you could pass them around.. but 
since the interface does nothing, you really have to use auto 
on the return type to actually use the class.
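For readers who don't want to click through, here is a minimal sketch of the static-wrapper idea (class bodies and names are reconstructed from the question below; this is not Adam's actual code):

```d
import std.stdio;

// Assumed minimal versions of the classes from the question:
class Library_1_0 { void func1()  { writeln("func1");  } }
class Library_1_1 : Library_1_0 { void func11() { writeln("func11"); } }

interface GeneralLibrary {}

// Static wrapper: opDispatch forwards any call to the wrapped object.
// Calling a method the wrapped type lacks is a *compile-time* error.
class Wrapper(T) : GeneralLibrary
{
    private T impl;
    this() { impl = new T; }

    auto opDispatch(string name, Args...)(Args args)
    {
        return mixin("impl." ~ name ~ "(args)");
    }
}

// createLibrary has to be a template so each version keeps its own
// static type.
auto createLibrary(double ver)()
{
    static if (ver == 1.0)      return new Wrapper!Library_1_0;
    else static if (ver == 1.1) return new Wrapper!Library_1_1;
    else static assert(0, "unknown library version");
}

void main()
{
    auto library11 = createLibrary!(1.1); // `auto`, not GeneralLibrary
    library11.func1();   // OK: inherited from Library_1_0
    library11.func11();  // OK
    // library11.func12(); // would not compile: no such method
}
```

As Adam notes, the GeneralLibrary interface here adds nothing by itself; the static checks only work when the caller uses `auto`.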


If you aren't doing anything with the wrapper, you could also 
just alias GeneralLibrary to be the newest version of the 
actual class too.




Not much fun so let's look at a dynamic option. This will suck 
too, but in different ways.



This one won't work with overloaded functions, default 
parameters on functions (sad, fixable in some cases, but not 
all), has a speed hit (you could help this a bit by optimizing 
dynamicFunctionCall, but it will always have some slowdown), 
and returns Variants instead of the original type, which is 
kinda annoying.


And, of course, it is a dynamic call, so any failures will only 
happen at runtime.


Here's some code:
http://arsdnet.net/dcode/example_dynamic.d


I haven't tested it fully, but it seems to throw exceptions at 
the right times for the simple functions in here.


This code is messier and includes a compiler bug hack (I 
think)... but it worked. There's some comments in there to talk 
about what it does.


End result:

given
GeneralLibrary library11 = createLibrary!(1.1);

library11.func1(); // Should work
library11.func11(); // Should work
library11.func12(); // Should fail // line 103

Will throw at runtime:
test4.NoSuchMethodException@test4.d(103): No such method: func12


And it *should* work with types too, still doing strong type 
checks, but doing them at runtime for the arguments. All return 
values for these functions are wrapped in Variants, so you'll 
have to pull them out dynamically too..


You could probably combine these two examples and have static 
types if you use auto and dynamic if you use the GeneralLibrary 
interface.





Another option might be to have the createLibrary function be 
aware of all the versions - you'd have to hard code a list that 
it can read at compile time - and then do the kind of thing in 
the static example, but trying to cast to the newer versions of 
the class and throwing if it fails. Then you'd keep the static 
types, but get dynamic checks on if the method is available.


This would be just a loop, cast to a newer class, if not null 
and the call compiles, call the function.


I'm out of screwing around time this morning so I won't write 
this one up, but  it should be doable.





Is there any way to create something like this?

2012-10-21 Thread RenatoUtsch

Hello,

I was trying to imagine how to do something like the code below 
(or similar), because it would be great for the project I am 
creating...


Is there any way to create some kind of object that resembles 
different objects (like a superclass with its subclasses, but 
with transparent access to the subclasses' new methods)?


For example:

[code]
void main() {
GeneralLibrary library = createLibrary(1.0);
GeneralLibrary library11 = createLibrary(1.1);
GeneralLibrary library12 = createLibrary(1.2);
GeneralLibrary library2 = createLibrary(2.0);

library.func1(); // Should work
library.func11(); // Should fail
library.func12(); // Should fail
library3.func2(); // Should fail

library2.func1(); // Should work
library2.func11(); // Should work
library21.func12(); // Should fail
library3.func2(); // Should fail

library2.func1(); // Should work
library2.func11(); // Should work
library2.func12(); // Should work
library3.func2(); // Should fail

library3.func1(); // Should fail
library3.func11(); // Should fail
library3.func12(); // Should fail
library3.func2(); // Should work
}

class Library_1_0 {
void func1() {}
}

class Library_1_1 : Library_1_0 {
void func11() {}
}

class Library_1_2 : Library_1_1 {
void func12() {}
}

class Library_2_0 {
void func2() {}
}
[/code]

Is there any way to make the GeneralLibrary class and the 
createLibrary() function (or, even better, the GeneralLibrary 
constructor) work with this kind of construction?


Re: Is there any way to create something like this?

2012-10-21 Thread RenatoUtsch

Sorry, the code is wrong, the fixed code is:

void main() {
GeneralLibrary library = createLibrary(1.0);
GeneralLibrary library11 = createLibrary(1.1);
GeneralLibrary library12 = createLibrary(1.2);
GeneralLibrary library2 = createLibrary(2.0);

library.func1(); // Should work
library.func11(); // Should fail
library.func12(); // Should fail
library.func2(); // Should fail

library11.func1(); // Should work
library11.func11(); // Should work
library11.func12(); // Should fail
library11.func2(); // Should fail

library12.func1(); // Should work
library12.func11(); // Should work
library12.func12(); // Should work
library12.func2(); // Should fail

library2.func1(); // Should fail
library2.func11(); // Should fail
library2.func12(); // Should fail
library2.func2(); // Should work
}

class Library_1_0 {
void func1() {}
}

class Library_1_1 : Library_1_0 {
void func11() {}
}

class Library_1_2 : Library_1_1 {
void func12() {}
}

class Library_2_0 {
void func2() {}
}