On Sunday, 3 September 2017 at 04:18:03 UTC, EntangledQuanta
wrote:
On Sunday, 3 September 2017 at 02:39:19 UTC, Moritz Maxeiner
wrote:
On Saturday, 2 September 2017 at 23:12:35 UTC, EntangledQuanta
wrote:
[...]
The contexts being independent of each other doesn't change
that we would still be overloading the same keyword with three
vastly different meanings. Two is already bad enough imho (and
if I had a good idea of what to replace the "in" for AAs with,
I'd propose removing that meaning).
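For reference, the two existing meanings being counted can be seen side by side in a minimal D sketch (`in` additionally appears in `in`/`out` contract blocks, which is yet another context):

---
void lookup(in string key)   // meaning 1: `in` as a parameter storage class
{
    int[string] counts = ["a" : 1];
    if (key in counts)       // meaning 2: `in` as the AA membership operator,
    {                        // which yields a pointer to the value, or null
        // ...
    }
}
---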
Why? Don't you realize that the context matters and [...]
Because instead of seeing the keyword and knowing its one meaning
you also have to consider the context it appears in. That is
intrinsically more work (though the difference may be very small)
and thus harder.
Again, I'm not necessarily arguing for them, just saying that
one shouldn't avoid them just to avoid them.
[...]
It's not about ambiguity for me, it's about readability. The
more significantly different meanings you overload some
keyword - or symbol, for that matter - with, the harder it
becomes to read.
I don't think that is true. Everything is hard to read. It's
about experience. The more you experience something, the
clearer it becomes. Only with true ambiguity is something
impossible. I realize that one can design a language to be
hard to parse due to apparent ambiguities, but I am talking
about cases where they can be resolved immediately (at most a
few milliseconds).
Experience helps, of course, but it doesn't change that it's
still just that little bit slower. And every time we allow
such overloading, it encourages more, which adds up in the end.
You are making general statements, and it is not that I
disagree, but it depends on context (everything does). In this
specific case, I think it is extremely clear what `in` means, so
it is effectively like using a different token. Again, everyone
is different, though, and has different experiences that help
them parse things more naturally. I'm sure there are things
that you might find easy that I would find hard. But that
shouldn't stop me from learning about them. It makes me
"smarter", to simplify the discussion.
I am, because I believe it to be generally true for "1 keyword
|-> 1 meaning" to be easier to read than "1 keyword and 1 context
|-> 1 meaning" as the former inherently takes less time.
[...]
Well, yes, as I wrote, I think it is unambiguous (and can thus
be used), I just think it shouldn't be used.
Yes, but the only reason you have given for why it shouldn't be
used is your belief that overloading keywords makes the meaning
harder to parse. My rebuttal, as I have said, is that it is not
harder, so your argument is not valid. All you could do is
claim that it is harder, and we would have to find out who is
more right.
As I countered that in the above, I don't think your rebuttal is
valid.
I have a logical argument against your absolute restriction,
though... in that it causes one to have to use more symbols. I
would imagine you are against stuff like using "in1", "in2",
etc. because they are visibly too close to each other.
It's not an absolute restriction, it's an absolute position from
which I argue against including such overloading on principle.
If it can be overcome by demonstrating that it can't sensibly be
done without more overloading and that it adds enough value to be
worth the increased overloading, I'd be fine with inclusion.
[...]
I would much rather see it as a generalization of the existing
template specialization syntax [1], which t.b.h. this is just
a superset of (the current syntax allows limiting to exactly
one type; you propose limiting to one of n):
---
foo(T: char)      // Existing syntax: limit T to the single type `char`
foo(T: (A, B, C)) // New syntax: limit T to one of A, B, or C
---
Yes, if this worked, I'd be fine with it. Again, I couldn't
care less. `:` == `in` for me as long as `:` has the correct
meaning of "can be one of the following" or whatever.
But AFAIK, `:` is not "can be one of the following" (which is
"in" or "element of" in the mathematical sense); it can also
mean "is a derived type of".
Right, `:` is indeed an overloaded symbol in D (and ironically,
unlike with `in`, I think all its meanings are valuable
enough to be worth the cost). I don't see how that would
interfere in this context, though, as we don't actually overload
a new meaning onto it (it's still "restrict this type to the
thing on the right").
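To make the comparison concrete, here is a sketch of the specialization that compiles today, next to the constraint spelling D already offers for "one of n" (the `(A, B, C)` form itself is the hypothetical proposal and is not valid D):

---
// Existing syntax: this overload is preferred when T is exactly char.
void foo(T : char)(T value) { /* char-specific path */ }
void foo(T)(T value) { /* generic fallback */ }

// Proposed (hypothetical, does not compile today):
// void foo(T : (A, B, C))(T value) { ... }

// What current D offers for "one of n": a template constraint.
void bar(T)(T value)
    if (is(T == int) || is(T == long) || is(T == double))
{
    // only instantiable for the three listed types
}
---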
If that is the case then go for it ;) It is not a concern of
mine. You tell me the syntax and I will use it. (I'd have no
choice, of course, but if it's short and sweet then I won't
have any problem).
I'm discussing this as a matter of theory; I don't have a use
for it myself.
[...]
Quoting a certain person (you know who you are) from DConf
2017: "Write a DIP".
I'm quite happy to discuss this idea, but at the end of the
day, as it's not an insignificant change to the language,
someone will have to do the work and write a proposal.
My main issue with going through the trouble is that,
basically, I have more important things to do. If I were going
to try to get D to make all the changes I actually wanted, I'd
be better off writing my own language the way I envision and
want it... but I don't have 10+ years to invest in such a beast,
and doing it right would require my full attention, which I'm
not willing to give, because again, I have better things to
do (things I really enjoy).
So, all I can do is hopefully stoke the fire enough to get
someone else interested in the feature and have them do the
work. If they don't, then they don't, that is fine. But I feel
like I've done something to try to right a wrong.
That could happen, though historically speaking, things have
usually gotten included in D only when the major proponent of
something like this does the hard work (otherwise such proposals
seem to just fizzle out).
[...]
AFAIK the difference between syntax sugar and enabling syntax
in PLs usually comes down to the former allowing you to
express concepts already representable by other constructs in
the PL; when encountered, the syntax sugar could be lowered by
the compiler to the more verbose syntax and still be both
valid in the PL and recognizable as the concept (while this is
vague, a prominent example would be lambdas in Java 8).
Yes, but everything is "lowered"; it's just how you define it.
Yes, and w.r.t. my initial point, I did define it as "within the
PL itself, preserving the concept".
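A D example of that kind of lowering: `foreach` over a range is sugar the compiler can rewrite into a plain `for` loop using the range primitives, and the lowered form is still valid, recognizable D (a rough sketch; the actual compiler rewrite differs in details):

---
// Sugar:
foreach (e; range) { process(e); }

// Roughly lowers to:
for (auto __r = range; !__r.empty; __r.popFront())
{
    auto e = __r.front;
    process(e);
}
---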
[...]
Why do you think that? Less than ten people have participated
in this thread so far.
I am not talking about just this thread, I am talking about in
all threads and all things in which humans attempt to determine
the use of something. [...]
Fair enough, though personally I'd need to see empirical proof of
those general claims about human behaviour before I could share
that position.
[...]
Why do you assume that? I've not seen anyone here claiming
template parameter specialization to one of n types (which is
the idea I replied to) couldn't be done in theory, only that
it can't be done right now (the only claim as to that it can't
be done I noticed was w.r.t. (unspecialized) templates and
virtual functions, which is correct due to D supporting
separate compilation; specialized templates, however, should
work in theory).
Let me quote the first two responses:
"It can't work this way. You can try std.variant."
That is a reply to your mixing (unspecialized) templates and
virtual functions, not to your idea of generalizing specialized
templates.
and
"It is not possible to have a function be both virtual and
templated. A function template generates a new function
definition every time that it's a called with a new set of
template arguments. [...]"
Same here.
Now, I realize I might not have been clear about things and
maybe there is confusion/ambiguity in what I meant, how they
interpreted it, or how I interpreted their response... but
there is definitely no sense of a "Yes, we can make this work
in some way..." type of mentality.
E.g., "Templates and virtual functions simply don't mix."
That is an absolute statement. It isn't even qualified with "in
D".
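For the record, the restriction those replies refer to is real in current D: a templated member function generates a new definition per instantiation, so it cannot occupy a fixed vtable slot and is implicitly non-virtual (a minimal sketch):

---
class Base
{
    // Virtual: one fixed entry in the vtable, overridable at runtime.
    void process(int x) { }

    // Template: a fresh function per instantiation at compile time, so
    // it cannot be virtual; D treats templated member functions as final.
    void process(T)(T x) { }
}
---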
[...]
Actually, I disagree here. It only *needs* filling if enough
users of the language actually care about it not being there.
Otherwise, it's a *nice to have* (like generics in Go, or
memory safety in C :p ).
Yes, on some level you are right... but again, who's to judge?
[...]
Ultimately, Walter and Andrei, as AFAIK they decide what gets
into the language.