HaloO,

Luke Palmer wrote:
But junctions are so "special" that this abstraction wouldn't work.
Well, my point is that I doubt it is tractable for the compiler to come up with the dwimmery to pull the invocation of .values on the return value out of toomany and leave the cardinality check in, depending on whether the input is a junction or not. That is, to rewrite the junctive case to:

   sub toomany($j)
   {
      if $j > 4 { die "Too many values" }
      return $j;
   }

   my ($j,$k,$l) = map { toomany(get_junction().values) } ^3;

Sort of a junctive compile :)

My syntactical idea is that this might work if you naively write it without an explicit .values, with a junction numerifying to its cardinality. To get the junctive behaviour to trigger as you outline, it would be

   sub toomany(&j)   # or perhaps even *&j
   {
      if j > 4 { die "Too many values" }
      return &j;
   }

while an &j.values in the if still keeps the junction untriggered. Then an uncertain j.values might result in the compile-time warning "return value of j not necessarily a junction". The only remaining thing for @Larry is to define an exclusive set of methods for querying junctions. Sort of like the "handle with care" stickers on containers!
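To make the "numerifying to its cardinality" idea concrete, here is a hypothetical Python sketch (not Perl 6, and not any proposed implementation): a junction-like container whose numification yields the number of eigenstates, so the cardinality check can stay inside toomany without an explicit .values call. All names here are invented for illustration.

```python
# Hypothetical sketch (not Perl 6) of a junction that numerifies to its
# cardinality, so a plain numeric comparison counts eigenstates instead
# of autothreading over them.

class Junction:
    def __init__(self, *values):
        self.values = values

    def __int__(self):
        # Numification yields the cardinality, as proposed above.
        return len(self.values)

def toomany(j):
    # The cardinality check stays inside; no .values needed.
    # (Python needs an explicit int() where Perl 6 would numerify.)
    if int(j) > 4:
        raise ValueError("Too many values")
    return j

j = Junction(1, 2, 3)
assert toomany(j) is j            # 3 eigenstates: passes through

try:
    toomany(Junction(1, 2, 3, 4, 5))
except ValueError as e:
    print(e)                      # Too many values
```

The design point is that the comparison never threads over the values; it only ever sees the count.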
Yes, yes, you say "oh, just put a 'Junction' type on the $j parameter". That's not the point. The point is that now I have to be aware whenever I'm using a junction, and I have to declare all my generics as being able to take junctions, instead of it just being obvious that they do because I didn't put any constraints on the values. I figure that if you don't have any constraints on the values, you're not assuming anything about them, so using a junction should be fine.
Could it be that you are thinking about it the wrong way around? I mean, the values should be in the grammatical-object slot: "..., so a junction using *them* should be fine."
Of course, this was introduced for a reason:

   sub min($x,$y) { $x <= $y ?? $x !! $y }

   sub min2($x, $y)
   {
      if $x <= $y { return $x }
      if $x > $y  { return $y }
   }

In the presence of junctions, these two functions are not equivalent.
Neither are they equivalent in the presence of references. Consider

   min(\3, {\3}) === min2({\3}, \3)

when the outer, non-closure refs are created and optimized away by the compiler and the inner ones later, depending on the order in which === evaluates its args. I mean that it sees 3 === \3 and returns false. Well, or I don't understand === ;)
In fact, it is possible that both or neither of the conditionals succeed in min2(), meaning you could change the order of the if statements and it would change the behavior. This is wacky stuff, so we said that you have to be aware that you're using a junction for "safety".
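The divergence between min() and min2() can be simulated outside Perl 6. The following is a hypothetical Python sketch (the Junction class and min_/min2 names are invented here): comparisons thread over the eigenstates and collapse in boolean context, so with an any-junction both conditionals in min2 can succeed, and with an all-junction neither can.

```python
# Hypothetical Python sketch (not Perl 6) simulating autothreading
# junctions, to show why min() and min2() can disagree.

class Junction:
    """An any/all junction over a set of eigenstates."""
    def __init__(self, kind, values):
        assert kind in ("any", "all")
        self.kind = kind
        self.values = values

    def _thread(self, op, other):
        # Thread the comparison over the eigenstates, then collapse
        # the junction of booleans in boolean context.
        results = [op(v, other) for v in self.values]
        return any(results) if self.kind == "any" else all(results)

    def __le__(self, other):
        return self._thread(lambda a, b: a <= b, other)

    def __gt__(self, other):
        return self._thread(lambda a, b: a > b, other)

def min_(x, y):
    return x if x <= y else y

def min2(x, y):
    if x <= y:
        return x
    if x > y:
        return y
    return None  # neither conditional fired

j = Junction("any", [1, 5])
# any(1,5) <= 3 is true (1 <= 3) AND any(1,5) > 3 is true (5 > 3):
# both conditionals succeed, so min2's answer depends on branch order.
print(min_(j, 3) is j)   # True
print(min2(j, 3) is j)   # True here; swapping the ifs would return 3

k = Junction("all", [1, 5])
# all(1,5) <= 3 is false AND all(1,5) > 3 is false:
# neither conditional succeeds and min2 falls through.
print(min2(k, 3))        # None
```

This is exactly the "both or neither" situation: reordering the two ifs in min2 changes the result, which is why the two supposedly equivalent formulations drift apart.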
But the control flow is on the outside. Junctions help people to obfuscate things like

   if min2(1,0) && min2(0,1) { die }

where the inputs to && are returned from different spots in min2.
I figure that if something says it's totally ordered (which junctions do simply by being allowed arguments to the <= function), both of these functions should always be the same. The fact is that junctions are not totally ordered, and they shouldn't pretend that they are.
Junctions aren't totally ordered, of course. But neither are they arguments to <=. The reverse is true: <= is an argument to the junction. And my point originally was about inventing a pleasing syntax that highlights this fact.

E.g. the Complex type might support <= such that it returns true in case of equality and undef otherwise. The latter might lead people to regard it as false and hopefully cross-check with >, and then draw the right conclusion after receiving another undef and promoting it to false: namely that the net effect was a negated equality check!

A gray view of a picture emerges from a blurred array of black and white pixels, and it takes more than two primal constituents for a color picture to emerge; e.g. black, white and fast motion in the right patterns deceive the human eye into perceiving color! Or think of contexts where 5 is not considered prime! Is it always? If not, in which context? The real objective of a type system is to drive undefinedness out of the essential parts of a program.
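One reading of the Complex example above can be sketched in Python (hypothetical code, not from the thread): a three-valued "<=" that is true on equality and undefined otherwise, and what a caller actually learns when it quietly promotes that undef to false.

```python
# Hypothetical sketch of a partial order on complex numbers where
# "<=" is True on equality and undefined (None) otherwise.

def complex_le(a, b):
    return True if a == b else None   # incomparable -> undefined

def complex_gt(a, b):
    # Strictly-greater never holds; undefined for unequal operands.
    return None if a != b else False

a, b = 1 + 2j, 3 + 4j

# Promoting None to False makes "<=" degenerate into an equality test:
assert bool(complex_le(a, b)) == (a == b)
assert bool(complex_le(a, a)) == (a == a)

# The cross-check with ">" yields undef again for unequal values, so
# the careful reader learns only that a and b are incomparable:
assert complex_le(a, b) is None and complex_gt(a, b) is None
```

The hazard is the silent promotion: once undef is coerced to false, the caller can no longer distinguish "b is not greater" from "the comparison was never defined".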
The other thing that is deeply disturbing to me, but apparently not to many other people, is that I could have a working, well-typed program with explicit annotations. I could remove those annotations and the program would stop working!
The thing that haunts me is that junctions only make sense if they bubble up into the control heaven so far that, to achieve invariance, a *complete* program has to be run for all conceivable permutations of paths through the call graph! Might be nice for testing if given enough time, though :)
In the literature, the fact that a program is well-typed has nothing to do with the annotations in the program; they're there for redundancy, so they can catch you more easily when you write a poorly-typed program. But in order to use junctions, using and not using annotations can both produce well-typed programs, *and those programs will be different*.
That depends on your definition of different. Any UNIX process is easily described as "running until it has determined its exit code, which is then returned to its parent process". This e.g. covers network buffer overflow attacks that, depending on their nature, suddenly make the program try to crack the root password! Well, or think of the wonderful hard constraint that a 32-bit processor will never access memory outside of the 4GB range ;)

There are two sides to correctness: the spec and the implementation. No spec at all makes any program correct. Contradicting specs make any implementation fail for one or the other reason. In other words, there are also incorrect specs. But that we have known since Gödel ;)
--