On Mon, Jul 29, 2013 at 2:12 AM, William ML Leslie <
[email protected]> wrote:

> The most obvious way, it seemed, to talk formally about the signature
> was to use a common language to represent working with typeclasses.
> If that is System F_c (which seemed a good choice) then any time
> you've got parametric types or constraints you've got type-level
> arguments.
>

Fair enough, and I agree that makes sense.
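Concretely, the elaboration looks something like this (a sketch in
Haskell rather than BitC; OrdDict and sortWith are illustrative names,
and the type argument that F_c makes explicit stays implicit in the
source):

    -- Explicit-dictionary reading of "Ord a => [a] -> [a]": the
    -- constraint becomes an ordinary value argument.
    data OrdDict a = OrdDict { cmp :: a -> a -> Ordering }

    -- Insertion sort written against the explicit dictionary.
    sortWith :: OrdDict a -> [a] -> [a]
    sortWith d = foldr ins []
      where
        ins x []     = [x]
        ins x (y:ys) = case cmp d x y of
          GT -> y : ins x ys
          _  -> x : y : ys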


> > At run time, you'll actually be calling a specialization of sort
> > that takes a single argument: the array.
>
> And possibly the OrdDict if you haven't specialised for the particular 'a?
>

In a language with unboxed types, not specializing for 'a is rarely
feasible. You can do partial specializations if all you are doing with that
parameter is carrying the pointer around or copying it, but the minute you
do an operation on that object you need to know things like field offsets,
and you *really* don't want to be pulling field offsets out of a dictionary
at run time. At that point even a really stupid JIT win.


>
> > Conceptually, the selection of instance should be documented in
> > every case. But when the instance selected satisfies the "cT or cR
> > or prologue" rule, everybody knows where to look for it, and we can
> > say "in the absence of a named resolution, we understand it to mean
> > that the type class was resolved using whatever instance was
> > appropriate under the 'cT or cR or prologue' rule." That is: we're
> > dropping the annotation purely as a relief on the poor user's eyes.
> > Logically it is still present.
> >
> > I have a memory that when two sides of a procedure call disagree
> > about the default instance, soundness problems can arise. That is:
> > the consequences of instance incoherence are worse than just
> > awkwardness and incomprehensibility. This is why it is so
> > important, when the two sides might disagree, to make it possible
> > for them to check. Unfortunately I can't reconstruct the bad
> > example, and I can't remember which paper raised the issue.
>
> Ok, well now I see where you're coming from.  I'll have to see what I
> can find on the subject (because obviously what I want is not to care
> about your instance of Ord R).
>

Forgetting the type issue for a moment, there's no way you can "not care".
When you call a function, you expect its behavior to have a sensible
meaning (in the human sense). In practice, this means that either (a) you
supply that meaning explicitly as an argument, or (b) you know what the
meaning is by virtue of shared conventions.

Case (a) calls for explicit instances, and I think it's a fine thing for a
language to support them.
Case (b) is the reason that you need a shared understanding of the rules
for automatic instance resolution.
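
I still can't reconstruct the published example, but the flavor of
the failure is easy to sketch with explicit dictionaries standing in
for two instances that disagree (Haskell; all names illustrative):

    data OrdDict a = OrdDict { cmp :: a -> a -> Ordering }

    ascending, descending :: OrdDict Int
    ascending  = OrdDict compare
    descending = OrdDict (flip compare)

    -- Membership on a sorted list, stopping early; only correct if
    -- the list was built under the *same* dictionary.
    member :: OrdDict a -> a -> [a] -> Bool
    member _ _ [] = False
    member d x (y:ys) = case cmp d x y of
      LT -> False      -- everything later is assumed larger still
      EQ -> True
      GT -> member d x ys

    -- One side builds [1,2,3,4] under 'ascending'; the other searches
    -- under 'descending': member descending 3 [1,2,3,4] == False,
    -- even though 3 is in the list. The structure's invariant breaks
    -- silently; nothing in the types flags the disagreement.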


> I /don't/ think there is anything interesting about the idea of a
> default instance.  Intuitively, inferring the instance lexically feels
> like generic functions (in the lisp, not the cli, sense) or extension
> methods; but I don't think it would be a huge burden to always name
> them. (Maybe this is just a side-effect of having to write C# and Java
> for many years?)
>

I think you'd be *staggered* to see just how many instances are required if
you can't elide them. We had a switch we could flip in the BitC compiler at
one point that told it *not* to resolve instances eagerly so we could see
what they looked like. It's not at all unusual for a relatively
innocuous-looking procedure to need 40 instance resolutions.
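
For a sense of scale, even a one-liner racks them up (Haskell,
illustrative):

    -- With elision turned off, every operation below names an
    -- instance: Show for show, Ord for maximum and minimum, Num for
    -- (+). Real procedures accumulate these resolutions quickly.
    summarize :: (Show a, Num a, Ord a) => [a] -> String
    summarize xs = show (maximum xs + minimum xs)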

As to the rest, sure, except of course that if instance resolution occurred
lexically we wouldn't have an instance coherence problem in the first place.


shap