On Tue, Feb 17, 2015 at 3:17 AM, Geoffrey Irving <[email protected]> wrote:
> On Tuesday, February 17, 2015, Jonathan S. Shapiro <[email protected]> wrote:
>>
>> On Mon, Feb 16, 2015 at 12:54 PM, Geoffrey Irving <[email protected]> wrote:
>>>
>>> On Mon, Feb 16, 2015 at 12:49 PM, Jonathan S. Shapiro <[email protected]>
>>> wrote:
>>> >
>>> > In this scenario, we have adequate information at the definition site
>>> > and in the written form of the type to determine what the arity must
>>> > be, but we do NOT have enough information at the application site.
>>> > Given a procedure:
>>> >
>>> >     def perverse f a b = f a b
>>> >
>>> > we cannot determine whether f has type fn 'a -> (fn 'b -> 'c) or
>>> > alternatively has type fn 'a 'b -> 'c.
>>>
>>> It might be reasonable to extend this as you say to support automatic
>>> conversion, but it does introduce a weird asymmetry where *fewer*
>>> arguments are disallowed but *more* arguments are fine. If more is
>>> fundamentally easy to implement and understand, while fewer is not,
>>> this asymmetry might be reasonable, but it seems odd.
>>
>> I'm not seeing that. However the application is parenthesized due to
>> arity, the number of arguments present must still satisfy the function
>> type.
>>
>> Actually, I think you may have it backwards. If we admit this peculiar
>> form of generalization, *fewer* arguments are OK. In particular:
>>
>>     def f x y = lambda z { return x + y + z; }
>>
>> can be applied correctly as either "f u v" or "f u v w". The requirement
>> would be that all applications must satisfy the natural arity of the
>> function being applied; we do not inject lambdas implicitly.
>
> That's saying the same thing. Not sure why one of us says "fewer" where
> the other says "more", but we don't disagree. Maybe a
> contravariance/covariance thing? :)
>
>> I grant that that outcome may have puzzling consequences for the user,
>> so perhaps it will turn out not to be wise. I'm just saying I think it
>> can be done.
>
> I agree that the asymmetry is real. The effect of "fewer" (in my
> terminology) is allocation; the effect of "more" is a somewhat slower
> calling convention. Allocation is the worse of the two.
>
> On the other hand, in my experience "fewer" (automatic lambdas) is used
> far more often than "more" (multiple applications). Thus, I'd guess that
> the benefit of allowing implicit "more" may not be enough to justify the
> implementation complexity and the need to explain magic to systems
> programmers.
I agree. Also, I don't like that (f u v w) would be inferred as a
3-application when, if f's arity is known, it could equally be a
2-application followed by a 1-application. This whole issue of asymmetry
and arbitrary inference doesn't come up in Shap's new arity specialization
proposal, right? The type inference and checking would all be done with
pre-specialization, arity-agnostic types, right?

_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
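For readers following along, the two readings under discussion can be sketched in Python, used here purely as an illustrative stand-in since the BitC-style definitions in the thread aren't runnable; all names below are hypothetical. Python functions have a fixed arity and closures must be applied explicitly, which makes the fewer/more asymmetry visible.

```python
# Mirrors "def f x y = lambda z { return x + y + z; }" from the thread:
# f has natural arity 2 and returns a closure over x and y.
def f(x, y):
    return lambda z: x + y + z

# "Fewer" arguments: a 2-application allocates and returns a closure.
partial = f(1, 2)
assert partial(3) == 6

# "More" arguments: a flat 3-application f(1, 2, 3) would violate f's
# natural arity; without compiler support it must be written explicitly
# as a 2-application followed by a 1-application.
assert f(1, 2)(3) == 6

# The "perverse" ambiguity: both readings of f's type can make
# "def perverse f a b = f a b" work, but the body must commit to one
# parenthesization, which is exactly what cannot be inferred at the
# application site.
def perverse_curried(f, a, b):
    return f(a)(b)        # assumes f : fn 'a -> (fn 'b -> 'c)

def perverse_flat(f, a, b):
    return f(a, b)        # assumes f : fn 'a 'b -> 'c

assert perverse_curried(lambda a: lambda b: a + b, 1, 2) == 3
assert perverse_flat(lambda a, b: a + b, 1, 2) == 3
```

The cost asymmetry discussed above shows up here too: the "fewer" path materializes a closure object (allocation), while the "more" path is just two successive calls (a slower calling convention, but no allocation).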
