On Wed, Aug 28, 2013 at 3:39 AM, Bennie Kloosteman <[email protected]> wrote:
> +1 for typed typedefs. Too many ints in C; it would be very useful even
> in business apps to have, say, typedef int CustomerKey; it will stop a
> lot of bugs when you use the wrong foreign key in CRUD code.

From a language design perspective, typedefs are an interesting issue. The
question is: does a typedef introduce a new, distinct type?

If so, then we're really talking about NewType, and none of the operators
on the underlying type work on the new type (unless suitably extended).
The type rules for that extension aren't always simple.

If not, then "typedef" should properly be named "typealias". Typealias
introduces challenges for inference: when we report a type that has a type
alias, should we write it using the typealias or the original type? There
is no universally right answer to that question.

Typealias and NewType both have their purposes.

> Enums I'm mixed on. Using them in C# is normally a problem, but for
> interop and [Flags] they're gold, and you need them for many .NET
> standard libs, so I'd leave them. Maybe just add an Enum type in the
> new standard lib but leave enums for interop.
>
> Speaking of which, will you retain compatibility with existing .NET
> libs...?

Since .NET isn't the goal, who cares?

> Thanks for the suggestion. We've considered emitting tail call
> instructions at a number of points in the development of the C#
> compiler. However, there are some subtle issues which have pushed us to
> avoid this so far:
>
> 1) There is actually a non-trivial overhead cost to using the .tail
> instruction in the CLR (it is not just a jump instruction, as tail
> calls ultimately become in many less strict environments, such as
> functional language runtimes where tail calls are heavily optimized).
>
> 2) There are few real C# methods where it would be legal to emit tail
> calls (other languages encourage coding patterns which have more tail
> recursion, and many that rely heavily on tail call optimization
> actually do global rewriting, such as Continuation Passing Style
> transformations, to increase the amount of tail recursion).
>
> 3) Partly because of 2), cases where C# methods stack overflow due to
> deep recursion that should have succeeded are fairly rare.

In short, they haven't done the work, so people don't use the
under-developed feature, so the demand for the work doesn't exist.
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
