RE: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Simon Peyton Jones
| > Are these three technical capabilities *all* that we would need?
| > Perhaps
| > we also need a way to tie the current language (-XHaskell98,
| > -XHaskell2010) to a particular implementation of the Prelude.
| >
| >
| > I don't have a concrete plan here. I'm not even sure one can be
| > achieved that works. I'd say that the burden of figuring out such a
| > thing falls on the party that can create a plan, pitch it to the
| > community and potentially implement it.

In fact there is more than one concrete plan: 
https://ghc.haskell.org/trac/ghc/wiki/IntrinsicSuperclasses

All are complex, only partially designed, entirely unimplemented (and the 
implementation will be non-trivial), and lacking an active champion.  The one I 
link to above is probably the leading contender, but it feels too complicated 
to me.

Simon



Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Geoffrey Mainland
On 10/22/2015 02:25 PM, Edward Kmett wrote:
>
> On Thu, Oct 22, 2015 at 1:41 PM, Geoffrey Mainland
> <mainl...@apeiron.net> wrote:
>
> On 10/22/2015 01:29 PM, Edward Kmett wrote:
> > On Thu, Oct 22, 2015 at 12:59 PM, Geoffrey Mainland
> > <mainl...@apeiron.net> wrote:
> >
> >
> > I am not against changing the Prelude! But it sure would be
> nice if
> > -XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010
> gave me a
> > Haskell 2010 Prelude, both of which could be used with external
> > packages
> > that themselves used the more modern Prelude.
> >
> >
> > It would definitely be a preferable state of affairs. Unfortunately,
> > at least with the tools available to us today, such a plan is
> > incompatible with any plan that introduces a new superclass. It also
> > cuts off plans that ever factor an existing class into two, such as
> > the MonadFail proposals. We simply do not at this time have the
> > technical capabilities that would support such a system. If they
> > showed up in GHC we can adapt plans to fit.
>
> Great!
>
> Could we work to characterize what technical capabilities we would
> need
> to support full backwards Prelude compatibility?
>
> Here is my rough understanding of what we would need:
>
> 1) Some method for "default superclasses." This would solve the
> AMP issue.
>
> 2) A method for factoring existing classes into two (or more) parts.
> This would solve the MonadFail problem.
>
> 3) A method for imposing extra superclass constraints on a class. This
> would be needed for full Num compatibility. Seems much less important
> than 1 and 2.
>
> The most thought has gone into 1.
>
>
> Are these three technical capabilities *all* that we would need?
> Perhaps
> we also need a way to tie the current language (-XHaskell98,
> -XHaskell2010) to a particular implementation of the Prelude.
>
>  
> I don't have a concrete plan here. I'm not even sure one can be
> achieved that works. I'd say that the burden of figuring out such a
> thing falls on the party that can create a plan, pitch it to the
> community and potentially implement it.
>
> If I enumerate a set of conditions here I'm basically implying that
> I'd agree to any plan that incorporated them. I'm just not prepared to
> make that commitment sight-unseen to something with unknown warts and
> implications.
>
> I can, however, say that it is plausible that what you have enumerated
> above could potentially address the outstanding issues, but I don't
> know how good of a compromise the result would be. 
>
> -Edward

I don't have a concrete plan either, nor am I sure that one is possible.
But I don't see how having a conversation about how one might achieve
backwards compatibility would commit anyone to anything. Any eventual
proposal would have to go through the same approval process as every
other proposal. And even if we did have a hypothetical draft proposal
that you had at some point stated you approved of in some way, you would
always be free to change your mind!

Isn't the libraries list exactly where this sort of conversation should
happen?

Cheers,
Geoff


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 1:37 PM, Gregory Collins wrote:

>
> On Wed, Oct 21, 2015 at 11:40 PM, Edward Kmett  wrote:
>
>> All I'm saying is that if we want to appeal to or cater to working
>>> software engineers, we have to be a lot less cavalier about causing more
>>> work for them, and we need to prize stability of the core infrastructure
>>> more highly. That'd be a broader cultural change, and that goes beyond
>>> process: it's policy.
>>>
>>
>> The way things are shaping up, we've had 17 years of rock solid stability
>>
>
> I have >95% confidence that all of the C++ programs I wrote 15 years ago
> would build and work if I dusted them off and typed "make" today. I have
> Haskell programs I wrote last year that I probably couldn't say that about.
>
> So I don't buy that, at all, at least if we're discussing the topic of the
> stability of the core infrastructure in general rather than changes being
> made to the Prelude. It's been possible to write to Haskell 98 without too
> much breakage, yes, but almost nobody actually does that; they write to
> Haskell as defined by GHC + the boot libraries + Haskell platform +
> Hackage, IMO with decreasing expectations of stability for each. The core
> set breaks a lot.
>

I definitely agree here.

We have a lot of libraries in the Haskell Platform that have fairly liberal
change policies. On the other hand, we have a policy of "maintainer
decides" around issues. This yields a fairly decentralized change
management process, with different maintainers who have different views.
The Platform gives us a central pool of packages that are generally the
"best of breed" in their respective spaces, but gives us few stability
guarantees.

Heck, every release I wind up having to change whatever code I have that
uses template-haskell or Typeable.

On the other hand, it isn't clear that, with a larger "core" platform with
harder stability guarantees, we have a volunteer force that could and would
sign up for the long slog of maintenance without that level of autonomy.


> We definitely shouldn't adopt a posture to breaking changes as
> conservative as the C++ committee's, and literally nobody in the Haskell
> community is arguing against breaking changes in general, but as I've
> pointed out, most of these breakages could have been avoided with more
> careful engineering, and indeed, on many occasions the argument has been
> made and it's fallen on deaf ears.
>

I would argue that there are individual maintainers who give the lie to
that statement. In many ways Johan himself has served as a counter-example
there. The libraries he has maintained have acted as a form of bedrock with
long maintenance windows. On the other hand, the burden of maintaining that
stability seems to have ultimately burned him out.

> They can speak for themselves but I think for Mark and Johan, this is a
> "straw that broke the camel's back" issue rather than anything to do with
> the merits of removing return from Monad. I think the blowback just happens
> to be so much stronger on MRP because the breaking change is so close to
> the core of the language, and the benefits are so nebulous. Fixing an
> aesthetic problem has almost zero practical value
>

I personally don't care about the return side of the equation.

Herbert's MRP proposal was an attempt by him to finish out the changes
started by AMP so that a future Haskell Report can read cleanly. Past
reports have been remarkably free of historical baggage.

I'd personally readily sacrifice "progress" there in the interest of
harmony. Herbert as haskell-prime chair possibly feels differently.


> and ">> could be slightly more efficient for some monads" is pretty weak
> sauce.
>

The issue right now around (>>) is that it has knock-on effects that run
pretty far and wide. "Weak sauce" or not, it means through second-order
consequences that we can't move the useless mapM and sequence to the top
level from their current status as members of Traversable, and that users
have to care about which of two provably equivalent things they are using,
at all times.

It means that code that calls mapM will be less efficient, and that mapM_
behaves with rather radically different space and time behavior than mapM
today, and not in a consistently good way.

-Edward


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 1:41 PM, Geoffrey Mainland wrote:

> On 10/22/2015 01:29 PM, Edward Kmett wrote:
> > On Thu, Oct 22, 2015 at 12:59 PM, Geoffrey Mainland
> > <mainl...@apeiron.net> wrote:
> >
> >
> > I am not against changing the Prelude! But it sure would be nice if
> > -XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010 gave me a
> > Haskell 2010 Prelude, both of which could be used with external
> > packages
> > that themselves used the more modern Prelude.
> >
> >
> > It would definitely be a preferable state of affairs. Unfortunately,
> > at least with the tools available to us today, such a plan is
> > incompatible with any plan that introduces a new superclass. It also
> > cuts off plans that ever factor an existing class into two, such as
> > the MonadFail proposals. We simply do not at this time have the
> > technical capabilities that would support such a system. If they
> > showed up in GHC we can adapt plans to fit.
>
> Great!
>
> Could we work to characterize what technical capabilities we would need
> to support full backwards Prelude compatibility?
>
> Here is my rough understanding of what we would need:
>
> 1) Some method for "default superclasses." This would solve the AMP issue.
>
> 2) A method for factoring existing classes into two (or more) parts.
> This would solve the MonadFail problem.
>
> 3) A method for imposing extra superclass constraints on a class. This
> would be needed for full Num compatibility. Seems much less important
> than 1 and 2.
>
> The most thought has gone into 1.
>

> Are these three technical capabilities *all* that we would need? Perhaps
> we also need a way to tie the current language (-XHaskell98,
> -XHaskell2010) to a particular implementation of the Prelude.
>

I don't have a concrete plan here. I'm not even sure one can be achieved
that works. I'd say that the burden of figuring out such a thing falls on
the party that can create a plan, pitch it to the community and potentially
implement it.

If I enumerate a set of conditions here I'm basically implying that I'd
agree to any plan that incorporated them. I'm just not prepared to make
that commitment sight-unseen to something with unknown warts and
implications.

I can, however, say that it is plausible that what you have enumerated
above could potentially address the outstanding issues, but I don't know
how good of a compromise the result would be.

-Edward


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Bardur Arantsson
On 10/22/2015 07:41 PM, Geoffrey Mainland wrote:
> On 10/22/2015 01:29 PM, Edward Kmett wrote:
>> On Thu, Oct 22, 2015 at 12:59 PM, Geoffrey Mainland
>> <mainl...@apeiron.net> wrote:
>>  
>>
>> I am not against changing the Prelude! But it sure would be nice if
>> -XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010 gave me a
>> Haskell 2010 Prelude, both of which could be used with external
>> packages
>> that themselves used the more modern Prelude. 
>>
>>
>> It would definitely be a preferable state of affairs. Unfortunately,
>> at least with the tools available to us today, such a plan is
>> incompatible with any plan that introduces a new superclass. It also
>> cuts off plans that ever factor an existing class into two, such as
>> the MonadFail proposals. We simply do not at this time have the
>> technical capabilities that would support such a system. If they
>> showed up in GHC we can adapt plans to fit.
> 
> Great!
> 
> Could we work to characterize what technical capabilities we would need
> to support full backwards Prelude compatibility?
> 

It's basically the stuff that never materialized in 10 years until
people got fed up with the situation and voted AMP through even though
it would cause (limited) breakage.

> Here is my rough understanding of what we would need:
> 
> 1) Some method for "default superclasses." This would solve the AMP issue.
> 
> 2) A method for factoring existing classes into two (or more) parts.
> This would solve the MonadFail problem.
> 
> 3) A method for imposing extra superclass constraints on a class. This
> would be needed for full Num compatibility. Seems much less important
> than 1 and 2.
> 
> The most thought has gone into 1.
> 
> Are these three technical capabilities *all* that we would need? Perhaps
> we also need a way to tie the current language (-XHaskell98,
> -XHaskell2010) to a particular implementation of the Prelude.
> 

You say "all" as if a) it's easy (hint: it's all highly non-trivial), b)
anybody's actually going to do the work.

Look, we'd all like unicorns and rainbows, but clearly nobody's done the
required work[1], and a lot of people were getting fed up with the
status quo.

Just wishing that this will happen won't make it so, and frankly,
downplaying the difficulty seems like an attempt to veto any change.

Regards,

[1] Understandable, given the highly non-trivial nature of it.



Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Geoffrey Mainland
On 10/22/2015 01:29 PM, Edward Kmett wrote:
> On Thu, Oct 22, 2015 at 12:59 PM, Geoffrey Mainland
> <mainl...@apeiron.net> wrote:
>  
>
> I am not against changing the Prelude! But it sure would be nice if
> -XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010 gave me a
> Haskell 2010 Prelude, both of which could be used with external
> packages
> that themselves used the more modern Prelude. 
>
>
> It would definitely be a preferable state of affairs. Unfortunately,
> at least with the tools available to us today, such a plan is
> incompatible with any plan that introduces a new superclass. It also
> cuts off plans that ever factor an existing class into two, such as
> the MonadFail proposals. We simply do not at this time have the
> technical capabilities that would support such a system. If they
> showed up in GHC we can adapt plans to fit.

Great!

Could we work to characterize what technical capabilities we would need
to support full backwards Prelude compatibility?

Here is my rough understanding of what we would need:

1) Some method for "default superclasses." This would solve the AMP issue.

2) A method for factoring existing classes into two (or more) parts.
This would solve the MonadFail problem.

3) A method for imposing extra superclass constraints on a class. This
would be needed for full Num compatibility. Seems much less important
than 1 and 2.

The most thought has gone into 1.
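
As a rough illustration of what capability 1 would buy (a sketch only; the
Box type below is made up), this is the boilerplate a post-AMP Monad
currently requires, which a default-superclass mechanism could in principle
fill in from the Monad instance alone:

    import Control.Applicative (Applicative(..))  -- needed on GHC < 7.10

    newtype Box a = Box a

    instance Functor Box where
      fmap f (Box a) = Box (f a)

    instance Applicative Box where
      pure = Box
      Box f <*> Box a = Box (f a)

    instance Monad Box where
      return = pure
      Box a >>= f = f a

    -- With some form of "default superclasses", the Functor and Applicative
    -- instances above could be derived from the Monad instance, so Haskell
    -- 2010 code defining only a Monad instance would keep compiling.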

Are these three technical capabilities *all* that we would need? Perhaps
we also need a way to tie the current language (-XHaskell98,
-XHaskell2010) to a particular implementation of the Prelude.

Geoff


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 12:59 PM, Geoffrey Mainland wrote:

>
> > I outlined one possible path to avoid this kind of issue: spend more
> > time thinking about ways to maintain compatibility. We had
> > proposals for
> > doing this with AMP.
> >
> >
> > And on the other hand we also had a concrete proposal that didn't
> > require language changes that was ridiculously popular. People had
> > been talking about Applicative as a superclass of Monad for a decade
> > before we finally acted upon the AMP. People had been talking about
> > superclass defaulting for a decade. When do you cut off discussion and
> > ship the proposal that has overwhelming support? If there is no
> > process that enables this you can stall the process indefinitely by
> > raising objections of this form. Such a situation is not without costs
> > all its own.
> >
>
> I agree. It was certainly within the power of the committee to start a
> clock and say something like "if we don't have a patch to GHC that
> provides backwards compatibility for AMP within 1 year, we will push out
> AMP as-is." Had I understood the implications of AMP at the time, or
> even been aware that AMP was happening (I was actually actively working
> on the GHC code base during that period), that certainly would have been
> motivation for me to do something about it! *That* would be how one
> could cut off discussion and ship a proposal.
>

I freely admit that there is room for improvement in the process. We're all
learning here.

The current Semigroup-Monoid proposal more or less fits the bill you are
looking for here. We have a roadmap today that migrates an existing package
with several years worth of back support into base more or less unmodified,
and then in 3 releases starts requiring instances. You can think of that 3
release clock as precisely what you are looking for here.

If we get an implementation of superclass defaulting or some other
mechanism that can mitigate the extra couple of lines of code that this
proposal will tax users with, within that timeline, we'd gladly incorporate
it into the proposal.
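
To make the "extra couple of lines" concrete (a sketch, assuming the
Semigroup class from the semigroups package, which is what the proposal
folds into base; the Log type is made up):

    import Data.Semigroup (Semigroup(..))

    newtype Log = Log [String]

    instance Semigroup Log where
      Log a <> Log b = Log (a ++ b)

    -- Making mappend delegate to (<>) is the forward-compatible pattern:
    -- it compiles today, and keeps compiling once Semigroup becomes a
    -- superclass of Monoid and mappend defaults to (<>).
    instance Monoid Log where
      mempty  = Log []
      mappend = (<>)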


> I am not against changing the Prelude! But it sure would be nice if
> -XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010 gave me a
> Haskell 2010 Prelude, both of which could be used with external packages
> that themselves used the more modern Prelude.


It would definitely be a preferable state of affairs. Unfortunately, at
least with the tools available to us today, such a plan is incompatible
with any plan that introduces a new superclass. It also cuts off plans that
ever factor an existing class into two, such as the MonadFail proposals.
We simply do not at this time have the technical capabilities that would
support such a system. If they showed up in GHC we can adapt plans to fit.


> Maybe that's impossible.
> Setting a firm deadline to finding a solution to the compatibility issue
> would have been a way to compromise. Ideally, changing the Prelude
> wouldn't require breaking code written to use an older version of the
> Prelude. Yes, attaining that goal would require more work.
>

We looked around for a year for a roadmap that would get us there. None
presented itself. In the end we wound up shedding the core libraries status
of the haskell98 and haskell2010 packages as the 3-4 different ways in
which one could write a Haskell2010 package all have different trade-offs
and can be maintained in user-land.

Examples:

* A hardline version of haskell2010 with a Monad and Num that fully
complies with the report, but which doesn't work with Monad and Num
instances supplied by other libraries. This needs RebindableSyntax, so it
doesn't quite work right. With compiler support for rebinding syntax to a
particular library instead of to whatever is in scope, such a thing might
be suitable for teaching a Haskell class.

* A pragmatic haskell2010 where the Monad has an Applicative superclass and
Num has its current semantics. This works with everything but doesn't
faithfully follow the report.

*  A middle-ground package that tries to use a superclass defaulting
mechanism that we don't have to supply missing Applicative superclasses
might resolve the Applicative-Monad issue in theory, but does nothing for
report compliance of our existing Num.

Each one of these solutions has flaws. Two of them require innovations in
the compiler that we don't have.
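
For what the first, hardline flavour would look like in rough outline (a
sketch only; the module and names below are invented, and this is exactly
the variant that fails to interoperate with Monad instances written against
base):

    {-# LANGUAGE RebindableSyntax #-}
    module Report2010Sketch where

    import Prelude hiding (Monad(..))

    -- A report-style Monad class with no Applicative superclass.
    class Monad m where
      return :: a -> m a
      (>>=)  :: m a -> (a -> m b) -> m b
      (>>)   :: m a -> m b -> m b
      m >> k = m >>= \_ -> k
      fail   :: String -> m a
      fail   = error

    instance Monad Maybe where
      return = Just
      Nothing >>= _ = Nothing
      Just a  >>= f = f a

    safeDiv :: Int -> Int -> Maybe Int
    safeDiv _ 0 = Nothing
    safeDiv a b = Just (a `div` b)

    -- With RebindableSyntax this do-block desugars against the class
    -- above rather than base's Monad.
    halveTwice :: Int -> Maybe Int
    halveTwice n = do
      x <- safeDiv n 2
      y <- safeDiv x 2
      return y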


> Evolving the Prelude and maintaining compatibility are not necessarily
> mutually exclusive options.
>

Agreed, but as you can see above, maintaining compatibility isn't
necessarily always a viable option either.

-Edward


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Geoffrey Mainland

> I outlined one possible path to avoid this kind of issue: spend more
> time thinking about ways to maintain compatibility. We had
> proposals for
> doing this with AMP.
>
>
> And on the other hand we also had a concrete proposal that didn't
> require language changes that was ridiculously popular. People had
> been talking about Applicative as a superclass of Monad for a decade
> before we finally acted upon the AMP. People had been talking about
> superclass defaulting for a decade. When do you cut off discussion and
> ship the proposal that has overwhelming support? If there is no
> process that enables this you can stall the process indefinitely by
> raising objections of this form. Such a situation is not without costs
> all its own.
>

I agree. It was certainly within the power of the committee to start a
clock and say something like "if we don't have a patch to GHC that
provides backwards compatibility for AMP within 1 year, we will push out
AMP as-is." Had I understood the implications of AMP at the time, or
even been aware that AMP was happening (I was actually actively working
on the GHC code base during that period), that certainly would have been
motivation for me to do something about it! *That* would be how one
could cut off discussion and ship a proposal.

I am not against changing the Prelude! But it sure would be nice if
-XHaskell98 gave me a Haskell 98 Prelude and -XHaskell2010 gave me a
Haskell 2010 Prelude, both of which could be used with external packages
that themselves used the more modern Prelude. Maybe that's impossible.
Setting a firm deadline to finding a solution to the compatibility issue
would have been a way to compromise. Ideally, changing the Prelude
wouldn't require breaking code written to use an older version of the
Prelude. Yes, attaining that goal would require more work.

Evolving the Prelude and maintaining compatibility are not necessarily
mutually exclusive options.

Cheers,
Geoff


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 12:20 PM, Mario Blažević wrote:

> On 15-10-22 09:29 AM, Geoffrey Mainland wrote:
>
>> ...
>>
>> 1) What is the master plan, and where is it documented, even if this
>> document is not up to the standard of a proposal? What is the final
>> target, and when might we expect it to be reached? What is in the
>> pipeline after MRP?
>>
>> Relatedly, guidance on how to write code now so that it will be
>> compatible with future changes helps mitigate the stability issue.
>>
>
> I have been fully in favour of all the proposals implemented so
> far, and I think that having an explicit master plan would be a great idea.
> It would address some of the process-related objections that have been
> raised, and it would provide a fixed long-term target that would be much
> easier to make the whole community aware of and contribute to.
>
> For that purpose, the master plan should be advertised directly on
> the front page of haskell.org. Once we have it settled and agreed, the
> purpose of the base-library committee would essentially become to figure out
> the details like the timeline and code migration path. One thing they
> wouldn't need to worry about is whether anybody disagrees with their goals.
>
>
> 2) How can I write code that makes use of the Prelude so that it will
>> work with every new GHC release over the next 3 years? 5 years? For
>> example, how can I write a Monad instance now, knowing the changes that
>> are coming, so that the instance will work with every new GHC release
>> for the next 3 years? 5 years? If the answer is "you can't," then when
>> might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to
>> say I don't know the answer!
>>
>
> From the discussions so far it appears that the answer for 3 years
> (or at least the next 3 GHC releases) would be to write the code that works
> with the current GHC and base, but this policy has not been codified
> anywhere yet. Knowing the upcoming changes doesn't help with making your
> code any more robust, and I think that's a shame. We could have a
> two-pronged policy:
>
> - code that works and compiles with the latest GHC with no *warnings* will
> continue to work and compile with no *errors* with the following 2
> releases, and
> - code that also follows the forward-compatibility recommendations current
> for that version of GHC will continue to work and compile with no *errors*
> with the following 4 releases.
>

We have adopted a "3 release policy" facing backwards, not forwards.
However, all proposals currently under discussion actually meet a stronger
condition, a 3 release policy that you can slide both forward and backwards
to pick the 3 releases you want to be compatible with without using CPP. It
also appears that all of the changes that we happen to have in the wings

https://ghc.haskell.org/trac/ghc/wiki/Status/BaseLibrary

comply with both of your goals here. However, I hesitate to say that we can
simultaneously meet this goal and the 3 release policy facing backwards
_and_ sufficient notification in all situations even ones we can't foresee
today. As a guideline? Sure. If we have two plans that can reach the same
end-goal and one complies and the other doesn't, I'd say we should favor
the plan that gives more notice and assurance. However, this also needs to
be tempered against the number of years folks suffer the pain of being in
an inconsistent intermediate state (e.g. having generalized combinators in
Data.List today).

> The forward-compatibility recommendations would become a part of
> the online GHC documentation so nobody complains they didn't know about
> them. Personally, I'd prefer if the recommendations were built into the
> compiler itself as a new class of warnings, but then (a) some people would
> insist on turning them on together with -Werror and then complain when
> their builds break and (b) this would increase the pressure on GHC
> implementors.


The current discussion is centering around adding a -Wcompat flag that
warns of changes that you maybe can't yet implement in a way that would be
backwards compatible with a 3 release backwards-facing window, but which
will eventually cause issues.
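
Assuming the flag lands under that name, opting in would be as simple as:

    {-# OPTIONS_GHC -Wcompat #-}   -- or ghc-options: -Wcompat in the .cabal file
    module Example where

    -- -Wcompat is off by default, so it can warn about upcoming changes
    -- (Semigroup, MonadFail, and friends) without breaking -Werror builds
    -- that only enable -Wall.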

-Edward


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 11:36 AM, Geoffrey Mainland wrote:

> On 10/22/2015 11:02 AM, Matthias Hörmann wrote:
> > I would say that the need to import Control.Applicative in virtually
> > every module manually
> > definitely caused some pain before AMP.
>
> In this particular case, there is a trade off between breaking code on
> the one hand and having to write some import statements on the other. I
> find writing some extra imports less painful than breaking (other
> people's and my) code, but the other position is defensible as well. I
> sense that I am in the minority, at least on the libraries list.
>
> > I would also argue that a
> > non-negligible amount
> > of effort goes into teaching the warts, the reasons for the warts and
> > how to work around them.
>
> Which wart(s) in particular? All of them? Does having return (and (>>))
> in Monad make teaching more difficult?
>

Having (>>) means that we have hundreds of monads out there where (>>) has
been optimized, but (*>) has not.

If I were working alone, AMP wouldn't be a huge deal. I could fix the
> code for 7.10 compatibility, but then unless everyone switches to 7.10,
> changes to the codebase made by someone using 7.8, e.g., defining a new
> Monad instance, could break things on 7.10 again. It's easier to stick
> with 7.8. Any time spent dealing with compatibility issues is time not
> spent writing actual code.
>

In the open source world many of us just fire off our code to travis-ci and
get it to build with a dozen different compiler versions. I maintain a lot
of code that supports things back to 7.0 and forward to HEAD this way.


> I outlined one possible path to avoid this kind of issue: spend more
> time thinking about ways to maintain compatibility. We had proposals for
> doing this with AMP.
>

And on the other hand we also had a concrete proposal that didn't require
language changes that was ridiculously popular. People had been talking
about Applicative as a superclass of Monad for a decade before we finally
acted upon the AMP. People had been talking about superclass defaulting for
a decade. When do you cut off discussion and ship the proposal that has
overwhelming support? If there is no process that enables this you can
stall the process indefinitely by raising objections of this form. Such a
situation is not without costs all its own.

-Edward


> Cheers,
> Geoff
>
> >
> > On Thu, Oct 22, 2015 at 3:29 PM, Geoffrey Mainland wrote:
> >> On 10/22/2015 02:40 AM, Edward Kmett wrote:
> >>> On Wed, Oct 21, 2015 at 8:42 PM, Gregory Collins
> >>> <g...@gregorycollins.net> wrote:
> >>>
> >>>
> >>> On Wed, Oct 21, 2015 at 3:18 PM, Geoffrey Mainland
> >>> <mainl...@apeiron.net> wrote:
> >>>
> >>> My original email stated my underlying concern: we are losing
> >>> valuable
> >>> members of the community not because of the technical
> >>> decisions that are
> >>> being made, but because of the process by which they are being
> >>> made.
> >>>
> >>> [If] you're doing research you're on the treadmill, almost by
> >>> definition, and you're delighted that we're finally making some
> >>> rapid progress on fixing up some of the longstanding warts.
> >>>
> >>> If you're a practitioner, you are interested in using Haskell for,
> >>> y'know, writing programs. You're probably in one of two camps:
> >>> you're in "green field" mode writing a lot of new code (early
> >>> stage startups, prototype work, etc), or you're
> >>> maintaining/extending programs you've already written that are out
> >>> "in the field" for you doing useful work. Laura Wingerd calls this
> >>> the "annealing temperature" of software, and I think this is a
> >>> nice metaphor to describe it. How tolerant you are of ecosystem
> >>> churn depends on what your temperature is: and I think it should
> >>> be obvious to everyone that Haskell having "success" for
> >>> programming work would mean that lots of useful and correct
> >>> programs get written, so everyone who is in the former camp will
> >>> cool over time to join the latter.
> >>>
> >>>
> >>> I've made the point before and I don't really want to belabor it:
> >>> our de facto collective posture towards breaking stuff, especially
> >>> in the past few years, has been extremely permissive, and this
> >>> alienates people who are maintaining working programs.
> >>>
> >>>
> >>> Even among people who purported to be teaching Haskell or using
> >>> Haskell today in industry the margin of preference for the concrete
> >>> FTP proposal was ~79%. This was considerably higher than I expected in
> >>> two senses. One: there were a lot more people who claimed to be in one
> >>> of those two roles than I expected by far, and two: their appetite for
> >>> change was higher than I expected. I initially expected to see a
> >>> stronger "academic vs. industry" split in the poll, but the groups

Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Edward Kmett
On Thu, Oct 22, 2015 at 9:29 AM, Geoffrey Mainland wrote:

> Thanks to you and Dan [1], I now have a greater understanding and
> appreciation for where the committee has been coming from. My new
> understanding is that the changes that were formalized in AMP, FTP, and
> MRP were the basis for the committee's creation. It also seems that
> there are more changes in the pipeline that have not yet been made into
> proposals, e.g., pulling (>>) out of Control.Monad [2]. Part of
> "stability" is signaling change as far ahead as possible. The committee
> has put a lot of effort into this, which I appreciate! However, as each
> of these proposals has come down the pipeline, I never realized that they
> were part of a larger master plan.
>

The "master plan" where (>>) is concerned is that it'd be nice to get
Traversable down to a minimal state and to eliminate unnecessary
distinctions in the Prelude between things like mapM and traverse. Right
now they have different type constraints, but this is entirely a historical
artifact. But it causes problems, we have a situation where folks have
commonly optimized (>>) but left (*>) unfixed. This yields different
performance for mapM_ and traverse_. A consequence of the AMP is that the
neither one of those could be defined in terms of the other (*>) has a
default definition in terms of (<*>). (>>) has a default definition in
terms of (>>=). With two places where optimizations can happen and two
different definitions for operations that are logically required to be the
same thing we can and do see rather radically different performance between
these two things.
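
A toy example of the failure mode (the Count type is contrived purely for
illustration and assumes GHC 7.10's Prelude):

    import Data.Foldable (traverse_)

    -- A tiny counting monad.
    newtype Count a = Count (Int, a)

    instance Functor Count where
      fmap f (Count (n, a)) = Count (n, f a)

    instance Applicative Count where
      pure a = Count (0, a)
      Count (m, f) <*> Count (n, a) = Count (m + n, f a)
      -- (*>) is left at its default, (id <$ a1) <*> a2, built from (<*>).

    instance Monad Count where
      return = pure
      Count (m, a) >>= f = let Count (n, b) = f a in Count (m + n, b)
      -- A hand-written (>>) that skips the closure the default
      -- (m >>= \_ -> k) would build.
      Count (m, _) >> Count (n, b) = Count (m + n, b)

    -- mapM_ is built on (>>) and picks up the hand optimization above;
    -- traverse_ is built on (*>) and goes through the slower default,
    -- even though the two are logically the same operation.
    viaMonad, viaApplicative :: [Int] -> Count ()
    viaMonad       = mapM_     (\n -> Count (n, ()))
    viaApplicative = traverse_ (\n -> Count (n, ()))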

This proposal is something that was put out as a sort of addendum to the
Monad of No Return proposal for discussion, but unlike MRP has no
particular impact on a sacred cow like return. We have yet to put together
a timeline that incorporates the (>>) changes from MRP.

> 1) What is the master plan, and where is it documented, even if this
> document is not up to the standard of a proposal? What is the final
> target, and when might we expect it to be reached? What is in the
> pipeline after MRP?
>
> Relatedly, guidance on how to write code now so that it will be
> compatible with future changes helps mitigate the stability issue.
>

The current plans more or less stop with finishing the MonadFail proposal,
getting Semigroup in as a superclass of Monoid, and incorporating some
additional members into Floating. The working document for the timeline
going forward is available here:

https://ghc.haskell.org/trac/ghc/wiki/Status/BaseLibrary

>
> 2) How can I write code that makes use of the Prelude so that it will
> work with every new GHC release over the next 3 years? 5 years? For
> example, how can I write a Monad instance now, knowing the changes that
> are coming, so that the instance will work with every new GHC release
> for the next 3 years? 5 years? If the answer is "you can't," then when
> might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to
> say I don't know the answer!
>

We have a backwards facing "3 release policy" that says it should always be
possible to write code that works backwards for 3 releases. This means that
changes like moving fail out of Monad will take 5 years. However,
maintaining both that and a _forward facing_ 3 release policy would mean
that any change that introduced a superclass would take something like 9
years of intermediate states that make no sense to complete. *9 years to
move one method.*

Now, looking forward: you can write code today with 7.10 that will work
without warnings until 8.2. That happens to be 3 releases. In 8.4 you'll
start to get warnings about Semigroup and MonadFail changes, but looking at
it as 3 releases going forward, in 8.0 you can just write the instances and
your code will be warning-free for the following 3 releases. In 8.6 those
changes go into effect, but you will have been able to make the code
changes you need to accommodate 8.6 since 8.0.

The current roadmap happens to give you a 3 year sliding window.
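
On the MonadFail side, the part of the change that reaches into user code
is do-blocks with failable patterns, and those can already be written today
in a form that is indifferent to where fail ends up living (a small sketch):

    -- Relies on the Monad's fail via the failable pattern; this is the
    -- kind of code the MonadFail split will eventually warn about.
    viaFail :: [Maybe Int] -> [Int]
    viaFail mxs = do
      Just x <- mxs
      return x

    -- The same function with the pattern match made explicit; it never
    -- touches fail, so it is unaffected by wherever fail ends up.
    viaCase :: [Maybe Int] -> [Int]
    viaCase mxs = do
      mx <- mxs
      case mx of
        Just x  -> return x
        Nothing -> []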

> Finally, if none of these changes broke Prelude backwards compatibility,
> far fewer people would be complaining :)


If none of our changes were ever able to break Prelude backwards
compatibility, the same people who have been complaining about the utter
lack of progress for the previous 17 years and that nearly exploded the
community 2 years ago would be complaining, and based on polling and
discussions that is actually a much larger group. The AMP passed nearly
unanimously.


> Of course, we can't always make
> progress without breaking things, but a more deliberative process might
> offer an opportunity to make progress while still preserving backwards
> compatibility. Take AMP for example. There were at least two [3] [4]
> proposals for preserving backwards compatibility. Investigating them
> would have taken time and delayed AMP, yes, but why the rush?
>

We've been talking about various superclass defaulting proposals for the
b

Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Mario Blažević

On 15-10-22 09:29 AM, Geoffrey Mainland wrote:

...

1) What is the master plan, and where is it documented, even if this
document is not up to the standard of a proposal? What is the final
target, and when might we expect it to be reached? What is in the
pipeline after MRP?

Relatedly, guidance on how to write code now so that it will be
compatible with future changes helps mitigate the stability issue.


	I have been fully in favour of all the proposals implemented so far, 
and I think that having an explicit master plan would be a great idea. 
It would address some of the process-related objections that have been 
raised, and it would provide a fixed long-term target that would be much 
easier to make the whole community aware of and contribute to.


	For that purpose, the master plan should be advertised directly on the 
front page of haskell.org. Once we have it settled and agreed, the 
purpose of the base-library committee would essentially become to figure
out the details like the timeline and code migration path. One thing 
they wouldn't need to worry about is whether anybody disagrees with 
their goals.




2) How can I write code that makes use of the Prelude so that it will
work with every new GHC release over the next 3 years? 5 years? For
example, how can I write a Monad instance now, knowing the changes that
are coming, so that the instance will work with every new GHC release
for the next 3 years? 5 years? If the answer is "you can't," then when
might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to
say I don't know the answer!


	From the discussions so far it appears that the answer for 3 years (or 
at least the next 3 GHC releases) would be to write the code that works 
with the current GHC and base, but this policy has not been codified 
anywhere yet. Knowing the upcoming changes doesn't help with making your 
code any more robust, and I think that's a shame. We could have a 
two-pronged policy:


- code that works and compiles with the latest GHC with no *warnings* 
will continue to work and compile with no *errors* with the following 2 
releases, and
- code that also follows the forward-compatibility recommendations 
current for that version of GHC will continue to work and compile with 
no *errors* with the following 4 releases.


	The forward-compatibility recommendations would become a part of the 
online GHC documentation so nobody complains they didn't know about 
them. Personally, I'd prefer if the recommendations were built into the 
compiler itself as a new class of warnings, but then (a) some people 
would insist on turning them on together with -Werror and then complain 
when their builds break and (b) this would increase the pressure on GHC 
implementors.




Finally, if none of these changes broke Prelude backwards compatibility,
far fewer people would be complaining :) Of course, we can't always make
progress without breaking things, but a more deliberative process might
offer an opportunity to make progress while still preserving backwards
compatibility. Take AMP for example. There were at least two [3] [4]
proposals for preserving backwards compatibility. Investigating them
would have taken time and delayed AMP, yes, but why the rush?


Because they have been investigated for years with no effect.



3) Can we have a process that allows more deliberation over, and wider
publicity for, changes that break backwards compatibility? The goal of
such a process would not be to prevent change, but to allow more time to
find possible solution to the issue of backwards compatibility.


I doubt we can, but this question has already been answered by others.



Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Geoffrey Mainland
On 10/22/2015 11:02 AM, Matthias Hörmann wrote:
> I would say that the need to import Control.Applicative in virtually
> every module manually
> definitely caused some pain before AMP.

In this particular case, there is a trade off between breaking code on
the one hand and having to write some import statements on the other. I
find writing some extra imports less painful than breaking (other
people's and my) code, but the other position is defensible as well. I
sense that I am in the minority, at least on the libraries list.
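
For reference, the import in question was the line at the top of more or
less every applicative-style module pre-AMP (a small sketch; Point and
parsePoint are made up):

    -- GHC < 7.10: (<$>) and (<*>) are not in the Prelude, so applicative
    -- style needs this import in every module that uses it.
    import Control.Applicative ((<$>), (<*>))

    data Point = Point Int Int
      deriving Show

    parsePoint :: Maybe Int -> Maybe Int -> Maybe Point
    parsePoint mx my = Point <$> mx <*> my

    -- On GHC 7.10+ the import is redundant (AMP moved these into the
    -- Prelude), but the code compiles either way.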

> I would also argue that a
> non-negligible amount
> of effort goes into teaching the warts, the reasons for the warts and
> how to work around them.

Which wart(s) in particular? All of them? Does having return (and (>>))
in Monad make teaching more difficult?

I teach Haskell beginners, and I found that AMP made explaining monads
slightly more difficult because it served as a source of confusion for
my students.

On the other hand, the warts provide a teachable moment once students
understand all this stuff :)

>> Dealing with AMP? I'm working on a collaborative research project that is 
>> stuck on 7.8 because of AMP.
> I am curious what exactly about AMP causes your research project to be
> "stuck" on GHC 7.8
> considering we have had multiple people mention how little effort it
> took to update even large codebases.
> I think it would be useful information to have to plan future changes
> in a way that might avoid
> your issues.

I was hoping that mentioning this wouldn't distract from the three main
(numbered) questions I posed below. Alas.

If I were working alone, AMP wouldn't be a huge deal. I could fix the
code for 7.10 compatibility, but then unless everyone switches to 7.10,
changes to the codebase made by someone using 7.8, e.g., defining a new
Monad instance, could break things on 7.10 again. It's easier to stick
with 7.8. Any time spent dealing with compatibility issues is time not
spent writing actual code.

I outlined one possible path to avoid this kind of issue: spend more
time thinking about ways to maintain compatibility. We had proposals for
doing this with AMP.

Cheers,
Geoff

>
>> On Thu, Oct 22, 2015 at 3:29 PM, Geoffrey Mainland wrote:
>> On 10/22/2015 02:40 AM, Edward Kmett wrote:
>>> On Wed, Oct 21, 2015 at 8:42 PM, Gregory Collins
>>> <g...@gregorycollins.net> wrote:
>>>
>>>
>>> On Wed, Oct 21, 2015 at 3:18 PM, Geoffrey Mainland
>>> <mainl...@apeiron.net> wrote:
>>>
>>> My original email stated my underlying concern: we are losing
>>> valuable
>>> members of the community not because of the technical
>>> decisions that are
>>> being made, but because of the process by which they are being
>>> made.
>>>
>>> [If] you're doing research you're on the treadmill, almost by
>>> definition, and you're delighted that we're finally making some
>>> rapid progress on fixing up some of the longstanding warts.
>>>
>>> If you're a practitioner, you are interested in using Haskell for,
>>> y'know, writing programs. You're probably in one of two camps:
>>> you're in "green field" mode writing a lot of new code (early
>>> stage startups, prototype work, etc), or you're
>>> maintaining/extending programs you've already written that are out
>>> "in the field" for you doing useful work. Laura Wingerd calls this
>>> the "annealing temperature" of software, and I think this is a
>>> nice metaphor to describe it. How tolerant you are of ecosystem
>>> churn depends on what your temperature is: and I think it should
>>> be obvious to everyone that Haskell having "success" for
>>> programming work would mean that lots of useful and correct
>>> programs get written, so everyone who is in the former camp will
>>> cool over time to join the latter.
>>>
>>>
>>> I've made the point before and I don't really want to belabor it:
>>> our de facto collective posture towards breaking stuff, especially
>>> in the past few years, has been extremely permissive, and this
>>> alienates people who are maintaining working programs.
>>>
>>>
>>> Even among people who purported to be teaching Haskell or using
>>> Haskell today in industry the margin of preference for the concrete
>>> FTP proposal was ~79%. This was considerably higher than I expected in
>>> two senses. One: there were a lot more people who claimed to be in one
>>> of those two roles than I expected by far, and two: their appetite for
>>> change was higher than I expected. I initially expected to see a
>>> stronger "academic vs. industry" split in the poll, but the groups
>>> were only distinguishable by a few percentage point delta, so while I
>>> expected roughly the end percentage of the poll, based on the year
>>> prior I'd spent running around the planet to user group meetings and
>>> the like, I expected it mostly because I expected more hobbyists and
>>> less support among indus

Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Geoffrey Mainland
On 10/22/2015 02:40 AM, Edward Kmett wrote:
> On Wed, Oct 21, 2015 at 8:42 PM, Gregory Collins
> <g...@gregorycollins.net> wrote:
>
>
> On Wed, Oct 21, 2015 at 3:18 PM, Geoffrey Mainland
> <mainl...@apeiron.net> wrote:
>
> My original email stated my underlying concern: we are losing
> valuable
> members of the community not because of the technical
> decisions that are
> being made, but because of the process by which they are being
> made.
>
> [If] you're doing research you're on the treadmill, almost by
> definition, and you're delighted that we're finally making some
> rapid progress on fixing up some of the longstanding warts.
>
> If you're a practitioner, you are interested in using Haskell for,
> y'know, writing programs. You're probably in one of two camps:
> you're in "green field" mode writing a lot of new code (early
> stage startups, prototype work, etc), or you're
> maintaining/extending programs you've already written that are out
> "in the field" for you doing useful work. Laura Wingerd calls this
> the "annealing temperature" of software, and I think this is a
> nice metaphor to describe it. How tolerant you are of ecosystem
> churn depends on what your temperature is: and I think it should
> be obvious to everyone that Haskell having "success" for
> programming work would mean that lots of useful and correct
> programs get written, so everyone who is in the former camp will
> cool over time to join the latter.
>
>
> I've made the point before and I don't really want to belabor it:
> our de facto collective posture towards breaking stuff, especially
> in the past few years, has been extremely permissive, and this
> alienates people who are maintaining working programs.
>
>
> Even among people who purported to be teaching Haskell or using
> Haskell today in industry the margin of preference for the concrete
> FTP proposal was ~79%. This was considerably higher than I expected in
> two senses. One: there were a lot more people who claimed to be in one
> of those two roles than I expected by far, and two: their appetite for
> change was higher than I expected. I initially expected to see a
> stronger "academic vs. industry" split in the poll, but the groups
> were only distinguishable by a few percentage point delta, so while I
> expected roughly the end percentage of the poll, based on the year
> prior I'd spent running around the planet to user group meetings and
> the like, I expected it mostly because I expected more hobbyists and
> less support among industrialists.
>
> I'm actually firmly of the belief that the existing committee
> doesn't really have process issues, and in fact, that often it's
> been pretty careful to minimize the impact of the changes it wants
> to make. As others have pointed out, lots of the churn actually
> comes from platform libraries, which are out of the purview of
> this group.
>
>
> Historically we've had a bit of a split personality on this front.
> Nothing that touches the Prelude had changed in 17 years. On the other
> hand the platform libraries had maintained a pretty heavy rolling wave
> of breakage the entire time I've been around in the community. On a
> more experimental feature front, I've lost count of the number of
> different things we've done to Typeable or template-haskell.
>  
>
> All I'm saying is that if we want to appeal to or cater to working
> software engineers, we have to be a lot less cavalier about
> causing more work for them, and we need to prize stability of the
> core infrastructure more highly. That'd be a broader cultural
> change, and that goes beyond process: it's policy.
>
>
> The way things are shaping up, we've had 17 years of rock solid
> stability, 1 release that incorporated changes that were designed to
> minimize impact, to the point that the majority of the objections
> against them are of the form where people would prefer that we broke
> _more_ code, to get a more sensible state. Going forward, it looks
> like the next 2 GHC releases will have basically nothing affecting the
> Prelude, and there will be another punctuation in the equilibrium
> around 8.4 as the next set of changes kicks in over 8.4 and 8.6. That
> gives 2 years' worth of advance notice of pending changes, and a pretty
> strong guarantee from the committee that you should be able to
> maintain code with a 3 release window without running afoul of
> warnings or needing CPP.
>
> So, out of curiosity, what additional stability policy is it that you
> seek?

Thanks to you and Dan [1], I now have a greater understanding and
appreciation for where the committee has been coming from. My new
understanding is that the changes that were formalized in AMP, FTP, and
MRP were the basis for the committee's creation. It also seems that
there are more changes in the pipeline th

Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Taru Karttunen
On 22.10 09:04, Herbert Valerio Riedel wrote:
> Fyi, Alan is currently working on leveraging HaRe[1] in
> 
>  https://github.com/alanz/Hs2010To201x (the `parsing-only` branch)
> 
> and it's already showing great promise. However, tools like this will
> only be able to handle the no-brainer cases, as in general it's an
> NP-hard problem. But luckily, those boring mechanical refactorings usually
> represent the vast majority, and that's the tedious work we want
> tooling to assist us most with.

Yes, getting it 99% there as an automated tool would be enough
for most cases.


> > C) Hackage displays vocally what works with which versions of
> > GHC (Status reports do help somewhat)
> 
> 
> I.e. something like
> 
>   http://matrix.hackage.haskell.org/package/text

Yes! Is there a reason that it is not displayed on
http://hackage.haskell.org/package/text which only
displays a link to Status of a 7.8.3 build?

What percentage of Hackage is built with matrix.h.h.o,
and is there a plan to integrate it into the Hackage pages?

- Taru Karttunen


Re: Breaking Changes and Long Term Support Haskell

2015-10-22 Thread Herbert Valerio Riedel
On 2015-10-22 at 08:04:10 +0200, Taru Karttunen wrote:

[...]

> B) There is an automated tool that can be used to fix most code
> to compile with new versions of GHC without warnings or CPP.

Fyi, Alan is currently working on leveraging HaRe[1] in

 https://github.com/alanz/Hs2010To201x (the `parsing-only` branch)

and it's already showing great promise. However, tools like this will
only be able to handle the no-brainer cases, as in general it's an
NP-hard problem. But luckily, those boring mechanical refactorings usually
represent the vast majority, and that's the tedious work we want
tooling to assist us most with.


> C) Hackage displays vocally what works with which versions of
> GHC (Status reports do help somewhat)


I.e. something like

  http://matrix.hackage.haskell.org/package/text

? :-)


 [1]: Btw, here's a recent talk which also mentions the use-case of using
  HaRe to update between Haskell Report revisions or `base` versions:
 
  
https://skillsmatter.com/skillscasts/6539-a-new-foundation-for-refactoring-ghc-exactprint