(sorry for the late answer; I have been traveling and at an event since Thursday)
On 12/11/2012 02:46, Leo Meyerovich wrote:
I wasn't aware of this, then read through about a dozen WebAPIs [2] between
yesterday and today and... discovered it's the case. In my opinion, one of the
most absurd examples is the DOMRequest interface, which looks like:
{
readonly attribute DOMString readyState; // "processing" or "done"
readonly attribute DOMError? error;
attribute EventListener onsuccess;
attribute EventListener onerror;
readonly attribute any? result;
};
Read it carefully and you'll realize this is actually a promise... but it has
the absurd property of carrying both an error and a result field while only
one of them is actually filled at any given point.
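To make the comparison concrete, here is a sketch of consuming both shapes. The DOMRequest producer below is a hypothetical stand-in (real DOMRequests are created by the platform, not by script), and the promise version uses the `Promise` API as it was later standardized; in 2012 a library such as Q provides the equivalent.

```javascript
// Hypothetical stand-in for a platform API returning a DOMRequest-like
// object: both `result` and `error` always exist, but only one is
// ever meaningful.
function getSomething() {
  const request = { readyState: "processing", result: null, error: null,
                    onsuccess: null, onerror: null };
  setTimeout(function () {
    request.readyState = "done";
    request.result = 42;                 // `error` stays null, dead weight
    if (request.onsuccess) request.onsuccess();
  }, 0);
  return request;
}

const request = getSomething();
request.onsuccess = function () {
  console.log(request.result);           // 42
};
request.onerror = function () {
  console.log(request.error);
};

// The same operation as a promise: success and failure are separate
// continuations, each carrying exactly one value.
function getSomethingAsPromise() {
  return Promise.resolve(42);
}
getSomethingAsPromise().then(
  function (result) { console.log(result); },   // 42
  function (error)  { console.log(error); }
);
```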
Also, these APIs and JavaScript as they currently stand don't support promise
chainability, or the excellent error forwarding that comes with it,
off-the-shelf. And the lack of a built-in Q.all really doesn't promote good
code when it comes to event synchronization.
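For readers unfamiliar with these two features, here is what they look like, sketched with the `Promise`/`Promise.all` API as later standardized (Q's `then` chaining and `Q.all` are the 2012-era library equivalents):

```javascript
// Chaining: each `then` transforms the previous result. A throw or
// rejection anywhere skips the remaining success handlers and is
// forwarded straight to the nearest error handler.
Promise.resolve(2)
  .then(function (n) { return n * 3; })
  .then(function (n) { throw new Error("boom"); })
  .then(function (n) { return n + 1; })              // skipped
  .catch(function (err) { console.log(err.message); }); // "boom"

// Synchronization: Promise.all (Q.all in the library) waits for
// several async results at once and fails fast if any of them fails.
Promise.all([Promise.resolve(1), Promise.resolve(2)])
  .then(function (values) { console.log(values); });   // [ 1, 2 ]
```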
Oh yes, of course, you can always build a promise library on top of the current
APIs, blablabla... and waste battery with these libraries [3].
Reading the thread again, is this really the primary motivation for adding promises? I
don't see how the energy issue gets noticeably addressed, and so it sounds like, indeed,
"you can always build a promise library on top of the current APIs."
Interestingly enough, if we had built-in promises, it would still be
possible to add the current DOMRequest as a library if people wanted to.
The underlying question is how to choose which should be the fundamental
building block and which should be the library.
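As a sketch of that claim (assuming built-in promises; `domRequestFrom` is a hypothetical helper name, not an existing API), the DOMRequest shape can be rebuilt as a thin layer over a promise:

```javascript
// Hypothetical helper: wrap a promise in a DOMRequest-shaped object.
function domRequestFrom(promise) {
  const request = {
    readyState: "processing",
    result: null,
    error: null,
    onsuccess: null,
    onerror: null
  };
  promise.then(
    function (value) {
      request.readyState = "done";
      request.result = value;
      if (request.onsuccess) request.onsuccess();
    },
    function (err) {
      request.readyState = "done";
      request.error = err;
      if (request.onerror) request.onerror();
    }
  );
  return request;
}

// Usage: the DOMRequest shape, recovered from a promise.
const req = domRequestFrom(Promise.resolve("hello"));
req.onsuccess = function () { console.log(req.result); }; // "hello"
```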
Historically, the choice was made following this schema:
* Some browser ships a feature with a given API
* Other browsers are forced to implement the same thing because, soon
enough, there is content relying on the given API.
Depending on the feature, there is a variable amount of discussion among
browsers and standards bodies before/between/after these two steps.
We know where this has led people writing JS applications: they write
libraries to get other APIs. It wastes everyone's bandwidth (has a
study been done of how much bandwidth is used per day just by jQuery
downloads across the planet? It'd be fun to know) and has a runtime cost
as well.
Since 2010 or so, JS devs have been fighting back and improving APIs at
the standards level where there is a worthwhile opportunity (I'm thinking
of DOM4's Element.prototype.remove, for instance).
TC39 is also eager to get more and more feedback from developers before
freezing the spec.
Now, JS devs (well... along with academia and a lot of other actors) didn't
wait to find solutions to improve the async programming story. Promises
are one outcome. As Alex pointed out in his response, promises have been
adopted in a good share of client-side JS libraries and seem to have
good support from the JS dev community, which somewhat proves that promises
are the favored API for dealing with async programming.
I feel there is a small window of opportunity to get promises as a
standard feature, so that's my motivation for talking about it.
There may be real expressive value to a serious promise proposal, however. It
can enable identifying asynchronous APIs and then 1) allowing frameworks to
introspect on them and 2) clarifying the succeed/fail protocol. Doing so with
standard promises seems awkward and insufficient, (...)
I don't understand the value of identifying async APIs at runtime.
Wrapping a predefined set of functions seems like a tedious but
workable solution. Not having to wrap, because the API is already
convenient, would be my favorite solution :-)
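For what it's worth, the "tedious but workable" wrapping amounts to one small adapter per DOMRequest-returning function. A sketch (the helper name is hypothetical, and the promise is the standardized one rather than what was available in 2012):

```javascript
// Hypothetical adapter: turn any DOMRequest-shaped object into a
// promise by translating its onsuccess/onerror events.
function promiseFromDOMRequest(request) {
  return new Promise(function (resolve, reject) {
    request.onsuccess = function () { resolve(request.result); };
    request.onerror = function () { reject(request.error); };
  });
}
```

The tedium is that this adapter has to be applied, by hand or via a wrapper library, to every single request-returning entry point of every WebAPI.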
Hopefully there is more discussion going on in backchannels here..
I don't understand this part. The definition I have read of
"backchannel" didn't really help. (I'm not a native English speaker)
David
_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss