Re: Heads-up: SHA1 deprecation (for newly issued certs) causes trouble with local ssl-proxy mitm spyware
On 04-01-2016 at 19:45, Daniel Holbert wrote: On 01/04/2016 10:33 AM, Josh Matthews wrote: Wouldn't the SSL cert failures also prevent submitting the telemetry payload to Mozilla's servers? Hmm... actually, I'll bet the cert errors will prevent Firefox updates, for that matter! (I'm assuming the update check is performed over HTTPS.) If I remember correctly, update checks are pinned to a specific CA, so updates for users whose software MITMs AUS would already be broken? ___ dev-platform mailing list dev-platform@lists.mozilla.org https://lists.mozilla.org/listinfo/dev-platform
Re: Maintaining the en-US dictionary that ships with Mozilla products
On 03-01-2016 at 13:05, Jörg Knobloch wrote: On 2/01/2016 12:37, Pascal Chevrel wrote: On 02/01/2016 12:07, Jörg Knobloch wrote: It is very unfortunate that this add-on maintained by "jooliaan" is so badly out of date. I don't know how to contact the author. I suggest that he synchronise the add-on with the Mozilla-maintained en-US dictionary once this has been improved, see below. AFAIK, jooliaan (Giuliano Masseroni) is no longer contributing to the Mozilla project; he was part of the Italian volunteer community. Pascal I believe that a "rich", up-to-date US English dictionary should be provided to Mozilla users. I have therefore published the current large SCOWL dictionary as an add-on: https://addons.mozilla.org/en-US/firefox/addon/us-english-dictionary/ I intend to refresh this add-on as new SCOWL versions become available. Jorg K. Creating a second add-on with a different extension ID will not fix things, only make them worse. Now users have two en-US dictionaries to choose from, with no indication of which one is better. All existing users are stranded on the old version. And from the description of your new add-on, it seems it is not identical to the one shipped with en-US Firefox, so users of localized Firefox still don't have that dictionary available. Mozilla should officially maintain the en-US dictionary on https://addons.mozilla.org/en-US/firefox/language-tools/ , just as Mozilla officially maintains the language packs.
Re: Maintaining the en-US dictionary that ships with Mozilla products
On 03-01-2016 at 17:02, Jörg Knobloch wrote: As the only dictionary maintained by Mozilla, Mozilla's en-US dictionary is a special case. I don't think it is that special. Some Firefox locales other than en-US ship with built-in dictionaries. For those, the add-on could be derived from the source of the Firefox locale. I maintain the Danish dictionary add-on on AMO. Whenever upstream releases a new version, I commit it to the Firefox localization source, and from there I have a script to generate an identical add-on for AMO: http://hg.mozilla.org/releases/l10n/mozilla-aurora/da/file/tip/extensions/spellcheck/hunspell/extension.sh
Re: Maintaining the en-US dictionary that ships with Mozilla products
On 28-12-2015 at 20:31, Jörg Knobloch wrote: Thirdly, the add-on dictionary contains 13% more words than the Mozilla-maintained dictionary. While bigger may not be better, I don't see why Mozilla should offer an en-US dictionary for localized Firefox builds that is different from the en-US dictionary for en-US builds. The en-US dictionary for localized Firefox was last updated in March 2013 according to https://addons.mozilla.org/da/firefox/addon/united-states-english-spellche/versions/ , but the en-US dictionary for en-US Firefox has been updated 61 times since then according to https://hg.mozilla.org/mozilla-central/log/tip/extensions/spellcheck/locales/en-US/hunspell/en-US.dic
Re: Intent to implement: File system provider API
On 28-07-2015 at 02:05, Jonas Sicking wrote: I don't think registerProtocolHandler has nearly enough API surface to support what's needed here. But if you mean copying the idea that there's a function call which displays a UI prompt which allows the user to accept/deny, then that could work. I don't think anyone should try to copy the registration UX of registerProtocolHandler (or Web Intents or Web Activities). None of these technologies have gained any popularity, and I suspect that may be because the registration UX is bad: it prompts the user about something technical that the user may not understand, at a time when the user doesn't need it. - Jesper Kristensen
Re: Intent to implement and ship: document.execCommand(cut/copy)
On 26-05-2015 at 13:13, Ted Mielczarek wrote: Additionally, the 'paste' event from the spec already works, which seems like it provides pretty useful functionality for webapps. The user can use Ctrl+V or Edit-Paste or whatever existing UI mechanism the browser has to trigger a paste, and the page can handle the event to do something useful with the pasted data. The same can be said about cut and copy. I don't think it will work for my use case. Here is my use case: I have a web application that allows the user to upload tabular data. Users most often prepare the data in Excel or other spreadsheet applications before uploading it. My application supports CSV file upload, but it is slow to use, since you have to copy the section you wish to upload into a separate spreadsheet, save it as a file, go into the web application and find the same file, upload it, and then delete the temporary file. You also get character-encoding issues when saving as CSV from Excel. As a quicker alternative, the web application lets the user copy-paste the relevant section of their Excel file into the web application, which takes far fewer steps. However, users often need to paste large amounts of data (millions of cells), which makes the browser freeze while it tries to render the pasted text in the textarea. It would be much faster if I could just present a "Paste data" or "Upload from clipboard" button, which could load the data from the clipboard directly into a JavaScript string without first trying to render it. I feel that doing this by handling the paste event on a textarea would be a confusing user experience, since you would have a visible textarea that would not work when you type into it, and it looks like you would see your text when you paste, but you won't. So I believe I need to trigger the paste event when the user clicks a button in order to do this.
/Jesper Kristensen
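For what it's worth, the core of the use case above can be sketched in a few lines: spreadsheet applications put the copied cells on the clipboard as tab-separated text, so the heavy part is parsing that text, not rendering it. This is a hypothetical illustration, not code from the thread; the `parseClipboardTable` name and the event wiring are my own, and the browser portion is guarded so the parsing helper stands alone.

```javascript
// Hypothetical sketch: turn the tab-separated text that spreadsheet apps
// place on the clipboard into rows/columns, without rendering it first.
function parseClipboardTable(text) {
  return text
    .replace(/\r\n?/g, "\n")                // normalize Windows line endings
    .split("\n")
    .filter((line) => line.length > 0)      // drop the trailing empty line
    .map((line) => line.split("\t"));       // Excel separates cells with tabs
}

// Assumed browser wiring: in a "paste" handler, read the plain-text flavor
// and hand it straight to the parser instead of letting a textarea render it.
if (typeof document !== "undefined") {
  document.addEventListener("paste", (event) => {
    const rows = parseClipboardTable(event.clipboardData.getData("text/plain"));
    event.preventDefault(); // avoid freezing on millions of rendered cells
    console.log(`received ${rows.length} rows`);
  });
}
```

The parser alone avoids the rendering freeze the message describes; what the thread is really asking for is a way to *trigger* this flow from a button click rather than from Ctrl+V.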
Re: Intent to implement and ship: document.execCommand(cut/copy)
Very nice, I am looking very much forward to using this. It would be nice if you could also support paste. I agree that it is more sensitive, so maybe you could go with a user prompt in that case? The prompts as implemented in IE are horrible, but I think there could be many better ways of doing it. Here is one way a prompt could look: since you only allow this in response to user interaction, you could make the prompt look like a context menu for the element the user interacted with, using a single word describing the action. I have implemented a mockup at http://jsfiddle.net/vvjcgj5g/1/ but I am sure Mozilla UX people could come up with better ways to do this. (My mockup has a prompt for all three actions, but you could do it for paste only.) /Jesper Kristensen On 05-05-2015 at 23:51, Ehsan Akhgari wrote: Summary: We currently disallow programmatic copying and cutting from JS for Web content, which has led web sites to rely on Flash in order to copy content to the clipboard. We are planning to relax this restriction to allow this when execCommand is called in response to a user event. This restriction mimics what we do for other APIs, such as FullScreen. Bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1012662 Link to standard: This is unfortunately not specified very precisely. There is a rough spec here: https://dvcs.w3.org/hg/editing/raw-file/tip/editing.html#miscellaneous-commands and the handling of clipboard events is specified here: https://w3c.github.io/clipboard-apis/. Sadly, the editing spec is not actively edited. We will strive for cross-browser interoperability, of course. Platform coverage: All platforms. Target release: Firefox 40. Preference behind which this will be implemented: This won't be hidden behind a preference, as the code changes required are not big and can be easily reverted. DevTools bug: N/A Do other browser engines implement this: IE 10 and Chrome 43 both implement this.
Opera has adopted this from Blink as of version 29. Security & Privacy Concerns: We have discussed this rather extensively before: http://bit.ly/1zynBg7, and have decided that restricting these functions to only work in response to user events is enough to prevent abuse here. Note that we are not going to enable the paste command, which would give applications access to the contents of the clipboard. Web designer / developer use-cases: This feature has been rather popular among web sites. Sites such as GitHub currently use Flash in order to allow people to copy text to the clipboard by clicking a button in their UI. Cheers,
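To make the intent concrete, a button-driven copy under the relaxed restriction might look like the sketch below. The helper name and the temporary-textarea trick are my own assumptions, not part of the intent; the only thing taken from the thread is that `document.execCommand("copy")` succeeds only when invoked from a user-event handler.

```javascript
// Sketch (assumed helper name): copy a string to the clipboard via
// document.execCommand("copy"). Per the intent above, this only succeeds
// when called from a user event such as a click handler.
function copyTextToClipboard(text) {
  if (typeof document === "undefined") {
    return false; // no DOM available (e.g. when run outside a browser)
  }
  const helper = document.createElement("textarea");
  helper.value = text;
  document.body.appendChild(helper);
  helper.select();
  const succeeded = document.execCommand("copy"); // user gesture required
  document.body.removeChild(helper);
  return succeeded;
}

// Intended usage, matching the "copy button" pattern sites built with Flash:
// button.addEventListener("click", () => copyTextToClipboard(snippet));
```

This is the same pattern the GitHub-style copy buttons mentioned above used via Flash, minus the plugin.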
Re: Intent to deprecate: persistent permissions over HTTP
On 06-03-2015 at 19:04, Gijs Kruitbosch wrote: On 06/03/2015 17:27, Anne van Kesteren wrote: There are a large number of permissions we currently allow users to store persistently for a given origin. I suggest we stop offering that functionality when there's no lock in the address bar. Can we make an exception for localhost and its IPv4 and IPv6 equivalents to make things easier for web devs? Bonus points if we make a mechanism that detects /etc/hosts overrides (to localhost) and allow it there, too. ~ Gijs If there is such an exception, please make it opt-in. It is annoying for web devs when localhost behaves differently from any other site. It is much harder and more time-consuming to debug something when it works locally and then breaks when you deploy it. Web devs who deploy to https should develop locally on https. Jesper Kristensen
Re: Restricting gUM to authenticated origins only
On 08-09-2014 at 18:58, Martin Thomson wrote: On 07/09/14 07:09, Jesper Kristensen wrote: Cookies are segregated by http vs https, right? No, unfortunately they are not. Numerous attempts at fixing this have been rejected by browser vendors, for example http://tools.ietf.org/html/draft-abarth-cake-01 They are, somewhat. All cookies are available to an https origin, but some are restricted so that http origins can't see them. https://tools.ietf.org/html/rfc6265#section-5.4 * If the cookie's secure-only-flag is true, then the request-uri's scheme must denote a secure protocol (as defined by the user agent). Yes, the abstract in the linked draft clearly states this: you can establish cookie confidentiality using the Secure flag, but it is not possible today to establish cookie integrity.
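The RFC 6265 step quoted above can be restated as a tiny predicate. This is only my own illustration of the quoted rule, with assumed names; it is not browser code:

```javascript
// Illustration of RFC 6265 §5.4: a cookie whose secure-only-flag is set
// may only be attached to requests whose scheme is secure.
function cookieAttachedTo(cookie, requestUri) {
  const isSecureScheme = requestUri.startsWith("https:");
  if (cookie.secureOnly && !isSecureScheme) {
    return false; // a Secure cookie is never sent over plain http
  }
  return true; // non-Secure cookies flow to both schemes
}
```

Note the asymmetry this models: an https origin sees both kinds of cookies, while an http origin can still *set* a cookie that the https origin will later read. That is the confidentiality-without-integrity gap the thread describes.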
Re: Intent to implement: Prerendering API
This templated prerendering sounds like a complicated API. Is there any advantage of this over what is possible today with a single-page application using history.pushState? On 12-08-2014 at 00:03, Jonas Sicking wrote: Very excited to see this happening. Implementation issues aside, I have two comments: * This is something that we really need on FirefoxOS. I hope that the implementation strategy will work there too? * A use case that we came upon pretty quickly when we were looking at using prerendering in FirefoxOS is that we might not know the exact URL that the user is likely to navigate to, but we have a pretty good guess about what template page it's going to be. Consider for example Bugzilla. After the user has done a search query, they are likely to click on one of the bug links. Each bug has a different URL, but all bugs share much of the page UI. It would be really awesome if Bugzilla could ask to prerender a URL like https://bugzilla.mozilla.org/show_bug.cgi?justLoadTemplate=true. Then when the user clicks a Bugzilla URL like https://bugzilla.mozilla.org/show_bug.cgi?id=123456, we would enable webpage logic to somehow force the prerendered page to be used, even though the URLs don't match. One way to do this would be to enable some way for the current page to talk to the prerendered page. The current page could then tell the prerendered page "the user just clicked bug 123456", at which point the prerendered page could use replaceState to change its URL to https://bugzilla.mozilla.org/show_bug.cgi?id=123456, at which point we would see that the URLs matched. Another solution would be to enable the prerendered page to say "I'm able to act as a prerender page for any URLs that match pattern https://bugzilla.mozilla.org/show_bug.cgi?id=*". Would be great if someone could bring up this use case with the prerender editors and make sure it gets covered and that a specced solution is defined.
/ Jonas On Mon, Aug 11, 2014 at 11:47 AM, Roshan Vidyashankar roshan...@gmail.com wrote: Summary: The Prerendering API allows the rendering engine to start loading and rendering a web page, without any visible UI, until the page needs to be displayed to the user. It's a Gecko API that consumers can choose to use. For now, we're not talking about exposing this to web pages yet, or even using it in any of our products; we're just working on the implementation. Bug: https://bugzilla.mozilla.org/show_bug.cgi?id=730101 Link to standard: At some point we are going to be contributing to a spec, but there are a number of unknowns which we're hoping to resolve through the implementation - like which APIs need to be blacklisted during prerendering (audio, window.open, etc.). Chrome and IE have shipped their own implementations of prerendering. Platform coverage: All Estimated or target release: Unknown Preference behind which this will be implemented: dom.prerendering.enabled Implementation Plan There are two big parts to the implementation. 1. How do we handle the prerendering itself in a way that works both for desktop and b2g? For this, we're thinking about adding a boolean flag to nsDocShell which, together with mIsActive, will constitute a tri-state: active, inactive and prerendered. As far as the outside code is concerned, prerendered docshells are treated as inactive ones for the most part. This means that we'd basically need to not modify nsDocShell::GetIsActive, and audit all of its callers to find the potential consumers who would want to treat prerendered and inactive differently (I don't expect there to be any such consumers, actually, but we'll see). We'd use the prerendered state to reflect the correct status through the page visibility API. The prerendered docshell masquerading as inactive will also mean that it won't get a rendering on b2g, for example, because that's how we hide mozbrowser iframes there.
In order for Gaia and XUL-based consumers to be able to control the prerendering, I think we'd need to use the nsIFrameLoader swapping facilities that we currently have for the xul:browser swapDocShells method, and extend those to make them usable with the mozbrowser frameLoader as well. That way we can expose the prerendering API through both mozbrowser and xul:browser while hopefully sharing most of the code between these different consumers, and we'd get the ability to swap the frame loaders when we need to actually render a prerendered document. 2. What do we do with the APIs that have undesirable side effects in prerendered pages (audio, window.open, etc.)? We first need to decide what we're going to do when a page calls into an API which has undesired side effects (and that is a blurry line -- I hope as part of this we come up with an actual list of what those actions are!). Chrome basically aborts the execution of scripts on the page and throws out the prerendered content as soon as any such APIs are called. IE's documentation claims that they
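For contrast with the templated-prerender idea, the history.pushState alternative raised at the top of this thread amounts to something like the following sketch. The function and parameter names are mine, chosen to mirror the Bugzilla example; the guard makes the sketch inert outside a browser:

```javascript
// Sketch of the single-page-app approach: the page updates the address bar
// with pushState and fills in an already-loaded template itself, instead of
// asking the browser to prerender a separate URL.
function navigateToBug(bugId, renderBug) {
  if (typeof history === "undefined") {
    return null; // no browser history API available
  }
  const url = `/show_bug.cgi?id=${bugId}`;
  history.pushState({ bugId }, "", url); // address bar changes, no page load
  renderBug(bugId);                      // app logic renders into the template
  return url;
}
```

The trade-off the thread is weighing: pushState keeps everything in one document under the app's control, while the prerender proposal lets a *separate* future document be warmed up by the browser, which is why the URL-matching question arises at all.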
Re: Use of instanceof SomeDOMInterface in chrome and extensions
On 29-12-2012 at 20:19, smaug wrote: On 12/27/2012 12:18 PM, Boris Zbarsky wrote: We have a bunch of chrome and extension code that does things like instanceof HTMLAnchorElement (and likewise with other DOM interfaces). The problem is that per the WebIDL spec and general ECMAScript sanity this shouldn't work: instanceof goes up the proto chain looking for the thing on the right as a constructor, and chrome's HTMLAnchorElement is not on the proto chain of web page elements. The arguably right way to do the el instanceof HTMLAnchorElement test is: el instanceof el.ownerDocument.defaultView.HTMLAnchorElement Needless to say, this sucks. And it doesn't work for data documents, which don't have a defaultView. Wouldn't this work (for chrome code)? el instanceof Components.utils.getGlobalForObject(el).HTMLAnchorElement
Re: Removing make targets for running tests?
On 08-10-2012 at 21:05, Gregory Szorc wrote: We now have a tool in mozilla-central that has a much better UX for running tests (mach). It's not perfect yet, but it's getting there (please write patches!). The build peers (or at least a few of us) really don't like the make targets for running tests because they are awkward, both to maintain and for people to use. mach and having Python drive everything offer compelling advantages over simple make targets. The make targets for running tests aren't used by buildbot (well, at least a lot of them aren't - there might be a straggler or two). Putting this all together, the stage is set to remove the make targets for running tests. I'm writing this post to see what obstacles/resistance there are to removing the make targets for running tests. Obviously a prerequisite is having mach reach feature parity with the make targets. What other concerns are there? Before you deprecate the make targets, it would be nice if mach actually worked and there were documentation for it. I have so far not been able to figure out how to run a test with mach.
This is what I tried:

$ ./mach xpcshell-test toolkit/mozapps/extensions/test/xpcshell/test_dictionary.js
Traceback (most recent call last):
  File "./mach", line 47, in <module>
    sys.exit(mach.run(sys.argv[1:]))
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\python/mach\mach\main.py", line 147, in run
    return self._run(argv)
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\python/mach\mach\main.py", line 206, in _run
    result = fn(**stripped)
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\python/mozbuild\mach\commands\testing.py", line 89, in run_xpcshell_test
    xpcshell.run_test(**params)
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\python/mozbuild\mozbuild\testing\xpcshell.py", line 49, in run_test
    self._run_xpcshell_harness(**args)
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\python/mozbuild\mozbuild\testing\xpcshell.py", line 89, in _run_xpcshell_harness
    xpcshell.runTests(**args)
  File "c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\testing/xpcshell\runxpcshelltests.py", line 736, in runTests
    test['here'])
Exception: testsRootDir is not a parent path of c:\Users\Jesper\Desktop\web\mozilla\build\mozilla-central\src\obj-i686-pc-mingw32\_tests\xpcshell\toolkit\mozapps\extensions\test\xpcshell