Re: Proposal: stop exporting JS symbols

2013-09-23 Thread Philipp Kewisch

On 9/23/13 8:20 PM, Benjamin Smedberg wrote:

On 9/23/2013 8:45 AM, Philipp Kewisch wrote:



Initially it seems it would be easy to replace calDateTime with a JS
component and I had started to do this, but unfortunately calDateTime
is instantiated directly (via constructor, not via XPCOM) in a few
locations in our C++ code, so replacing only calDateTime with a JS
implementation wouldn't really work out well.

Maybe it's possible to replace the C++ constructor call with an XPCOM
service call which wraps?


So to get/set a JS date from a JS caller, there is a call from JS -> C++ 
-> JS that returns a value that traverses back through that path? I 
guess it would be possible; I don't know how much this will affect 
performance, as it's used quite often.
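
For illustration, the C++ side of such a wrapper would presumably stop
naming the concrete class and go through the component manager instead.
A minimal sketch; the contract ID and the SetNativeTime call are
assumptions based on the calIDateTime interface, not something stated in
this thread:

    #include "nsCOMPtr.h"
    #include "nsComponentManagerUtils.h"  // do_CreateInstance
    #include "prtime.h"
    #include "calIDateTime.h"

    // Before: nsRefPtr<calDateTime> dt = new calDateTime();
    // After: ask the component manager, so that a JS implementation
    // registered under the same contract ID can be swapped in later.
    static nsresult
    CreateDateTime(PRTime aNativeTime, calIDateTime** aResult)
    {
      nsresult rv;
      nsCOMPtr<calIDateTime> dt =
          do_CreateInstance("@mozilla.org/calendar/datetime;1", &rv);
      if (NS_FAILED(rv)) {
        return rv;
      }

      rv = dt->SetNativeTime(aNativeTime);  // PRTime, as with the concrete class
      if (NS_FAILED(rv)) {
        return rv;
      }

      dt.forget(aResult);
      return NS_OK;
    }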


If not exporting JS symbols is the way to go, it's at least good to know 
there is a workaround.


Philipp


Re: C++ Standards Committee meeting next week

2013-09-23 Thread Joshua Cranmer 🐧

On 9/20/2013 4:50 PM, Botond Ballo wrote:

Hi everyone,

The C++ Standards Committee is meeting in Chicago next week. Their focus will 
be on C++14, the upcoming version of the C++ standard, as well as some 
Technical Specifications (specifications for features intended to be 
standardized but not fully-baked enough to be standardized now) that are also 
planned for publication in 2014. Presumably there will also be some discussion 
of the following version of the standard, C++17.

I will attend this meeting as an observer. I intend to follow the progress of 
the Concepts Lite proposal [1] which I'm particularly interested in, but I will 
try to keep up with other goings-on as well (the committee splits up into 
several sub-groups that meet in parallel over the course of the week).

I wanted to ask if there's anything anyone would like to know about the 
upcoming standards that I could find out at the meeting - if so, please let me 
know and I will do my best to find it out.


I have a laundry list of stuff that I want a fly-on-the-wall perspective on.

First is the discussion of the standardization support macros (so we 
don't have to maintain crummy stuff like mfbt/Compiler.h), although that 
meeting may have already passed.
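
For context, this is the sort of per-compiler sniffing mfbt/Compiler.h has
to spell out today, versus what a standardized feature-test macro would
allow. A rough sketch: the MOZ_HAVE_* name is made up for illustration, and
the SD-6-style __cpp_rvalue_references macro is the proposal under
discussion, not something compilers shipped at the time:

    // Today: per-compiler, per-version detection.
    #if defined(__clang__)
    #  if __has_feature(cxx_rvalue_references)
    #    define MOZ_HAVE_CXX11_RVALUE_REFERENCES
    #  endif
    #elif defined(__GNUC__) && defined(__GXX_EXPERIMENTAL_CXX0X__) && \
          (__GNUC__ * 100 + __GNUC_MINOR__) >= 403
    #  define MOZ_HAVE_CXX11_RVALUE_REFERENCES
    #endif

    // With standardized feature-test macros: one portable check.
    #if defined(__cpp_rvalue_references)
    #  define MOZ_HAVE_CXX11_RVALUE_REFERENCES
    #endif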


I'm also interested in seeing proposals for standardization of more 
complex attributes that we have static analysis results for--in 
particular, the stack class/heap class/etc. annotations, the 
must_override analysis, and expanding override/must_override to refer to 
nonvirtual as well as virtual methods.
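
As a concrete reference point, here is roughly what one of those
annotations looks like in the tree today. A sketch assuming the
MOZ_STACK_CLASS spelling from mfbt/Attributes.h; the class itself is
invented for illustration:

    #include "mozilla/Attributes.h"

    // A guard object that the static analysis only allows as a stack local;
    // heap-allocating it (new AutoRestoreValue(...)) is flagged at compile
    // time. Analogous annotations restrict a class to the heap, or require
    // that subclasses override a given method.
    class MOZ_STACK_CLASS AutoRestoreValue
    {
    public:
      explicit AutoRestoreValue(int& aValue) : mValue(aValue), mSaved(aValue) {}
      ~AutoRestoreValue() { mValue = mSaved; }

    private:
      int& mValue;
      int mSaved;
    };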


The implementation timeframe of modules is also interesting, since it 
appears to be the best proposed solution to the various #include 
problems.


The new threading paradigms (MS is proposing an async/await framework 
that basically plays out like the |yield| keyword in JS or Python) could 
also prove useful for Mozilla.


Finally, I'm interested in seeing what APIs are going to come out of the 
networking and filesystem TS groups, particularly because trying to 
track down progress on them is maddening.


--
Joshua Cranmer
Thunderbird and DXR developer
Source code archæologist



Re: C++ Standards Committee meeting next week

2013-09-23 Thread Gregory Szorc

On 9/20/13 2:50 PM, Botond Ballo wrote:

Hi everyone,

The C++ Standards Committee is meeting in Chicago next week. Their focus will 
be on C++14, the upcoming version of the C++ standard, as well as some 
Technical Specifications (specifications for features intended to be 
standardized but not fully-baked enough to be standardized now) that are also 
planned for publication in 2014. Presumably there will also be some discussion 
of the following version of the standard, C++17.

I will attend this meeting as an observer. I intend to follow the progress of 
the Concepts Lite proposal [1] which I'm particularly interested in, but I will 
try to keep up with other goings-on as well (the committee splits up into 
several sub-groups that meet in parallel over the course of the week).

I wanted to ask if there's anything anyone would like to know about the 
upcoming standards that I could find out at the meeting - if so, please let me 
know and I will do my best to find it out.

If anyone's interested in the C++ standardization process, you can find more 
information here [2].

Thanks,
Botond

[1] 
http://isocpp.org/blog/2013/02/concepts-lite-constraining-templates-with-predicates-andrew-sutton-bjarne-s
[2] http://isocpp.org/std


Fixing the "include hell" problem would be at the top of my list of 
wants as someone who cares about the performance of a large scale build 
system. I believe there was a C++ modules proposal on the standards 
track at one point. Not sure what it's status is beyond being an 
experimental feature in clang [1]. Of course, by the time all the major 
compilers support this and we're in a position to make use of it, large 
parts of m-c's C++ might be rewritten in Rust, so who knows.


[1] http://clang.llvm.org/docs/Modules.html



Re: C++ Standards Committee meeting next week

2013-09-23 Thread Robert O'Callahan
On Tue, Sep 24, 2013 at 5:18 AM, Trevor Saunders wrote:

> - virtual constants
>

Yeah baby!


> - ability to say classes should only be used on stack / heap or not used
>   in one of those
>

That's another good idea!

Rob


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Brian Smith
On Mon, Sep 23, 2013 at 3:40 PM, Chris Peterson wrote:

> On 9/23/13 2:41 PM, Benjamin Smedberg wrote:
> Even if Firefox supported the Pepper API, we would still need a Pepper
> version of Flash. And Adobe doesn't have one; Google does.
>
> When I was an engineer on Adobe's Flash Player team, Google did all
> development and builds of Flash for Pepper. Adobe just verified that
> Google's builds pass a certification test suite.
>

Just to re-iterate: I am not saying we should/must do a Pepper Flash Player
in Firefox. I am not particularly for or against it.

However, I will say that the people at Google that worked on Chromium's
sandboxing and Pepper have already reached out to us to help us with
sandboxing. We shouldn't assume that they wouldn't help us with the Pepper
Flash player without asking them. It might actually be easier to secure
help from Google than from Adobe.

Cheers,
Brian





-- 
Mozilla Networking/Crypto/Security (Necko/NSS/PSM)


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Brian Smith
On Mon, Sep 23, 2013 at 2:41 PM, Benjamin Smedberg wrote:

> On 9/23/2013 4:59 PM, Brian Smith wrote:
>
>> Given that Pepper presents little benefit to users,
>>>
>>
>> Pepper presents a huge benefit to users because it allows the browser to
>> sandbox the plugin. Once we have a sandbox in Firefox, NPAPI plugins will
>> be the security weak spot in Firefox.
>>
> You're making some assumptions here:
>
> * That "the plugin" is only Flash. No other plugin has Pepper or is likely
> to use pepper. And a significant number of users are still using non-Flash
> plugins.
>

I am making the assumption for now that Flash is the main thing we don't
have a solution for.


> * That we could have a pepper Flash for Firefox in a reasonable timeframe
> (highly unlikely given the engineering costs of Pepper).
>

I am not making this assumption. I am not saying "we should/must do
Pepper." I am saying that it isn't right to say there is "little benefit"
to Pepper. Even with Flash being the "only" Pepper plugin, the (potential)
security advantages of Pepper make it very valuable.


> * That Flash is the primary plugin attack vector we should protect
> against. We know *out of date* Flash is an attack vector, but our security
> blocking already aims to protect that segment of the population. Up-to-date
> Flash does not appear to be highly dangerous.


Vulnerabilities are dangerous even when we don't know about them. And, even
when we do know about them, they are dangerous until the user can update to
a version without the vulnerability. My understanding is that if there were
a zero-day exploit in the Flash plugin, and Adobe took a week to ship a
fix, then all of our users would be vulnerable to that zero-day
vulnerability for a week or more.



>  We need a story and a timeline for securing plugins. Click-to-play was a
>> great start, but it is not enough.
>>
>> If our story for securing plugins is to
>> drop support for them then we should develop the plan with a timeline for
>> that.
>>
>
> What is your definition of "enough"? With the change to mark plugins as
> click-to-play by default, they will be at least as secure as Firefox
> extensions, and present less attack surface.
>

Like I said, the click-to-play change is a huge improvement. I can't
emphasize that enough. We don't have a sandbox for Firefox itself yet, so
now is not the time to be super critical of potential weaknesses in Adobe's
sandbox for Flash in order to argue that the exception for Flash is
unreasonable. I think everybody should feel good about the progress here.

> These are all longer-term items, some of which are still research-y. I
> don't think it's either possible or necessary to "develop a plan with a
> timeline" in our current situation.
>

I don't think we necessarily need a detailed timeline for killing plugins
completely. I agree it would likely be impractical to create one even if we
tried.

But we should be able to create and share plans for what we can accomplish
with respect to plugins over the next year, at least. For example, in your
earlier comments you said that it didn't seem realistic to kill NPAPI
plugins by the end of 2014. I suppose that includes, in particular, Flash.
I agree with you, though I think there are some people at Mozilla who
disagree. Either way, it seems like we should develop a more concrete plan
for dealing with Flash security issues, at least for 2014--e.g. a plan to
make click-to-play for Flash a viable fallback in the event of a zero-day
in the Flash player. I would be happy to help create such a plan.

Also, several internal systems within Mozilla Corporation are Flash-based,
including our company-wide videoconferencing system and parts of our
payroll system (IIUC). I think it would be great if we developed a plan for
Mozilla Corporation to be able to dogfood a Flash-player-free Firefox
internally by the end of 2014, at least.

Cheers,
Brian
-- 
Mozilla Networking/Crypto/Security (Necko/NSS/PSM)


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Chris Peterson

On 9/23/13 2:41 PM, Benjamin Smedberg wrote:

* That "the plugin" is only Flash. No other plugin has Pepper or is
likely to use pepper. And a significant number of users are still using
non-Flash plugins.
* That we could have a pepper Flash for Firefox in a reasonable
timeframe (highly unlikely given the engineering costs of Pepper).


Even if Firefox supported the Pepper API, we would still need a Pepper 
version of Flash. And Adobe doesn't have one; Google does.


When I was an engineer on Adobe's Flash Player team, Google did all 
development and builds of Flash for Pepper. Adobe just verified that 
Google's builds pass a certification test suite.



cp


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Benjamin Smedberg

On 9/23/2013 4:59 PM, Brian Smith wrote:

Given that Pepper presents little benefit to users,


Pepper presents a huge benefit to users because it allows the browser to
sandbox the plugin. Once we have a sandbox in Firefox, NPAPI plugins will
be the security weak spot in Firefox.

You're making some assumptions here:

* That "the plugin" is only Flash. No other plugin has Pepper or is 
likely to use pepper. And a significant number of users are still using 
non-Flash plugins.
* That we could have a pepper Flash for Firefox in a reasonable 
timeframe (highly unlikely given the engineering costs of Pepper).
* That Flash is the primary plugin attack vector we should protect 
against. We know *out of date* Flash is an attack vector, but our 
security blocking already aims to protect that segment of the 
population. Up-to-date Flash does not appear to be highly dangerous.



I don't think it makes any sense to focus on it relative to other things
such as graphics performance, web API improvements, and asm.js which can
serve the same general niche as plugins, but will improve the open web at 
the same time.


We need a story and a timeline for securing plugins. Click-to-play was a
great start, but it is not enough.

If our story for securing plugins is to
drop support for them then we should develop the plan with a timeline for
that.


What is your definition of "enough"? With the change to mark plugins as 
click-to-play by default, they will be at least as secure as Firefox 
extensions, and present less attack surface.


We all agree that users would be more secure without plugins. But we 
can't make security decisions in a vacuum: some users *use* Java and 
other minority plugins. Turning that off just means that those users are 
stuck with IE or downrev Firefox. What we *can* do is use a combination 
of carrots and sticks to decrease plugin usage to the point where 
removing it won't affect our viability as a product:


* Use the leverage of growing mobile device populations that don't have 
any plugins to incent website authors to stop using plugins.
* Add the most common features to the web platform which currently 
require plugins. This is well underway with websockets, graphics/gaming 
improvements, and webrtc; I am already working with Andrew Overholt on 
some other web APIs which we've identified as important.

* Replace the Adobe Flash runtime either partly or entirely with Shumway.

These are all longer-term items, some of which are still research-y. I 
don't think it's either possible or necessary to "develop a plan with a 
timeline" in our current situation.


--BDS



Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Brian Smith
On Mon, Sep 23, 2013 at 1:46 PM, Benjamin Smedberg wrote:

> On 9/23/2013 4:29 PM, Hubert Figuière wrote:
>
>> Hi all,
>>
>> Today Google said they'd drop NPAPI for good.
>>
> We also intend to someday drop NPAPI for good. I don't think that "by the
> end of 2014" is a realistic timeline for either Chrome or us, given the
> number of users who still rely on Java and other plugins.


http://thenextweb.com/google/2013/09/23/google-chrome-drops-netscape-plugin-api-support-to-improve-stability-will-block-most-plugins-in-january-2014/

Note in particular, this quote from that article: "Furthermore, Mozilla
plans to block NPAPI plug-ins in December 2013."

People are asking me about that on Twitter now.

Cheers,
Brian
-- 
Mozilla Networking/Crypto/Security (Necko/NSS/PSM)


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Brian Smith
On Mon, Sep 23, 2013 at 1:46 PM, Benjamin Smedberg wrote:

> The costs of Pepper are huge: it is not a well-specified API; we'd be
> reverse-engineering large bits of chromium code in order to support it, and
> it's clear that we want to focus effort on the web not Pepper.


I asked some Chromium guys how much of the Pepper API the Flash Pepper
plugin used. Their answer was literally "150%." They explained that the
Flash player uses APIs that are not even in the Pepper "spec."


> Given that Pepper presents little benefit to users,


Pepper presents a huge benefit to users because it allows the browser to
sandbox the plugin. Once we have a sandbox in Firefox, NPAPI plugins will
be the security weak spot in Firefox. Granted, Flash has its own sandbox.
However, I have very little confidence in Flash's sandbox given my
understanding of how Adobe is (barely) maintaining Flash and given that we
are the only major user of that version of Flash.


> I don't think it makes any sense to focus on it relative to other things
> such as graphics performance, web API improvements, and asm.js which can
> serve the same general niche as plugins, but will improve the open web at
> the same time.
>

We need a story and a timeline for securing plugins. Click-to-play was a
great start, but it is not enough. If our story for securing plugins is to
drop support for them then we should develop the plan with a timeline for
that.

Cheers,
Brian
-- 
Mozilla Networking/Crypto/Security (Necko/NSS/PSM)


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Andreas Gal

Pepper is not an API; it's basically a huge set of Chromium guts, exposed so you 
can link against them. The only documentation is the source, and that source keeps 
changing constantly. I don't think it's viable for anyone to implement Pepper 
without also pulling in most or all of Chromium. Pepper is Chrome, and Chrome 
is Pepper. This is the reason we WONTFIXed bug 729481, and nothing has 
changed since then. I don't think we should spend energy on getting onto 
Google's Pepper treadmill. We should instead continue to accelerate the decline 
of plugins by offering powerful new HTML5 capabilities that make plugins obsolete.

Andreas

On Sep 23, 2013, at 1:29 PM, Hubert Figuière  wrote:

> Hi all,
> 
> Today Google said they'd drop NPAPI for good.
> 
> http://news.cnet.com/8301-1023_3-57604242-93/google-begins-barring-browser-plug-ins-from-chrome/
> 
> Bug 729481 was WONTFIXED a while ago. tl;dr : implement Pepper plugin API
> 
> I think it might be worth revisiting that decision before it is too late.
> 
> 
> Hub
> 
> PS: I truly believe that we should drop plugin support altogether, but
> that's not what I'm discussing here.


Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Benjamin Smedberg

On 9/23/2013 4:29 PM, Hubert Figuière wrote:

Hi all,

Today Google said they'd drop NPAPI for good.
We also intend to someday drop NPAPI for good. I don't think that "by 
the end of 2014" is a realistic timeline for either Chrome or us, given 
the number of users who still rely on Java and other plugins. But we're 
certainly looking into the places where people currently "need" plugins 
and are trying to create or implement web APIs to address those needs.




http://news.cnet.com/8301-1023_3-57604242-93/google-begins-barring-browser-plug-ins-from-chrome/

Bug 729481 was WONTFIXED a while ago. tl;dr : implement Pepper plugin API

I think it might be worth revisiting that decision before it is too late.
Too late for what? What are you concerned about? We are not constrained 
by Chrome's decision to drop NPAPI. Right now Flash is the only 
significant plugin that is using pepper, and it also has a supported 
NPAPI version. We're also working on a pure-JS replacement (Shumway) 
which is going quite well.


The costs of Pepper are huge: it is not a well-specified API; we'd be 
reverse-engineering large bits of chromium code in order to support it, 
and it's clear that we want to focus effort on the web, not Pepper. Given 
that Pepper presents little benefit to users, I don't think it makes any 
sense to focus on it relative to other things such as graphics 
performance, web API improvements, and asm.js which can serve the same 
general niche as plugins, but will improve the open web at the same time.


--BDS



Re: Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Jet Villegas
The only Pepper plug-in worth talking about is the Flash Player. The Flash 
Player that ships in Chrome is developed by Google and distributed with the 
Chrome browser. That is, Adobe doesn't make this Pepper plug-in and has no 
installers for Firefox users to use. In other words, Pepper support doesn't get 
you a Pepper Flash Player in Firefox. We're going in a rather different 
direction:
https://blog.mozilla.org/research/2012/11/12/introducing-the-shumway-open-swf-runtime-project/

--Jet

- Original Message -
From: "Hubert Figuière" 
To: dev-platform@lists.mozilla.org
Sent: Monday, September 23, 2013 1:29:14 PM
Subject: Implementing Pepper since Google is dropping NPAPI for good

Hi all,

Today Google said they'd drop NPAPI for good.

http://news.cnet.com/8301-1023_3-57604242-93/google-begins-barring-browser-plug-ins-from-chrome/

Bug 729481 was WONTFIXED a while ago. tl;dr : implement Pepper plugin API

I think it might be worth revisiting that decision before it is too late.


Hub

PS: I truly believe that we should drop plugin support all together, but
that's not what I'm discussing here.


Implementing Pepper since Google is dropping NPAPI for good

2013-09-23 Thread Hubert Figuière
Hi all,

Today Google said they'd drop NPAPI for good.

http://news.cnet.com/8301-1023_3-57604242-93/google-begins-barring-browser-plug-ins-from-chrome/

Bug 729481 was WONTFIXED a while ago. tl;dr : implement Pepper plugin API

I think it might be worth revisiting that decision before it is too late.


Hub

PS: I truly believe that we should drop plugin support altogether, but
that's not what I'm discussing here.


Re: You want faster builds, don't you?

2013-09-23 Thread Neil

Steve Fink wrote:


My anecdote: about 2 years ago, I did a lot of building on Windows and tried 
very hard to use a VM (just one layer -- a Windows 7 Virtualbox VM inside 
Fedora x64.) The configure times were excruciating. 6 minutes sounds about 
right -- for the top-level configure only. If you add in the recursive 
configures (js/src, libffi, I can't remember what else), it was at least 30 
minutes just in configure.

Just for comparison, I tried this in a Windows 2003 Virtualbox VM using 
2 of my AMD cores inside Fedora x64, using a magnetic disk. The 
configure time for the whole of mozilla-central came in at just under 6 
minutes. (I used to use Windows XP but it was about half the speed 
because I could only use the IDE driver.)


--
Warning: May contain traces of nuts.


Re: Proposal: stop exporting JS symbols

2013-09-23 Thread Ehsan Akhgari

On 2013-09-21 11:18 PM, xunxun wrote:

On Friday, 2013/9/20 22:02, Benjamin Smedberg wrote:

On 9/20/2013 9:23 AM, Mike Hommey wrote:

We're already statically linking js libraries info libxul. Except on
windows, but that's work in progress in bug 915735

I am primarily worried about doing this on Windows.

, although we don't
know yet if it's going to work at all: after dealing with the xpcshell
crash during the instrumentation phase, we'll have to see how much
memory the linker is going to use, and that might not be very good news
ahead.

We will probably have a complete solution for linker memory issues
when VC2013 is released. Let's plan as if that were the case.

x64 cross to x86 toolchain?
Will you merge gkmedias.dll to xul.dll then?


We may do that once we're there.

Ehsan



Re: Proposal: stop exporting JS symbols

2013-09-23 Thread Benjamin Smedberg

On 9/23/2013 8:45 AM, Philipp Kewisch wrote:



Initially it seems it would be easy to replace calDateTime with a JS 
component and I had started to do this, but unfortunately calDateTime 
is instantiated directly (via constructor, not via XPCOM) in a few 
locations in our C++ code, so replacing only calDateTime with a JS 
implementation wouldn't really work out well.
Maybe it's possible to replace the C++ constructor call with an XPCOM 
service call which wraps?


If there were some secret backdoor to continue using JSAPI that 
would give us a little bit longer before we replace the whole backend, 
I could live with this happening.
This will be configurable, so we should be able to turn it off for 
Thunderbird but turn it on for Firefox. But I don't think that any other 
kind of backdoor would be better than the current situation.


--BDS



Re: C++ Standards Committee meeting next week

2013-09-23 Thread Trevor Saunders
Hi,

Some things I've seen that we'd probably like to see end up in the
language include:

- final on data members / non-virtual member functions
- virtual constants (maybe the ability to get at the vtable pointer too?)
- the ability to say classes should only be used on the stack / heap, or not
  used in one of those

  It would be nice to know if we can interest people in supporting any
  of those.

  Trev

On Fri, Sep 20, 2013 at 02:50:48PM -0700, Botond Ballo wrote:
> Hi everyone,
> 
> The C++ Standards Committee is meeting in Chicago next week. Their focus will 
> be on C++14, the upcoming version of the C++ standard, as well as some 
> Technical Specifications (specifications for features intended to be 
> standardized but not fully-baked enough to be standardized now) that are also 
> planned for publication in 2014. Presumably there will also be some 
> discussion of the following version of the standard, C++17.
> 
> I will attend this meeting as an observer. I intend to follow the progress of 
> the Concepts Lite proposal [1] which I'm particularly interested in, but I 
> will try to keep up with other goings-on as well (the committee splits up 
> into several sub-groups that meet in parallel over the course of the week).
> 
> I wanted to ask if there's anything anyone would like to know about the 
> upcoming standards that I could find out at the meeting - if so, please let 
> me know and I will do my best to find it out.
> 
> If anyone's interested in the C++ standardization process, you can find more 
> information here [2].
> 
> Thanks,
> Botond
> 
> [1] 
> http://isocpp.org/blog/2013/02/concepts-lite-constraining-templates-with-predicates-andrew-sutton-bjarne-s
> [2] http://isocpp.org/std


Re: You want faster builds, don't you?

2013-09-23 Thread Trevor Saunders
> [I also see a clobber build spend > 5 minutes in various configure
> runs, which frustrates me every time I see it - so I minimize the
> shell ;]
> >>>
> >>>We don't have much love for configure either. However, it's only
> >>>contributing a few extra minutes to Windows builds compared with 15+
> >>>minutes that pymake and make traversal is. I hope you understand why
> >>>fixing configure isn't at the top of the priority list at the
> >>>moment.
> >>
> >>I understand it isn't at the top of the priority list, but it's still
> >>worth keeping it in perspective - I see ~6 minutes of configure for ~30
> >>mins total build time - 20% is significant in anyone's language.
> >
> >~6 minutes for a configure?! I just did a configure from a clobber build
> >on Windows on my i7-2600K (2+ year old CPU) and it took 2:10. If you are
> >seeing 6 minutes configure times, you are running an ancient CPU and/or
> >not an SSD or you have something fubar with your setup (such as running
> >a VM inside a VM inside a VM Inception style). If your machine *is*
> >modern and you don't believe you are doing something silly, *please*
> >file a bug so we can get to the root cause.
> 
> I just did it again, and this time I saw 3 minutes 50 seconds from
> "./mach build" until the first non-configure thing happened.  It
> looks like it ran configure 5 times, so might not be what you mean
> by "did a configure".

Well, you could time just running configure by hand.

> I can't explain why I regularly saw > 5 minutes previously, but it's
> still 10% of the build time.  Should I open a bug on that?

Just write a patch to not do so much stuff on Windows? It probably
isn't terribly hard, assuming there's stuff that will always test true or
false on Windows that we haven't stopped running there already.  Looking
through configure for such checks sounds annoying, but not hard.

Trev


> 
> [This is all on an i7-3770 CPU with 16GB ram on a spinning RAID disk
> in a native (ie, non-VM) windows 7 - but a ~35 min build time
> implies it's not particularly slow, right?]
> 
> Mark
> 


Re: Add-ons Firefox 24 crash due to recent change in JSAPI

2013-09-23 Thread Bobby Holley
On Mon, Sep 23, 2013 at 5:12 AM,  wrote:

> Thanks a lot, Bobby.
> My issue is solved after using nsCxPusher for the JSContext.
> Could you suggest an alternative approach to do the same?
>

An alternative to nsCxPusher? My suggestion is to not use JSAPI, period.
Per another discussion on this list, we're going to stop exporting those
symbols soon.

bholley


Rendering meeting, (today) Monday 2:30pm PDT ("the early time")

2013-09-23 Thread Milan Sreckovic


The Rendering meeting is about all things Gfx, Image, Layout, and Media.
It takes place every second Monday, alternating between 2:30pm PDT and 5:30pm 
PDT.

The next meeting will take place today Monday, September 23 at 2:30 PM 
US/Pacific
Please add to the agenda: 
https://wiki.mozilla.org/Platform/GFX/2013-September-23#Agenda

San Francisco - Monday, 2:30pm
Winnipeg - Monday, 4:30pm
Toronto - Monday, 5:30pm
GMT/UTC - Monday, 21:30
Paris - Monday, 11:30pm
Taipei - Tuesday, 5:30am
Auckland - Tuesday, 9:30am

Video conferencing:
Vidyo room Graphics (9366)
https://v.mozilla.com/flex.html?roomdirect.html&key=FILzKzPcA6W2

Phone conferencing:
+1 650 903 0800 x92 Conf# 99366
+1 416 848 3114 x92 Conf# 99366
+1 800 707 2533 (pin 369) Conf# 99366

--
- Milan



Re: You want faster builds, don't you?

2013-09-23 Thread Benoit Girard
On Mon, Sep 23, 2013 at 12:49 AM, Robert O'Callahan wrote:

> I observe that Visual Studio builds do not spawn one cl process per
> translation unit. Knowing how slow Windows is at spawning processes, I
> suspect the build would be a lot faster if we used a single cl process to
> compile multiple tranlation units.
>
> /MP apparently spawns multiple processes to compile a set of files. That's
> kind of the opposite of what we want here.
>
> I see how compiling several files in the same cl invocation would mess up
> using /showincludes to track dependencies, making this difficult to fix.
> The only possibility I can think of for fixing this is to emit a Visual
> Studio project and make the Visual Studio builder responsible for tracking
> #include dependencies.
>
>
Vlad, Ehsan, and I did just that. With hacky.mk you can generate a fully
working Visual Studio project which will let you make changes to XUL as
necessary. The problem with MSBuild as used by VS is that it invokes one
process at a time. So as soon as a group of files requires different flags
(and we aren't even close to having unified build flags across the tree),
they require a separate invocation of cl.exe by MSBuild, and so you end up
with several invocations of cl.exe running serially; your CPU utilization is
far from 100% while the first cl.exe invocation is down to the last 1-2
remaining object files before the next invocation can start. It takes over
an hour to build with VS using hacky.mk. The benefit of specifying a
working VS solution is perfect IntelliSense. For building, it's better to
call a good external build system from your IDE.


Re: You want faster builds, don't you?

2013-09-23 Thread Joshua Cranmer 🐧

On 9/23/2013 1:10 AM, Gregory Szorc wrote:

~6 minutes for a configure?! I just did a configure from a clobber build
on Windows on my i7-2600K (2+ year old CPU) and it took 2:10. If you are
seeing 6 minutes configure times, you are running an ancient CPU and/or
not an SSD or you have something fubar with your setup (such as running
a VM inside a VM inside a VM Inception style). If your machine *is*
modern and you don't believe you are doing something silly, *please*
file a bug so we can get to the root cause.


I definitely recall 5min+ configures on Windows on my main laptop, which 
is only two or three years old (no SSD, though). Configures are 
*extremely* painful on Windows.


--
Joshua Cranmer
Thunderbird and DXR developer
Source code archæologist



Re: Proposal: stop exporting JS symbols

2013-09-23 Thread Philipp Kewisch

On 9/23/13 2:34 PM, Benjamin Smedberg wrote:

On 9/20/2013 3:12 PM, Philipp Kewisch wrote:

We are not quite ready to get rid of the libical backend right now, so
removing these functions without replacement will cause some
complications for Lightning.


Not knowing this code well, it seems that this code takes a double and
converts it to a JS date object with some timezone conversion. It seems
to me that you could write this code entirely in JS and just pass around
the integer or double values using IDL, to avoid the JSAPI usage
altogether. That might involve always replacing calDateTime with a
JS-implemented wrapper, but that's the kind of technique that we need to
be encouraging in general.

Is that technique a usable workaround for raw JSAPI usage?



The code really just takes a PRTime value and creates a JavaScript 
date object out of it. If it's a local timezone (floating), it uses the 
fields directly; otherwise it creates a JS date from the PRTime.
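
For reference, that conversion boils down to something like the following.
A sketch rather than the actual calDateTime.cpp code, using
JS_NewDateObjectMsec, one of the JSAPI entry points at issue:

    #include "jsapi.h"
    #include "prtime.h"

    // PRTime counts microseconds since the epoch; a JS Date wants milliseconds.
    static JSObject*
    PRTimeToJSDate(JSContext* aCx, PRTime aNativeTime)
    {
      double msec = double(aNativeTime) / PR_USEC_PER_MSEC;
      return JS_NewDateObjectMsec(aCx, msec);
    }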


Initially it seems it would be easy to replace calDateTime with a JS 
component and I had started to do this, but unfortunately calDateTime is 
instantiated directly (via constructor, not via XPCOM) in a few 
locations in our C++ code, so replacing only calDateTime with a JS 
implementation wouldn't really work out well. Replacing all the other 
components comes close to using the new backend, which is not ready yet.


If there were some secret backdoor to continue using JSAPI that 
would give us a little bit longer before we replace the whole backend, I 
could live with this happening.


Philipp


Re: Proposal: stop exporting JS symbols

2013-09-23 Thread Benjamin Smedberg

On 9/20/2013 3:12 PM, Philipp Kewisch wrote:


What about JS_NewDateObject, JS_NewDateObjectMsec, JS_ObjectIsDate, 
js_DateIsValid? Most of these are in jsapi.h, and we need them in 
Lightning for this code:


http://mxr.mozilla.org/comm-central/source/calendar/base/backend/libical/calDateTime.cpp#597 



We are not quite ready to get rid of the libical backend right now, so 
removing these functions without replacement will cause some 
complications for Lightning.


Not knowing this code well, it seems that this code takes a double and 
converts it to a JS date object with some timezone conversion. It seems 
to me that you could write this code entirely in JS and just pass around 
the integer or double values using IDL, to avoid the JSAPI usage 
altogether. That might involve always replacing calDateTime with a 
JS-implemented wrapper, but that's the kind of technique that we need to 
be encouraging in general.


Is that technique a usable workaround for raw JSAPI usage?

--BDS



Re: You want faster builds, don't you?

2013-09-23 Thread Benjamin Smedberg

On 9/23/2013 1:38 AM, Anthony Jones wrote:

On 23/09/13 16:49, Robert O'Callahan wrote:

I see how compiling several files in the same cl invocation would mess
up using /showincludes to track dependencies, making this difficult to
fix. The only possibility I can think of for fixing this is to emit a
Visual Studio project and make the Visual Studio builder responsible for
tracking #include dependencies.

Yes. Difficult.

Let's say you compile 10 files together. If you change any of the header
files for the whole set then you'll need to recompile all of those 10
files.
As long as you have dependency information you don't... you just 
recompile the files that need to be rebuilt.


--BDS



Re: Add-ons Firefox 24 crash due to recent change in JSAPI

2013-09-23 Thread vasuyadavkrishn
On Thursday, September 19, 2013 10:06:30 AM UTC+5:30, vasuyad...@gmail.com 
wrote:
> Hi,
>
> We are facing a problem with our add-on's support for Firefox 24: Firefox is
> crashing. In our earlier approach we were using 'JS_GetGlobalObject' to get
> the global object from the docShell.
> https://developer.mozilla.org/en-US/docs/Mozilla/Projects/SpiderMonkey/JSAPI_reference/JS_GetGlobalObject?redirectlocale=en-US&redirectslug=SpiderMonkey%2FJSAPI_Reference%2FJS_GetGlobalObject
>
> As per the above link, I should use JS_GetGlobalForObject or
> JS_GetGlobalForScopeChain instead of JS_GetGlobalObject. I tried both of
> them, and additionally I tried GetNativeGlobal() and GetGlobalJSObject(),
> but none of them worked out. Firefox is crashing in all the cases.
>
> Could anybody help me with this issue? What changes are required to fix
> this problem? I have pasted the code below for reference.
>
> bool CFFGBrowser::GetJSGlobalContextAndWindow(nsIDocShell *docShell,
>                                               JSContext **jsContext,
>                                               jsval &window)
> {
>     bool isSuccess = false;
>     nsresult rv;
>
>     do
>     {
>         nsCOMPtr< nsIScriptGlobalObject > scriptObj = do_GetInterface( docShell, &rv );
>         if( NS_FAILED( rv ) )
>         {
>             break;
>         }
>
>         nsCOMPtr< nsIScriptContext > scriptContext = scriptObj->GetContext();
>         if ( !scriptContext )
>         {
>             break;
>         }
>
>         *jsContext = static_cast<JSContext*>( scriptContext->GetNativeContext() );
>
>         //JSObject *globalObj1 = scriptContext->GetNativeGlobal();
>         JSObject *globalObj = scriptObj->GetGlobalJSObject();
>
>         CLogFile::Info(111, "CFFGBrowser::GetJSGlobalContextAndObject -start...3.2.");
>
>         //JSObject *globalObj = JS_GetArrayPrototype(*jsContext, globalObj1);
>         //JS::Rooted<JSObject*> globalObj(*jsContext, JS_GetGlobalForScopeChain(*jsContext));
>         //JS::RootedObject *globalObj = static_cast<JSObject*>( scriptContext->GetNativeGlobal() );
>         //JSObject *globalObj2 = JS_GetGlobalForScopeChain(*jsContext);
>         //JSObject *globalObj3 = JS_GetGlobalForObject(*jsContext, globalObj2);
>         //JSObject *globalObj = JS_GetGlobalForScopeChain(scriptContext);
>
>         if ( !jsContext )
>         {
>             break;
>         }
>
>         if ( !globalObj )
>         {
>             break;
>         }
>
>         JSBool isSucceeded = JS_GetProperty( *jsContext, globalObj, "window", &window );
>         if ( ( isSucceeded != JS_TRUE ) || ( window == JSVAL_VOID ) )
>         {
>             break;
>         }
>
>         jsval jsargs[1];
>         //jsval jscaptureSSLText = JSVAL_NULL, rval = JSVAL_NULL;
>         jsval rval = JSVAL_NULL;
>         jsargs[0] = INT_TO_JSVAL(0);
>         //JS::HandleValue captureSSLVal = JS::RootedValue(jsContext, jscaptureSSLText);
>         JS::RootedValue captureSSLVal(*jsContext, jsargs[0]);
>         JS::RootedValue windowVal(*jsContext, window);
>         JS::RootedObject windowObj(*jsContext, JSVAL_TO_OBJECT(window));
>
>         //JS_CallFunctionName
>         JSBool isSucceededd = JS_CallFunctionName(*jsContext,
>                                                   globalObj,
>                                                   "KeynoteCaptureSSLText",
>                                                   1,
>                                                   jsargs,
>                                                   &rval);
>         isSuccess = true;
>     } while(false);
>
>     return isSuccess;
> }
>
> Regards
> Vasu

Thanks a lot, Bobby.
My issue is solved after using nsCxPusher for the JSContext.
Could you suggest an alternative approach to do the same?


Regards
Vasu 


Re: You want faster builds, don't you?

2013-09-23 Thread Mark Hammond

On 23/09/2013 4:10 PM, Gregory Szorc wrote:

thread and to every armchair quarterback that shows up.


Really? :(


[I also see a clobber build spend > 5 minutes in various configure
runs, which frustrates me every time I see it - so I minimize the
shell ;]


We don't have much love for configure either. However, it's only
contributing a few extra minutes to Windows builds compared with 15+
minutes that pymake and make traversal is. I hope you understand why
fixing configure isn't at the top of the priority list at the
moment.


I understand it isn't at the top of the priority list, but it's still
worth keeping it in perspective - I see ~6 minutes of configure for ~30
mins total build time - 20% is significant in anyone's language.


~6 minutes for a configure?! I just did a configure from a clobber build
on Windows on my i7-2600K (2+ year old CPU) and it took 2:10. If you are
seeing 6 minutes configure times, you are running an ancient CPU and/or
not an SSD or you have something fubar with your setup (such as running
a VM inside a VM inside a VM Inception style). If your machine *is*
modern and you don't believe you are doing something silly, *please*
file a bug so we can get to the root cause.


I just did it again, and this time I saw 3 minutes 50 seconds from 
"./mach build" until the first non-configure thing happened.  It looks 
like it ran configure 5 times, so might not be what you mean by "did a 
configure".


I can't explain why I regularly saw > 5 minutes previously, but it's 
still 10% of the build time.  Should I open a bug on that?


[This is all on an i7-3770 CPU with 16GB ram on a spinning RAID disk in 
a native (ie, non-VM) windows 7 - but a ~35 min build time implies it's 
not particularly slow, right?]


Mark
