Re: [Wikitech-l] HipHop forum/mailing list (was HipHop VM support....)

2013-05-12 Thread Rob Lanphier
On Sun, May 12, 2013 at 8:02 PM, Matthew Flaschen
 wrote:
> On 05/12/2013 06:24 PM, Rob Lanphier wrote:
>> On Sun, May 12, 2013 at 10:56 AM, Dmitriy Sintsov  wrote:
>>> On 12.05.2013 1:18, Tyler Romeo wrote:
 FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB
>>>
>>> Perhaps you should send the link to HipHop developers (or to their list, if
>>> there's any).
>>
>> Unfortunately, they don't have a mailing list yet.  We discussed that
>> with them in the meeting we had this past week, and we agreed it would
>> be a good idea to have one, hence this request:
>> https://bugzilla.wikimedia.org/show_bug.cgi?id=48391
>
> In the meantime, they do accept pull requests
> (https://github.com/facebook/hiphop-php/pulls).

...and now the mailing list exists!  (thanks Thehelpfulone!):
https://lists.wikimedia.org/mailman/listinfo/hiphop

Rob

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HipHop forum/mailing list (was HipHop VM support....)

2013-05-12 Thread Matthew Flaschen
On 05/12/2013 06:24 PM, Rob Lanphier wrote:
> On Sun, May 12, 2013 at 10:56 AM, Dmitriy Sintsov  wrote:
>> On 12.05.2013 1:18, Tyler Romeo wrote:
>>> FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB
>>
>> Perhaps you should send the link to HipHop developers (or to their list, if
>> there's any).
> 
> Unfortunately, they don't have a mailing list yet.  We discussed that
> with them in the meeting we had this past week, and we agreed it would
> be a good idea to have one, hence this request:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=48391

In the meantime, they do accept pull requests
(https://github.com/facebook/hiphop-php/pulls).

Matt Flaschen


Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-12 Thread Tim Starling
On 10/05/13 22:05, Jeroen De Dauw wrote:
> Question: Is it possible to use HipHop for only part of a codebase? For
> instance, if there is some MW extension that depends on a third-party
> library we have no control over that makes use of SPL, would this cause
> problems?

It would be possible to have a separate cluster running regular PHP,
and to access it via internal API requests or by frontend routing,
similar to how we do image scaling. But the benefit would need to be
pretty strong to justify that complexity.

-- Tim Starling



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-12 Thread Tyler Romeo
On Sun, May 12, 2013 at 1:56 PM, Dmitriy Sintsov  wrote:

> Perhaps you should send the link to HipHop developers (or to their list,
> if there's any).


Once I finish it and it's working I plan on doing just that.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

[Wikitech-l] HipHop forum/mailing list (was HipHop VM support....)

2013-05-12 Thread Rob Lanphier
On Sun, May 12, 2013 at 10:56 AM, Dmitriy Sintsov  wrote:
> On 12.05.2013 1:18, Tyler Romeo wrote:
>> FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB
>
> Perhaps you should send the link to HipHop developers (or to their list, if
> there's any).

Unfortunately, they don't have a mailing list yet.  We discussed that
with them in the meeting we had this past week, and we agreed it would
be a good idea to have one, hence this request:
https://bugzilla.wikimedia.org/show_bug.cgi?id=48391

In the meantime, there is a Facebook page you can post to:
https://www.facebook.com/hphp

Rob


Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-12 Thread Dmitriy Sintsov

On 12.05.2013 1:18, Tyler Romeo wrote:

FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB

I haven't tested it yet because my PHP environment is not behaving, and the
only class I haven't implemented fully is SplHeap.


Perhaps you should send the link to HipHop developers (or to their list, 
if there's any).



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Tyler Romeo
FWIW, here is what I have so far: http://pastebin.com/hUQ92DfB

I haven't tested it yet because my PHP environment is not behaving, and the
only class I haven't implemented fully is SplHeap.
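For anyone curious what an SplHeap stand-in involves, a hypothetical sketch of a simplified pure-PHP min-heap is below. The class name and method set are illustrative only; this is neither the pastebin code nor a faithful copy of the SplHeap API (no Iterator support, no heap-corruption handling):

```php
<?php
// Hypothetical simplified min-heap in pure PHP (illustrative only; not
// the real SplHeap API). Backed by a plain array with sift-up/sift-down.
class PurePhpMinHeap {
	protected $heap = array();

	// Positive when $a should sit nearer the top than $b (min-heap order).
	protected function compare( $a, $b ) {
		return $b - $a;
	}

	public function insert( $value ) {
		$this->heap[] = $value;
		$i = count( $this->heap ) - 1;
		// Sift the new element up until the heap property holds.
		while ( $i > 0 ) {
			$p = ( $i - 1 ) >> 1;
			if ( $this->compare( $this->heap[$i], $this->heap[$p] ) <= 0 ) {
				break;
			}
			list( $this->heap[$i], $this->heap[$p] ) =
				array( $this->heap[$p], $this->heap[$i] );
			$i = $p;
		}
	}

	public function extract() {
		if ( !$this->heap ) {
			throw new RuntimeException( 'Heap is empty' );
		}
		$top = $this->heap[0];
		$last = array_pop( $this->heap );
		if ( $this->heap ) {
			// Move the last element to the root and sift it down.
			$this->heap[0] = $last;
			$i = 0;
			$n = count( $this->heap );
			while ( true ) {
				$best = $i;
				foreach ( array( 2 * $i + 1, 2 * $i + 2 ) as $child ) {
					if ( $child < $n && $this->compare(
						$this->heap[$child], $this->heap[$best] ) > 0
					) {
						$best = $child;
					}
				}
				if ( $best === $i ) {
					break;
				}
				list( $this->heap[$i], $this->heap[$best] ) =
					array( $this->heap[$best], $this->heap[$i] );
				$i = $best;
			}
		}
		return $top;
	}

	public function top() {
		return $this->heap[0];
	}

	public function count() {
		return count( $this->heap );
	}
}
```

Subclassing and overriding compare() would give max-heap or priority-queue behaviour from the same skeleton.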

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Sat, May 11, 2013 at 8:57 AM, Chad  wrote:

> On Sat, May 11, 2013 at 3:31 AM, Dmitriy Sintsov 
> wrote:
> > On 10.05.2013 17:58, Chad wrote:
> >>
> >> On Fri, May 10, 2013 at 8:05 AM, Jeroen De Dauw  >
> >> wrote:
> >>>
> >>> Hey,
> >>>
> >>> I can see why SPL might require extra work in HipHop to support. At the
> >>> same time I find it somewhat unfortunate this means one cannot use the
> >>> Standard PHP Library.
> >>>
> >> Yeah, but I think it's a workable issue. And the HH team seems very
> >> amenable to feature requests (and patches!), so implementing parts of
> >> the SPL are certainly possible over the long term.
> >>
> >> As Tim points out, for ArrayObject and filter_var() it's non-trivial to
> >> implement (even Zend's implementation of the former is 2000+ LOC).
> >>
> > System and development software, such as OSes, compilers, language
> > libraries and different kinds of VMs, is really huge; 2000+ lines of
> > code is actually a *small* amount.
> > It's not a framework or a wiki; it's development software written in a
> > low-level language.
>
> Indeed, 2000 isn't really a lot, I was just trying to give an order of
> magnitude
> so people wouldn't think it was like <50 LOC or something.
>
> -Chad
>

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Chad
On Sat, May 11, 2013 at 3:31 AM, Dmitriy Sintsov  wrote:
> On 10.05.2013 17:58, Chad wrote:
>>
>> On Fri, May 10, 2013 at 8:05 AM, Jeroen De Dauw 
>> wrote:
>>>
>>> Hey,
>>>
>>> I can see why SPL might require extra work in HipHop to support. At the
>>> same time I find it somewhat unfortunate this means one cannot use the
>>> Standard PHP Library.
>>>
>> Yeah, but I think it's a workable issue. And the HH team seems very
>> amenable to feature requests (and patches!), so implementing parts of
>> the SPL are certainly possible over the long term.
>>
>> As Tim points out, for ArrayObject and filter_var() it's non-trivial to
>> implement (even Zend's implementation of the former is 2000+ LOC).
>>
> System and development software, such as OSes, compilers, language
> libraries and different kinds of VMs, is really huge; 2000+ lines of
> code is actually a *small* amount.
> It's not a framework or a wiki; it's development software written in a
> low-level language.

Indeed, 2000 isn't really a lot, I was just trying to give an order of magnitude
so people wouldn't think it was like <50 LOC or something.

-Chad


Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Dmitriy Sintsov

On 10.05.2013 23:51, Antoine Musso wrote:

Le 10/05/13 03:10, Tim Starling a écrit :

There's a few other SPL features that we don't use at the moment and
we should avoid introducing if possible due to lack of support in HipHop:



I wish we actually used SPL :-]  They are nice classes providing all
kinds of useful features: http://php.net/manual/en/book.spl.php

Yes, it is strange that Facebook does not need SPL, especially since
PHP 5.3, where SPL is no longer optional.




Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-11 Thread Dmitriy Sintsov

On 10.05.2013 17:58, Chad wrote:

On Fri, May 10, 2013 at 8:05 AM, Jeroen De Dauw  wrote:

Hey,

I can see why SPL might require extra work in HipHop to support. At the
same time I find it somewhat unfortunate this means one cannot use the
Standard PHP Library.


Yeah, but I think it's a workable issue. And the HH team seems very
amenable to feature requests (and patches!), so implementing parts of
the SPL are certainly possible over the long term.

As Tim points out, for ArrayObject and filter_var() it's non-trivial to
implement (even Zend's implementation of the former is 2000+ LOC).

System and development software, such as OSes, compilers, language
libraries and different kinds of VMs, is really huge; 2000+ lines of
code is actually a *small* amount.
It's not a framework or a wiki; it's development software written in a
low-level language.

Dmitriy



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Antoine Musso
Le 10/05/13 03:10, Tim Starling a écrit :
> There's a few other SPL features that we don't use at the moment and
> we should avoid introducing if possible due to lack of support in HipHop:



I wish we actually used SPL :-]  They are nice classes providing all
kinds of useful features: http://php.net/manual/en/book.spl.php

Although I fully agree we should not introduce more of them just to be
able to use HipHop, I think we should also look at having them
implemented in HipHop.

-- 
Antoine "hashar" Musso



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Antoine Musso
Le 10/05/13 03:10, Tim Starling a écrit :
> So I'd like to suggest that we refuse new code submissions in Gerrit
> that use these features, if they are targeted for WMF production, 

This can be done using CodeSniffer.  We can write a specific style
standard, such as "WikimediaProduction", that emits an error whenever
one of those functions or classes is encountered.
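As a standalone illustration of the idea (using PHP's built-in tokenizer rather than the real PHP_CodeSniffer sniff API; the function name and the forbidden list here are examples, not an actual Wikimedia standard):

```php
<?php
// Hypothetical sketch of the check such a standard would perform,
// written against PHP's built-in tokenizer so it runs standalone.
// Returns array( array( line, name ), ... ) for each offending token.
function findForbiddenHipHopTokens( $source ) {
	$forbidden = array( 'filter_var', 'arrayobject' ); // example list
	$errors = array();
	foreach ( token_get_all( $source ) as $token ) {
		if ( is_array( $token ) && $token[0] === T_STRING
			&& in_array( strtolower( $token[1] ), $forbidden, true )
		) {
			$errors[] = array( $token[2], $token[1] );
		}
	}
	return $errors;
}
```

A real sniff would wrap the same token walk in CodeSniffer's register()/process() interface and report via addError() instead of returning an array.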

-- 
Antoine "hashar" Musso



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Chad
On Fri, May 10, 2013 at 8:05 AM, Jeroen De Dauw  wrote:
> Hey,
>
> I can see why SPL might require extra work in HipHop to support. At the
> same time I find it somewhat unfortunate this means one cannot use the
> Standard PHP Library.
>

Yeah, but I think it's a workable issue. And the HH team seems very
amenable to feature requests (and patches!), so implementing parts of
the SPL are certainly possible over the long term.

As Tim points out, for ArrayObject and filter_var() it's non-trivial to
implement (even Zend's implementation of the former is 2000+ LOC).

> Question: Is it possible to use HipHop for only part of a codebase? For
> instance, if there is some MW extension that depends on a third-party
> library we have no control over that makes use of SPL, would this cause
> problems?
>

I don't think so, as HipHop is a standalone implementation of PHP and
doesn't even really know about the php binary/libraries.

> Suggestion: If such a policy is implemented, it'd be great to have a job
> that checks for violations of it on our CI.
>

This is a very good idea, and something I'll take a stab at handling.

-Chad


Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Chad
On Fri, May 10, 2013 at 9:49 AM, Max Semenik  wrote:
> On 10.05.2013, 5:10 Tim wrote:
>
>> Several people from the HipHop team at Facebook just met with several
>> people from WMF. Also, in the last couple of days, I've been doing
>> some research into what it would take to make MediaWiki support HipHop
>> VM. The answer is: not very much.
>
>> There's two features that we use, mostly in extensions, that the
>> Facebook people are not keen to implement due to their complexity:
>> ArrayObject and filter_var(). It seems that it would be much easier
>> for us to stop using them than for those features to be implemented in
>> HipHop.
>
>> So I'd like to suggest that we refuse new code submissions in Gerrit
>> that use these features, if they are targeted for WMF production, and
>> that we start work on migrating away from the existing uses of those
>> features.
>
>> There's a few other SPL features that we don't use at the moment and
>> we should avoid introducing if possible due to lack of support in HipHop:
>
> What about namespaces? Last time I heard HH didn't support them - is
> it safe to assume that support for them is coming?
>

Yes. As I understood it yesterday, this is just about to land in
master.

-Chad


Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Max Semenik
On 10.05.2013, 5:10 Tim wrote:

> Several people from the HipHop team at Facebook just met with several
> people from WMF. Also, in the last couple of days, I've been doing
> some research into what it would take to make MediaWiki support HipHop
> VM. The answer is: not very much.

> There's two features that we use, mostly in extensions, that the
> Facebook people are not keen to implement due to their complexity:
> ArrayObject and filter_var(). It seems that it would be much easier
> for us to stop using them than for those features to be implemented in
> HipHop.

> So I'd like to suggest that we refuse new code submissions in Gerrit
> that use these features, if they are targeted for WMF production, and
> that we start work on migrating away from the existing uses of those
> features.

> There's a few other SPL features that we don't use at the moment and
> we should avoid introducing if possible due to lack of support in HipHop:

What about namespaces? Last time I heard HH didn't support them - is
it safe to assume that support for them is coming?

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-10 Thread Jeroen De Dauw
Hey,

I can see why SPL might require extra work in HipHop to support. At the
same time I find it somewhat unfortunate this means one cannot use the
Standard PHP Library.

Question: Is it possible to use HipHop for only part of a codebase? For
instance, if there is some MW extension that depends on a third-party
library we have no control over that makes use of SPL, would this cause
problems?

Suggestion: If such a policy is implemented, it'd be great to have a job
that checks for violations of it on our CI.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Tyler Romeo
On Thu, May 9, 2013 at 9:57 PM, Tim Starling wrote:

> If you can write a complete and accurate compatibility class in pure
> PHP, then it can be included in HipHop in the system/classes directory.
>

Interesting. I'll try and work on this.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Tim Starling
On 10/05/13 11:48, Tyler Romeo wrote:
> On Thu, May 9, 2013 at 9:46 PM, Tim Starling wrote:
> 
>> You can always implement the parts you need in pure PHP.
> 
> 
> True. It might be worthwhile to make some sort of SPL compatibility library
> that loads in PHP versions of those classes if they do not exist. That way
> we can use them without dropping HHVM support.

If you can write a complete and accurate compatibility class in pure
PHP, then it can be included in HipHop in the system/classes directory.

-- Tim Starling



Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Tyler Romeo
On Thu, May 9, 2013 at 9:46 PM, Tim Starling wrote:

> You can always implement the parts you need in pure PHP.


True. It might be worthwhile to make some sort of SPL compatibility library
that loads in PHP versions of those classes if they do not exist. That way
we can use them without dropping HHVM support.
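A minimal sketch of that loading scheme, with SplStack as the example (assuming a class_exists() guard is acceptable on the target runtime; only a small slice of the API is shown, and the error messages are illustrative):

```php
<?php
// Hypothetical compatibility shim: define a pure-PHP fallback only when
// the native SPL class is missing. Covers a small slice of SplStack.
if ( !class_exists( 'SplStack' ) ) {
	class SplStack {
		private $items = array();

		public function push( $value ) {
			$this->items[] = $value;
		}

		public function pop() {
			if ( !$this->items ) {
				throw new RuntimeException( 'Cannot pop an empty stack' );
			}
			return array_pop( $this->items );
		}

		public function top() {
			if ( !$this->items ) {
				throw new RuntimeException( 'Cannot peek at an empty stack' );
			}
			return $this->items[count( $this->items ) - 1];
		}

		public function isEmpty() {
			return !$this->items;
		}

		public function count() {
			return count( $this->items );
		}
	}
}
```

Code written against SplStack then behaves the same whether or not the runtime ships the native class.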

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Tim Starling
On 10/05/13 11:28, Daniel Friesen wrote:
> On Thu, 09 May 2013 18:10:53 -0700, Tim Starling
>  wrote:
> 
>> * SplDoublyLinkedList
>> * SplFixedArray
>> * SplHeap
>> * SplMaxHeap
>> * SplMinHeap
>> * SplPriorityQueue
>> * SplQueue
>> * SplStack
> 
> It would be nice if HHVM would support these. I'm not sure what is so
> complex about these that HHVM can implement array() but not these
> simple classes.
> 
> While working on my skin rewrite ideas, trying to implement the
> template syntax parsing was much MUCH more readable and less prone to
> mistakes (like those that would occur if php cloned the array() in a
> place it shouldn't cause I typed something wrong) when working with
> SplDoublyLinkedList, SplQueue, and SplStack.

You can always implement the parts you need in pure PHP.

-- Tim Starling





Re: [Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Daniel Friesen
On Thu, 09 May 2013 18:10:53 -0700, Tim Starling   
wrote:



* SplDoublyLinkedList
* SplFixedArray
* SplHeap
* SplMaxHeap
* SplMinHeap
* SplPriorityQueue
* SplQueue
* SplStack


It would be nice if HHVM would support these. I'm not sure what is so  
complex about these that HHVM can implement array() but not these simple  
classes.


While working on my skin rewrite ideas, trying to implement the template  
syntax parsing was much MUCH more readable and less prone to mistakes  
(like those that would occur if PHP cloned the array() in a place it
shouldn't because I typed something wrong) when working with
SplDoublyLinkedList, SplQueue, and SplStack.



-- Tim Starling


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



[Wikitech-l] HipHop VM support: ArrayObject and filter_var()

2013-05-09 Thread Tim Starling
Several people from the HipHop team at Facebook just met with several
people from WMF. Also, in the last couple of days, I've been doing
some research into what it would take to make MediaWiki support HipHop
VM. The answer is: not very much.

There's two features that we use, mostly in extensions, that the
Facebook people are not keen to implement due to their complexity:
ArrayObject and filter_var(). It seems that it would be much easier
for us to stop using them than for those features to be implemented in
HipHop.

So I'd like to suggest that we refuse new code submissions in Gerrit
that use these features, if they are targeted for WMF production, and
that we start work on migrating away from the existing uses of those
features.

There's a few other SPL features that we don't use at the moment and
we should avoid introducing if possible due to lack of support in HipHop:

* CachingIterator
* EmptyIterator
* GlobIterator
* InfiniteIterator
* LimitIterator
* MultipleIterator
* NoRewindIterator
* ParentIterator
* RecursiveArrayIterator
* RecursiveCachingIterator
* RecursiveFilterIterator
* RecursiveRegexIterator
* RecursiveTreeIterator
* RegexIterator
* SplDoublyLinkedList
* SplFixedArray
* SplHeap
* SplMaxHeap
* SplMinHeap
* SplPriorityQueue
* SplQueue
* SplStack
* SplTempFileObject

We are not yet promising that we are indeed going to start using
HipHop in WMF production, and we don't have any timetables. But HipHop
has evolved to the point where supporting it is almost trivial, at
least for test installations, so I think it makes sense to establish
policies which will avoid making migration to HipHop more difficult.
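To make the migration concrete, here is a hypothetical sketch of replacing one filter_var() use with plain PHP. It stands in for filter_var( $ip, FILTER_VALIDATE_IP, FILTER_FLAG_IPV4 ) and is not claimed to match filter_var's edge-case behaviour exactly:

```php
<?php
// Hypothetical replacement for one filter_var() mode: validate a dotted
// IPv4 address explicitly instead of via FILTER_VALIDATE_IP. Simplified;
// edge-case parity with filter_var() is not claimed.
function isValidIPv4( $ip ) {
	$parts = explode( '.', $ip );
	if ( count( $parts ) !== 4 ) {
		return false;
	}
	foreach ( $parts as $part ) {
		// Each octet: digits only, no leading zero, value 0-255.
		if ( $part === '' || !ctype_digit( $part ) ) {
			return false;
		}
		if ( strlen( $part ) > 1 && $part[0] === '0' ) {
			return false;
		}
		if ( (int)$part > 255 ) {
			return false;
		}
	}
	return true;
}
```

Similar small, explicit validators would cover the handful of filter_var() call sites in extensions without needing HipHop to implement the whole filter extension.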

-- Tim Starling



Re: [Wikitech-l] HipHop

2011-04-05 Thread Magnus Manske
On Tue, Apr 5, 2011 at 10:07 PM, David Gerard  wrote:
> On 5 April 2011 21:29, Aryeh Gregor  wrote:
>
>> To the contrary, I'd expect JavaScript in the most recent version of
>> any browser (even IE) to be *much* faster than PHP, maybe ten times
>> faster on real-world tasks.  All browsers now use JIT compilation for
>> JavaScript, and have been competing intensively on raw JavaScript
>> speed for the last three years or so.  There are no drop-in
>> alternative PHP implementations, so PHP is happy sticking with a
>> ridiculously slow interpreter forever.
>
>
> So if we machine-translate the parser into JS, we can get the user to
> do the work and everyone wins! [*]
>
> (Magnus, did you do something like this for WYSIFTW?)

Nope, all hand-rolled, just like my last 50 or so parser wannabe
implementations ;-)

(this one has the advantage of a "dunno what this is, just keep the
wikitext" fallback, which helped a lot)

Magnus


Re: [Wikitech-l] HipHop

2011-04-05 Thread David Gerard
On 5 April 2011 21:29, Aryeh Gregor  wrote:

> To the contrary, I'd expect JavaScript in the most recent version of
> any browser (even IE) to be *much* faster than PHP, maybe ten times
> faster on real-world tasks.  All browsers now use JIT compilation for
> JavaScript, and have been competing intensively on raw JavaScript
> speed for the last three years or so.  There are no drop-in
> alternative PHP implementations, so PHP is happy sticking with a
> ridiculously slow interpreter forever.


So if we machine-translate the parser into JS, we can get the user to
do the work and everyone wins! [*]

(Magnus, did you do something like this for WYSIFTW?)


- d.

[*] if they're using a recent browser on a recent computer, etc etc, ymmv.


Re: [Wikitech-l] HipHop

2011-04-05 Thread Aryeh Gregor
On Tue, Apr 5, 2011 at 9:02 AM, Magnus Manske
 wrote:
> Yes, it doesn't do template/variable replacing, and it's probably full
> of corner cases that break; OTOH, it's JavaScript running in a
> browser, which should make it much slower than a dedicated server
> setup running precompiled PHP.

To the contrary, I'd expect JavaScript in the most recent version of
any browser (even IE) to be *much* faster than PHP, maybe ten times
faster on real-world tasks.  All browsers now use JIT compilation for
JavaScript, and have been competing intensively on raw JavaScript
speed for the last three years or so.  There are no drop-in
alternative PHP implementations, so PHP is happy sticking with a
ridiculously slow interpreter forever.

Alioth's language benchmarks show JavaScript in Chrome's V8 (they
don't say what version) as being at least eight times faster than PHP
on most of its benchmarks, although it's slower on two:

http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=v8&lang2=php

Obviously not very scientific, but it gives you an idea.



Re: [Wikitech-l] HipHop

2011-04-05 Thread Paul Copperman
2011/4/5 Magnus Manske :
> So is the time spent with the actual expansion (replacing variables),
> or getting the wikitext for n-depth template recursion? Or is it the
> parser functions?
>
Well, getting the wikitext shouldn't be very expensive as it is cached
in several cache layers. Basically it's just expanding many, many
preprocessor nodes. A while ago I did a bit of testing with my
template tool on dewiki[1] and found that wikimedia servers spend
approx. 0.2 ms per expanded node part, although there's of course much
variation depending on current load. My tool counts 303,905 nodes when
expanding [[Barack Obama]] so that would account for about 60 s of
render time. As already said, YMMV.

Paul Copperman

[1] ,
you can test it with http://de.wikipedia.org/w/index.php?action=raw&title=Benutzer:P.Copp/scripts/templateutil.js&ctype=text/javascript
and a click on "Template tools" in the toolbox



Re: [Wikitech-l] HipHop

2011-04-05 Thread Magnus Manske
On Tue, Apr 5, 2011 at 3:30 PM, Paul Copperman
 wrote:
> 2011/4/5 Magnus Manske :
>> For comparison: WYSIFTW parses [[Barak Obama]] in 3.5 sec on my iMac,
>> and in 4.4 sec on my MacBook (both Chrome 12).
>>
>> Yes, it doesn't do template/variable replacing, and it's probably full
>> of corner cases that break; OTOH, it's JavaScript running in a
>> browser, which should make it much slower than a dedicated server
>> setup running precompiled PHP.
>>
>
> Seriously, the bulk of the time needed to parse these enwiki articles
> is for template expansion. If you pre-expand them, taking care that
> also the templates in ... tags get expanded, MediaWiki can
> parse the article in a few seconds, 3-4 on my laptop.

So is the time spent with the actual expansion (replacing variables),
or getting the wikitext for n-depth template recursion? Or is it the
parser functions?



Re: [Wikitech-l] HipHop

2011-04-05 Thread Paul Copperman
2011/4/5 Magnus Manske :
> For comparison: WYSIFTW parses [[Barak Obama]] in 3.5 sec on my iMac,
> and in 4.4 sec on my MacBook (both Chrome 12).
>
> Yes, it doesn't do template/variable replacing, and it's probably full
> of corner cases that break; OTOH, it's JavaScript running in a
> browser, which should make it much slower than a dedicated server
> setup running precompiled PHP.
>

Seriously, the bulk of the time needed to parse these enwiki articles
is for template expansion. If you pre-expand them, taking care that
the templates in ... tags also get expanded, MediaWiki can
parse the article in a few seconds, 3-4 on my laptop.



Re: [Wikitech-l] HipHop

2011-04-05 Thread Tim Starling
On 04/05/2011 10:43 PM, Ilmari Karonen wrote:
> On 04/05/2011 02:42 AM, Tim Starling wrote:
>> You can't include the class files for compiled classes, they all exist
>> at startup and you get a redeclaration error if you try. I explained
>> this in the documentation page on mediawiki.org which has now appeared.
>>
>> http://www.mediawiki.org/wiki/HipHop
>>
>> The autoloader is not used. It doesn't matter where the class_exists()
>> is. HipHop scans the entire codebase for class_exists() at compile
> time, and breaks any classes it finds, whether or not the
> class_exists() is reachable.
>
> These both really sound like bugs in HipHop.  I've no idea how hard it
> would be to fix them, but are we reporting them at least?

I reported the class_exists() thing. The fact that classes exist on 
startup is more of a design decision than a bug. I don't want to annoy 
the HipHop devs with frivolous bug reports; we need a lot of favours 
from them.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-05 Thread Tim Starling
On 04/05/2011 10:39 PM, Ilmari Karonen wrote:
> On 04/05/2011 05:47 AM, Tim Starling wrote:
>>
>> Speaking of "fast", I did a quick benchmark of the [[Barack Obama]]
>> article with templates pre-expanded. It took 22 seconds in HipHop and
>> 112 seconds in Zend, which is not bad, for a first attempt. I reckon
>> it would do better if a few of the regular expressions were replaced
>> with tight loops.
>
> Hmm, does HipHop precompile regexen?

No. Its regex handling is the same as Zend's. It uses PCRE with a 
cache of compiled regexes, generated at runtime.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-05 Thread Domas Mituzas
> For comparison: WYSIFTW parses [[Barak Obama]] in 3.5 sec on my iMac,
> and in 4.4 sec on my MacBook (both Chrome 12).

Try parsing [[Barack Obama]], 4s spent on parsing a redirect page is
quite a lot (albeit it has some vandalism)
OTOH, my macbook shows raw wikitext pretty much immediately. Parser is
definitely the issue.

Domas



Re: [Wikitech-l] HipHop: need developers

2011-04-05 Thread Chad
On Tue, Apr 5, 2011 at 12:58 AM, Tim Starling  wrote:
> I need to get back to work on reviews and deployments and other such
> strategic things. I hope the work I've done on HipHop support is
> enough to get the project started.
>
> Initial benchmarks are showing a 5x speedup for article parsing, which
> is better than we had hoped for. So there are big gains to be had
> here, both for small users and for large websites like Wikimedia and
> Wikia.
>
> There's a list of things that still need doing under the "to do"
> heading at:
>
> http://www.mediawiki.org/wiki/HipHop
>
> I'll be available to support any developers who want to work on this.
>
> -- Tim Starling
>

This is something I'm interested in and hope to put a little more time
into in the near future once the 1.17 release is behind us.

-Chad



Re: [Wikitech-l] HipHop

2011-04-05 Thread Magnus Manske
On Tue, Apr 5, 2011 at 7:45 AM, Ashar Voultoiz  wrote:
> On 05/04/11 04:47, Tim Starling wrote:
>  > Speaking of "fast", I did a quick benchmark of the [[Barack Obama]]
>  > article with templates pre-expanded. It took 22 seconds in HipHop and
>  > 112 seconds in Zend, which is not bad, for a first attempt. I reckon
>  > it would do better if a few of the regular expressions were replaced
>  > with tight loops.
> 
> I have imported the English [[Barack Obama]] article into my local wiki
> with all its dependencies. I cannot get it parsed within either a 256MB
> memory limit or a 1-minute max execution time limit.
> HipHop helps, but there is still highly broken code somewhere in our
> PHP source code.  No matter how many hacks we throw at bad code, the
> algorithm still needs to be fixed.

For comparison: WYSIFTW parses [[Barak Obama]] in 3.5 sec on my iMac,
and in 4.4 sec on my MacBook (both Chrome 12).

Yes, it doesn't do template/variable replacing, and it's probably full
of corner cases that break; OTOH, it's JavaScript running in a
browser, which should make it much slower than a dedicated server
setup running precompiled PHP.

So, maybe another hard look at the MediaWiki parser is in order?

Cheers,
Magnus


Re: [Wikitech-l] HipHop

2011-04-05 Thread Ilmari Karonen
On 04/05/2011 02:42 AM, Tim Starling wrote:
> On 04/05/2011 07:31 AM, Platonides wrote:
>> Having to create a reflection class and look for exceptions just to
>> check for class existence is really ugly.
>> However, looking at https://github.com/facebook/hiphop-php/issues/314 it
>> seems to be a declaration before use problem (even though class_exists
>> shouldn't be declaring it). I suppose that we could work around that by
>> including all classes at the beginning instead of using the AutoLoader,
>> which shouldn't be needed for compiled code.
>
> You can't include the class files for compiled classes, they all exist
> at startup and you get a redeclaration error if you try. I explained
> this in the documentation page on mediawiki.org which has now appeared.
>
> http://www.mediawiki.org/wiki/HipHop
>
> The autoloader is not used. It doesn't matter where the class_exists()
> is. HipHop scans the entire codebase for class_exists() at compile
> time, and breaks any classes it finds, whether or not the
> class_exists() is reachable.

These both really sound like bugs in HipHop.  I've no idea how hard it 
would be to fix them, but are we reporting them at least?

-- 
Ilmari Karonen



Re: [Wikitech-l] HipHop

2011-04-05 Thread Ilmari Karonen
On 04/05/2011 05:47 AM, Tim Starling wrote:
>
> Speaking of "fast", I did a quick benchmark of the [[Barack Obama]]
> article with templates pre-expanded. It took 22 seconds in HipHop and
> 112 seconds in Zend, which is not bad, for a first attempt. I reckon
> it would do better if a few of the regular expressions were replaced
> with tight loops.

Hmm, does HipHop precompile regexen?

-- 
Ilmari Karonen



Re: [Wikitech-l] HipHop

2011-04-05 Thread Tim Starling
On 04/05/2011 04:45 PM, Ashar Voultoiz wrote:
> On 05/04/11 04:47, Tim Starling wrote:
>   >  Speaking of "fast", I did a quick benchmark of the [[Barack Obama]]
>   >  article with templates pre-expanded. It took 22 seconds in HipHop and
>   >  112 seconds in Zend, which is not bad, for a first attempt. I reckon
>   >  it would do better if a few of the regular expressions were replaced
>   >  with tight loops.
> 
> I have imported into my local wiki the English [[Barack Obama]] article
> with all its dependencies. I cannot get it parsed under either the 256MB
> memory limit or the one-minute execution time limit.
> Hiphop helps, but there is still some highly broken code somewhere in our
> PHP source code.  No matter how many hacks we throw at bad code, the
> algorithm still needs to be fixed.

Let me know when you find that broken code. Try using the profiling 
feature from xdebug to narrow down the causes of CPU usage.

>> Also, browsing the generated source turns up silly things like:
>>
>> if (equal(switch2, (NAMSTR(s_ss34c5c84c, "currentmonth"))))
>>  goto case_2_0;
>> if (equal(switch2, (NAMSTR(s_ss55b88086, "currentmonth1"))))
>>  goto case_2_1;
> 
>> 71 string comparisons in total, in quite a hot function. A hashtable
>> would probably be better.
>
> As I understand it, hiphop is just a straight translator from PHP to C
> language but does not actually enhance the code. Just like you would use
> Google translator instead of Charles Baudelaire [1].

It's not just a translator, it's also a reimplementation of the bulk 
of the PHP core.

> Your code above comes from Parser.php getVariableValue(), which uses a long
> switch() structure to map a string to a method call. If you manage to
> find unoptimized code in the translated code, fix it in the PHP source
> code :-b

In Zend PHP, switch statements are implemented by making a hashtable 
at compile time, and then doing a hashtable lookup at runtime. The 
HipHop implementation is less efficient. So getVariableValue() is not 
broken, it's just not optimised for HipHop.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-04 Thread Ashar Voultoiz
On 05/04/11 04:47, Tim Starling wrote:
 > Speaking of "fast", I did a quick benchmark of the [[Barack Obama]]
 > article with templates pre-expanded. It took 22 seconds in HipHop and
 > 112 seconds in Zend, which is not bad, for a first attempt. I reckon
 > it would do better if a few of the regular expressions were replaced
 > with tight loops.

I have imported into my local wiki the English [[Barack Obama]] article
with all its dependencies. I cannot get it parsed under either the 256MB
memory limit or the one-minute execution time limit.
Hiphop helps, but there is still some highly broken code somewhere in our
PHP source code.  No matter how many hacks we throw at bad code, the
algorithm still needs to be fixed.

> Also, browsing the generated source turns up silly things like:
>
> if (equal(switch2, (NAMSTR(s_ss34c5c84c, "currentmonth"))))
>   goto case_2_0;
> if (equal(switch2, (NAMSTR(s_ss55b88086, "currentmonth1"))))
>   goto case_2_1;

> 71 string comparisons in total, in quite a hot function. A hashtable
> would probably be better.

As I understand it, HipHop is just a straight translator from PHP to C++
and does not actually improve the code, much as you might use Google
Translate instead of Charles Baudelaire [1].

Your code above comes from Parser.php getVariableValue(), which uses a long
switch() structure to map a string to a method call. If you manage to 
find unoptimized code in the translated code, fix it in the PHP source 
code :-b

[1] French poet, probably best known in the UK/US for his translations of
Edgar Allan Poe's works from English into French.

-- 
Ashar Voultoiz




[Wikitech-l] HipHop: need developers

2011-04-04 Thread Tim Starling
I need to get back to work on reviews and deployments and other such 
strategic things. I hope the work I've done on HipHop support is 
enough to get the project started.

Initial benchmarks are showing a 5x speedup for article parsing, which 
is better than we had hoped for. So there are big gains to be had 
here, both for small users and for large websites like Wikimedia and 
Wikia.

There's a list of things that still need doing under the "to do" 
heading at:

http://www.mediawiki.org/wiki/HipHop

I'll be available to support any developers who want to work on this.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-04 Thread Brandon Harris

I am surely not the only person who gets Blondie's "Rapture" stuck in 
their head whenever they see the topic for this thread?

No?

Ugh.



Re: [Wikitech-l] HipHop

2011-04-04 Thread Tim Starling
On 03/30/2011 09:22 AM, Magnus Manske wrote:
> Plus, free C++ MediaWiki parser ;-)
>
> Seriously, there should be a way to turn the entire package into a
> (huge) library; maybe transpile it and then replace the C++ code for
> index.php with a manually written library interface?

HipHop has a library generation feature. It even has an option to 
provide public interfaces with human-readable names.

> Offline readers, scientific analysis tools, etc. could profit
> massively from an always-current, fast C++ library...

Speaking of "fast", I did a quick benchmark of the [[Barack Obama]] 
article with templates pre-expanded. It took 22 seconds in HipHop and 
112 seconds in Zend, which is not bad, for a first attempt. I reckon 
it would do better if a few of the regular expressions were replaced 
with tight loops.

Also, browsing the generated source turns up silly things like:

if (equal(switch2, (NAMSTR(s_ss34c5c84c, "currentmonth"))))
  goto case_2_0;
if (equal(switch2, (NAMSTR(s_ss55b88086, "currentmonth1"))))
  goto case_2_1;
if (equal(switch2, (NAMSTR(s_ss0ccbf467, "currentmonthname"))))
  goto case_2_2;
if (equal(switch2, (NAMSTR(s_ss513d5737, "currentmonthnamegen"))))
  goto case_2_3;
if (equal(switch2, (NAMSTR(s_ss004d8db5, "currentmonthabbrev"))))
  goto case_2_4;
if (equal(switch2, (NAMSTR(s_ssf9584d41, "currentday"))))
  goto case_2_5;

71 string comparisons in total, in quite a hot function. A hashtable 
would probably be better.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-04 Thread Tim Starling
On 04/05/2011 07:31 AM, Platonides wrote:
> Having to create a reflection class and look for exceptions just to
> check for class existence is really ugly.
> However, looking at https://github.com/facebook/hiphop-php/issues/314 it
> seems to be a declaration before use problem (even though class_exists
> shouldn't be declaring it). I suppose that we could work around that by
> including all classes at the beginning instead of using the AutoLoader,
> which shouldn't be needed for compiled code.

You can't include the class files for compiled classes, they all exist 
at startup and you get a redeclaration error if you try. I explained 
this in the documentation page on mediawiki.org which has now appeared.

http://www.mediawiki.org/wiki/HipHop

The autoloader is not used. It doesn't matter where the class_exists() 
is. HipHop scans the entire codebase for class_exists() at compile 
time, and breaks any classes it finds, whether or not the
class_exists() is reachable.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-04 Thread Tim Starling
On 04/05/2011 04:31 AM, Inez Korczynski wrote:
> Hi Tim,
>
> I have no problem running foo.php and test.php that you sent in hphpi, but
> also it compiles and run without any problems with hphp. What command
> exactly do you use to compile and then execute?

To compile:

hphp --target=cpp --format=exe --input-dir=. \
-i class-test.php -i class-test-2.php \
-i class-test-3.php -i define-test.php \
-c../compiler.conf --parse-on-demand=true \
--program=test --output-dir=build --log=4

where ../compiler.conf is the configuration file I checked in to 
subversion in maintenance/hiphop, class-test*.php are the various 
class declaration tests.

To execute:

build/test -f class-test.php

etc.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-04 Thread Platonides
Tim Starling wrote:
> On 03/29/2011 10:48 AM, Platonides wrote:
>> I was expecting this the week hip-hop hit. What would be required "to
>> target hip-hop"? How does that differ from working from Zend?
> 
> I've explored the issues and made some initial changes to my working 
> copy. I'm now waiting for it to compile, and once it's tested, I'll 
> commit it.


Having to create a reflection class and look for exceptions just to
check for class existence is really ugly.
However, looking at https://github.com/facebook/hiphop-php/issues/314 it
seems to be a declaration before use problem (even though class_exists
shouldn't be declaring it). I suppose that we could work around that by
including all classes at the beginning instead of using the AutoLoader,
which shouldn't be needed for compiled code.




Re: [Wikitech-l] HipHop

2011-04-04 Thread Inez Korczynski
Hi Tim,

I have no problem running the foo.php and test.php that you sent in hphpi,
and it also compiles and runs without any problems with hphp. What command
exactly do you use to compile and then execute?

Inez

On Sun, Apr 3, 2011 at 5:38 PM, Tim Starling wrote:

> On 03/29/2011 10:48 AM, Platonides wrote:
> > I was expecting this the week hip-hop hit. What would be required "to
> > target hip-hop"? How does that differ from working from Zend?
>
> I've explored the issues and made some initial changes to my working
> copy. I'm now waiting for it to compile, and once it's tested, I'll
> commit it.
>
> There is a list of things that differ here:
>
> https://github.com/facebook/hiphop-php/blob/master/doc/inconsistencies
>
> Unfortunately it seems to leave out the most important differences.
>
> It seems incredible, and I'm hoping someone will correct me, but it
> seems that file inclusion has to be completely different in HipHop.
> Even the simplest script won't work. I put this in foo.php:
>
> <?php
> class Foo {
>     static function bar() {
>         print "Hello\n";
>     }
> }
> ?>
>
> And this in test.php:
>
> <?php
> include 'foo.php';
> Foo::bar();
> ?>
>
> This gives "HipHop Fatal error: Cannot redeclare class Foo" at
> runtime. All classes which are compiled exist from startup, and trying
> to declare them produces this error. This means that it is no longer
> possible to mix class and function declarations with code we want to
> execute. My working copy has fixes for the most important instances of
> this, such as in Setup.php and WebStart.php.
>
> There are two exceptions to this. One is the interpreter. HipHop has
> an interpreter, which is used for eval() and for include() on a file
> with a fully-qualified path. We can use this to allow us to change
> LocalSettings.php without recompiling.
>
> If you want to do include() and have it execute compiled code, you
> need to use a path which is relative to the base of the compiled code.
> My working copy has some functions which allow this to be done in a
> self-documenting way.
>
> The other exception is volatile mode, which unfortunately appears to
> be completely broken, at least in the RPMs that I'm using. It's so
> broken that calling class_exists() on a literal string will break the
> class at compile time, making it impossible to use, with no way to
> repair it. My working copy has a wrapper for class_exists() which
> doesn't suffer from this problem.
>
> Another undocumented difference is that HipHop does not use php.ini or
> anything like it, so most instances of ini_get() and ini_set() are
> broken. The functions exist, but only have stub functionality. HipHop
> has its own configuration files, but they aren't like php.ini.
>
> When I'm ready to write all this up properly, the following page will
> appear on mediawiki.org:
>
> http://www.mediawiki.org/wiki/HipHop
>
> -- Tim Starling
>
>
>
>


Re: [Wikitech-l] HipHop

2011-04-03 Thread Tim Starling
On 04/04/2011 12:11 PM, Brion Vibber wrote:
> Whee! So far it sounds like most of these are things we can work around
> reasonably sensibly, so mostly good news. Any remaining issues with 'scary
> reference stuff' like stub objects, or do those semantics actually already
> work for us?

I'm not expecting any problems with stub objects.

One piece of good news that I neglected to mention is that MediaWiki 
appears to work almost unmodified under the HipHop command-line 
interpreter, hphpi. I did some page views and edits in it. I think 
this must be what Inez is doing, judging by his very short patch. If 
there were any problems with things like references, you'd expect them 
to show up there.

Stub objects would probably break with the compiler option 
"AllDynamic" off, but so would a lot of things. That's why it's on in 
the build scripts I've written.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-04-03 Thread Brion Vibber
On Sun, Apr 3, 2011 at 5:38 PM, Tim Starling wrote:

> There is a list of things that differ here:
>
> https://github.com/facebook/hiphop-php/blob/master/doc/inconsistencies
>
> Unfortunately it seems to leave out the most important differences.
>

Ain't that always the way ;)

[various more scary things mentioned]


> When I'm ready to write all this up properly, the following page will
> appear on mediawiki.org:
>
> http://www.mediawiki.org/wiki/HipHop


Whee! So far it sounds like most of these are things we can work around
reasonably sensibly, so mostly good news. Any remaining issues with 'scary
reference stuff' like stub objects, or do those semantics actually already
work for us?

-- brion


Re: [Wikitech-l] HipHop

2011-04-03 Thread Tim Starling
On 03/29/2011 10:48 AM, Platonides wrote:
> I was expecting this the week hip-hop hit. What would be required "to
> target hip-hop"? How does that differ from working from Zend?

I've explored the issues and made some initial changes to my working 
copy. I'm now waiting for it to compile, and once it's tested, I'll 
commit it.

There is a list of things that differ here:

https://github.com/facebook/hiphop-php/blob/master/doc/inconsistencies

Unfortunately it seems to leave out the most important differences.

It seems incredible, and I'm hoping someone will correct me, but it 
seems that file inclusion has to be completely different in HipHop. 
Even the simplest script won't work. I put this in foo.php:

<?php
class Foo {
    static function bar() {
        print "Hello\n";
    }
}
?>

And this in test.php:

<?php
include 'foo.php';
Foo::bar();
?>

This gives "HipHop Fatal error: Cannot redeclare class Foo" at 
runtime. All classes which are compiled exist from startup, and trying 
to declare them produces this error. This means that it is no longer 
possible to mix class and function declarations with code we want to 
execute. My working copy has fixes for the most important instances of 
this, such as in Setup.php and WebStart.php.

There are two exceptions to this. One is the interpreter. HipHop has 
an interpreter, which is used for eval() and for include() on a file 
with a fully-qualified path. We can use this to allow us to change 
LocalSettings.php without recompiling.

If you want to do include() and have it execute compiled code, you 
need to use a path which is relative to the base of the compiled code. 
My working copy has some functions which allow this to be done in a 
self-documenting way.

The other exception is volatile mode, which unfortunately appears to 
be completely broken, at least in the RPMs that I'm using. It's so 
broken that calling class_exists() on a literal string will break the 
class at compile time, making it impossible to use, with no way to 
repair it. My working copy has a wrapper for class_exists() which 
doesn't suffer from this problem.
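One way to write such a wrapper, along the lines of the reflection-based check discussed in this thread, is sketched below; the function name is invented, and the wrapper in the working copy may differ:

```php
<?php
// Hypothetical sketch of a class_exists() replacement that avoids the
// literal-string scan HipHop performs at compile time: probe the class
// via reflection and treat a ReflectionException as "does not exist".
function mwClassExistsSketch( $class ) {
    try {
        new ReflectionClass( $class );
        return true;
    } catch ( ReflectionException $e ) {
        return false;
    }
}

var_dump( mwClassExistsSketch( 'stdClass' ) );         // bool(true)
var_dump( mwClassExistsSketch( 'NoSuchClass12345' ) ); // bool(false)
```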

Another undocumented difference is that HipHop does not use php.ini or 
anything like it, so most instances of ini_get() and ini_set() are 
broken. The functions exist, but only have stub functionality. HipHop 
has its own configuration files, but they aren't like php.ini.
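This means code that touches ini settings should check the return value rather than assume the setting took effect; a minimal defensive sketch (hypothetical helper, not from the working copy):

```php
<?php
// Hypothetical guard: ini_set() returns the old value on success and
// false on failure, so test the return value instead of assuming the
// setting took effect (under HipHop many settings are only stubs).
function trySetIniSketch( $name, $value ) {
    $old = ini_set( $name, $value );
    return $old !== false;
}
```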

When I'm ready to write all this up properly, the following page will 
appear on mediawiki.org:

http://www.mediawiki.org/wiki/HipHop

-- Tim Starling





Re: [Wikitech-l] HipHop

2011-03-31 Thread Inez Korczynski
Hello,

I'm working on migration to HipHop at Wikia (we run on MediaWiki
1.16.2 with tons of our custom extensions and skins).

At this point I'm testing and benchmarking all the different use cases, and
so far I haven't run into any serious problems - however, there was memory
corruption when using DOMDocument (Preprocessor_DOM) under heavy load,
but that is already fixed.

Btw. I had to apply this patch http://pastebin.com/qJNcwp99 to the
MediaWiki code to make it work (commenting out preg_replace is just a
temporary change).

Very likely in our approach we will mostly target HipHop (not Zend)
with future development, since we want to switch our developers to
work with HipHop as well.

Inez

On Sun, Mar 27, 2011 at 8:21 PM, Tim Starling  wrote:
> I think we should migrate MediaWiki to target HipHop [1] as its
> primary high-performance platform. I think we should continue to
> support Zend, for the benefit of small installations. But we should
> additionally support HipHop, use it on Wikimedia, and optimise our
> algorithms for it.
>
> In cases where an algorithm optimised for HipHop would be excessively
> slow when running under Zend, we can split the implementations by
> subclassing.
>
> I was skeptical about HipHop at first, since the road is littered with
> the bodies of dead PHP compilers. But it looks like Facebook is pretty
> well committed to this one, and they have the resources to maintain
> it. I waited and watched for a while, but I think the time has come to
> make a decision on this.
>
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.
>
> Who's with me?
>
> -- Tim Starling
>
> [1] https://github.com/facebook/hiphop-php/wiki/
>
>
>



Re: [Wikitech-l] HipHop

2011-03-30 Thread Arthur Richards
On 3/30/11 12:26 AM, Tim Starling wrote:
> On 03/30/2011 12:51 AM, Chad wrote:
>> For those of you on Ubuntu or other flavors of Debian, the guide at [0] will
>> pretty much walk you through it pain-free. One little gotcha: you need a
>> libmemcached of at least 0.39, and the latest version in 10.04 and below
>> is 0.31, so you'll either need to do a manual build, grab it from the newer
>> repo, or go ahead and bite the bullet and upgrade. Oh, and run make from
>> a screen and walk away for a while, it's not the fastest build ever.
> I saw that there are RPMs for CentOS, so I installed CentOS inside a
> chroot inside Ubuntu 10.10 x86-64. Surprisingly, this was quite easy.

Also, I've been told that there are VMs with HipHop already set up which 
would save the pain of compiling it yourself.  I haven't tried any of 
them myself yet, but a quick Google search led me to this:

http://www.virtcloud.eu/?page=hiphop

And there are no doubt others.

Arthur



Re: [Wikitech-l] HipHop

2011-03-30 Thread Tim Starling
On 03/30/2011 12:51 AM, Chad wrote:
> For those of you on Ubuntu or other flavors of Debian, the guide at [0] will
> pretty much walk you through it pain-free. One little gotcha: you need a
> libmemcached of at least 0.39, and the latest version in 10.04 and below
> is 0.31, so you'll either need to do a manual build, grab it from the newer
> repo, or go ahead and bite the bullet and upgrade. Oh, and run make from
> a screen and walk away for a while, it's not the fastest build ever.

I saw that there are RPMs for CentOS, so I installed CentOS inside a 
chroot inside Ubuntu 10.10 x86-64. Surprisingly, this was quite easy. 
I put some notes at:



Of course, the downside is that you then have to work inside a chroot. 
It's probably tolerable if you use the bind mount for /home that 
schroot provides by default to store your files.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-03-29 Thread Magnus Manske
On Tue, Mar 29, 2011 at 3:28 PM, Aryeh Gregor
 wrote:
> On Mon, Mar 28, 2011 at 9:33 PM, Tim Starling  wrote:
>> Yes, that's true, and that's part of the reason I'm flagging this
>> change on the mailing list. Domas says that the HipHop team is working
>> on PHP 5.3 support, so maybe the issue won't come up. But yes, in
>> principle, I am saying that we should support HipHop even when it
>> means not using new features from PHP.
>>
>> PHP 5.3 might be cool, but so is cutting our power usage by half (pun
>> intended).
>
> Okay, then I'm all in favor.

Plus, free C++ MediaWiki parser ;-)

Seriously, there should be a way to turn the entire package into a
(huge) library; maybe transpile it and then replace the C++ code for
index.php with a manually written library interface?

Offline readers, scientific analysis tools, etc. could profit
massively from an always-current, fast C++ library...

Magnus



Re: [Wikitech-l] HipHop

2011-03-29 Thread Aryeh Gregor
On Mon, Mar 28, 2011 at 9:33 PM, Tim Starling  wrote:
> Yes, that's true, and that's part of the reason I'm flagging this
> change on the mailing list. Domas says that the HipHop team is working
> on PHP 5.3 support, so maybe the issue won't come up. But yes, in
> principle, I am saying that we should support HipHop even when it
> means not using new features from PHP.
>
> PHP 5.3 might be cool, but so is cutting our power usage by half (pun
> intended).

Okay, then I'm all in favor.



Re: [Wikitech-l] HipHop

2011-03-29 Thread Chad
On Sun, Mar 27, 2011 at 11:21 PM, Tim Starling  wrote:
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.
>
> Who's with me?
>

*grabs a battle axe* I'm with you!

I went ahead and compiled hiphop last night on a fresh VM. Couple of
notes for anyone trying to join us.

For those of you on Ubuntu or other flavors of Debian, the guide at [0] will
pretty much walk you through it pain-free. One little gotcha: you need a
libmemcached of at least 0.39, and the latest version in 10.04 and below
is 0.31, so you'll either need to do a manual build, grab it from the newer
repo, or go ahead and bite the bullet and upgrade. Oh, and run make from
a screen and walk away for a while, it's not the fastest build ever.

I finished building around 1am last night, didn't get to the next stage yet.

I might try building on OSX today. I couldn't get it to work ~6 months ago,
but those issues may well be resolved by now.

-Chad

[0] 
https://github.com/facebook/hiphop-php/wiki/Building-and-Installing-on-Ubuntu-10.10



Re: [Wikitech-l] HipHop

2011-03-28 Thread Tim Starling
On 29/03/11 09:40, Aryeh Gregor wrote:
> On Mon, Mar 28, 2011 at 10:47 AM, Tim Starling  
> wrote:
>> We can use features from both, using function_exists(), like what we
>> do now with PHP modules.
> 
> Well, yes, if there's some reasonable fallback.  It doesn't work for
> features that are useless if you have to write a fallback, like
> various types of syntactic sugar.  For example, the first features
> from PHP 5.3 release notes include namespaces, late static binding,
> lambda functions and closures, NOWDOC, a ternary operator shortcut,
> limited goto, and __callStatic.  If Facebook didn't implement some of
> those new features in Hiphop by the time we could feasibly require PHP
> 5.3, we wouldn't be able to use them.  (Some look really nice, like
> anonymous functions -- one of the things I really like about
> JavaScript.)

Yes, that's true, and that's part of the reason I'm flagging this
change on the mailing list. Domas says that the HipHop team is working
on PHP 5.3 support, so maybe the issue won't come up. But yes, in
principle, I am saying that we should support HipHop even when it
means not using new features from PHP.

PHP 5.3 might be cool, but so is cutting our power usage by half (pun
intended).

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-03-28 Thread Platonides
Tim Starling wrote:
> I think we should migrate MediaWiki to target HipHop [1] as its
> primary high-performance platform. I think we should continue to
> support Zend, for the benefit of small installations. But we should
> additionally support HipHop, use it on Wikimedia, and optimise our
> algorithms for it.
> 
> In cases where an algorithm optimised for HipHop would be excessively
> slow when running under Zend, we can split the implementations by
> subclassing.
> 
> I was skeptical about HipHop at first, since the road is littered with
> the bodies of dead PHP compilers. But it looks like Facebook is pretty
> well committed to this one, and they have the resources to maintain
> it. I waited and watched for a while, but I think the time has come to
> make a decision on this.
> 
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.
> 
> Who's with me?
> 
> -- Tim Starling
> 
> [1] https://github.com/facebook/hiphop-php/wiki/

I was expecting this the week hip-hop hit. What would be required "to
target hip-hop"? How does that differ from working from Zend?




Re: [Wikitech-l] HipHop

2011-03-28 Thread Aryeh Gregor
On Mon, Mar 28, 2011 at 10:47 AM, Tim Starling  wrote:
> We can use features from both, using function_exists(), like what we
> do now with PHP modules.

Well, yes, if there's some reasonable fallback.  It doesn't work for
features that are useless if you have to write a fallback, like
various types of syntactic sugar.  For example, the first features
from PHP 5.3 release notes include namespaces, late static binding,
lambda functions and closures, NOWDOC, a ternary operator shortcut,
limited goto, and __callStatic.  If Facebook didn't implement some of
those new features in Hiphop by the time we could feasibly require PHP
5.3, we wouldn't be able to use them.  (Some look really nice, like
anonymous functions -- one of the things I really like about
JavaScript.)

Granted, this sort of thing is rarely very essential, and maybe Hiphop
will keep up with all of PHP's new syntactic sugar.  Overall, I'm all
in favor of trying out Hiphop on Wikimedia -- I was just wondering
what would happen if Hiphop doesn't incorporate all of PHP's new
features over time.  Which might be groundless, if Facebook plans to
incorporate all of PHP's new syntactic features over time.
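The function_exists() fallback Tim refers to above can be sketched as follows; mb_strtolower here just stands in for any optional function:

```php
<?php
// Feature-detect an optional function and fall back when it is missing,
// so the same code runs on platforms with different feature sets.
if ( function_exists( 'mb_strtolower' ) ) {
    $lower = mb_strtolower( 'WIKI' );
} else {
    // Fallback when the mbstring extension is unavailable.
    $lower = strtolower( 'WIKI' );
}
echo $lower, "\n"; // prints "wiki"
```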



Re: [Wikitech-l] HipHop

2011-03-28 Thread Daniel Friesen
On 11-03-28 12:44 AM, Tim Starling wrote:
> On 28/03/11 17:36, Roan Kattouw wrote:
>> 2011/3/28 Tim Starling:
>>> Who's with me?
>>>
>> I don't really have a good idea of what would need to change to
> support HipHop, but if the changes aren't too intrusive I'm all for it.
>>
>> If we decide to do this, we should also decide when to start and when
>> we want to have HPHP support working (1.18? 1.19?).
> It depends on how many people are interested in it, and I'm not sure
> how much work there is to do. But as long as we're careful to maintain
> compatibility with Zend, we can work in trunk. Once it's ready, we can
> add it to the installation docs.
>
> It should be ready for 1.19 at the latest. If it's not done by then,
> we should shelve the project.
>
> -- Tim Starling
Sounds interesting...

Then again, I'm also interested in making Drizzle work, and switching 
our skin systems to using a custom xml/html template system.

Maybe I'll try running HPHP myself in production in my upcoming project 
when it's ready in core.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]




Re: [Wikitech-l] HipHop

2011-03-28 Thread Chad
On Mon, Mar 28, 2011 at 10:28 AM, Aryeh Gregor
 wrote:
> On Mon, Mar 28, 2011 at 9:42 AM, Chad  wrote:
>> I also don't know if they've actually merged the 32bit work into their
>> mainline yet--I know a volunteer was working on it. If they're lacking
>> 32bit support in the main release still, that might be a reason to hold
>> off for now.
>
> Why?  People on 32-bit machines can just run Zend PHP.
>

I meant that as more for developers looking to help in the effort but
might still be on a 32bit system :)

-Chad


Re: [Wikitech-l] HipHop

2011-03-28 Thread Tim Starling
On 29/03/11 01:28, Aryeh Gregor wrote:
> What happens when the feature lists start diverging, because Zend adds
> what it thinks would be useful and Facebook ignores that and adds what
> it thinks would be useful?  Then we can't use any new features from
> either.

We can use features from both, using function_exists(), like what we
do now with PHP modules.

If you compile PHP with no zlib, you can't compress anything, but the
rest of MediaWiki still works. In the future we may use HipHop's
parallel execution features. If you don't have HipHop, the work will
be done in serial. I imagine such quandaries will be very rare.
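
The function_exists() pattern Tim describes can be sketched for the zlib case (a minimal illustration with an invented helper name, not MediaWiki's actual code):

```php
<?php
// Fall back gracefully when an extension is absent, as MediaWiki does
// for optional PHP modules: detect the function at runtime.
function storeBlob( $data ) {
    if ( function_exists( 'gzdeflate' ) ) {
        return array( 'gzip', gzdeflate( $data ) );  // zlib available
    }
    return array( 'none', $data );                   // plain fallback
}

list( $method, $blob ) = storeBlob( 'some page text' );
echo $method, "\n";  // 'gzip' if zlib is compiled in, 'none' otherwise
```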

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-03-28 Thread Domas Mituzas

On Mar 28, 2011, at 5:28 PM, Aryeh Gregor wrote:

> ... and Facebook ignores that and adds what
> it thinks would be useful? 

Facebook already has features Zend does not:

https://github.com/facebook/hiphop-php/blob/master/doc/extension.new_functions

Stuff like:
* Parallel RPC - MySQL, HTTP, ..
* Background execution, post-send execution, pagelet server
etc

Domas


Re: [Wikitech-l] HipHop

2011-03-28 Thread Aryeh Gregor
On Sun, Mar 27, 2011 at 11:21 PM, Tim Starling  wrote:
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.

What happens when the feature lists start diverging, because Zend adds
what it thinks would be useful and Facebook ignores that and adds what
it thinks would be useful?  Then we can't use any new features from
either.  Or are we sure Facebook is committed to maintaining long-term
compatibility with Zend PHP?

On Mon, Mar 28, 2011 at 9:42 AM, Chad  wrote:
> I also don't know if they've actually merged the 32bit work into their
> mainline yet--I know a volunteer was working on it. If they're lacking
> 32bit support in the main release still, that might be a reason to hold
> off for now.

Why?  People on 32-bit machines can just run Zend PHP.



Re: [Wikitech-l] HipHop

2011-03-28 Thread Chad
On Mon, Mar 28, 2011 at 2:50 AM, Aaron Schulz  wrote:
> (ii) Also, it would be nice if developers could all have hiphop running on
> their test wikis, so that code that's broken on hiphop isn't committed in
> ignorance. The only problem is that, last time I checked, the dependency
> list for hiphop is very considerable.
>

I also don't know if they've actually merged the 32bit work into their
mainline yet--I know a volunteer was working on it. If they're lacking
32bit support in the main release still, that might be a reason to hold
off for now.

I've compiled HPHP before, the dependencies aren't really that bad
(anymore), you just have to compile a custom build of libevent and
libcurl.

I know nothing of trying to get it to work on Windows; it would probably
be a royal PITA without cygwin.

-Chad



Re: [Wikitech-l] HipHop

2011-03-28 Thread Tim Starling
On 28/03/11 17:36, Roan Kattouw wrote:
> 2011/3/28 Tim Starling :
>> Who's with me?
>>
> I don't really have a good idea of what would need to change to
support HipHop, but if the changes aren't too intrusive I'm all for it.
> 
> If we decide to do this, we should also decide when to start and when
> we want to have HPHP support working (1.18? 1.19?).

It depends on how many people are interested in it, and I'm not sure
how much work there is to do. But as long as we're careful to maintain
compatibility with Zend, we can work in trunk. Once it's ready, we can
add it to the installation docs.

It should be ready for 1.19 at the latest. If it's not done by then,
we should shelve the project.

-- Tim Starling




Re: [Wikitech-l] HipHop

2011-03-27 Thread Aaron Schulz

Two things:
(i) I'd really hope that subclassing would be very rare here. I don't think
this will be much of an issue though.
(ii) Also, it would be nice if developers could all have hiphop running on
their test wikis, so that code that's broken on hiphop isn't committed in
ignorance. The only problem is that, last time I checked, the dependency
list for hiphop is very considerable...and isn't for Windows yet. However, I
believe Domas didn't need *too* many patches to get MW working, which
suggests that having to write code that compiles with hiphop won't be that
difficult and error prone. If there can be a small yet complete list of
"things that only work in regular PHP" then that might be an OK alternative
to each dev running/testing hiphop.

Otherwise,


Tim Starling-2 wrote:
> 
> I think we should migrate MediaWiki to target HipHop [1] as its
> primary high-performance platform. I think we should continue to
> support Zend, for the benefit of small installations. But we should
> additionally support HipHop, use it on Wikimedia, and optimise our
> algorithms for it.
> 
> In cases where an algorithm optimised for HipHop would be excessively
> slow when running under Zend, we can split the implementations by
> subclassing.
> 
> I was skeptical about HipHop at first, since the road is littered with
> the bodies of dead PHP compilers. But it looks like Facebook is pretty
> well committed to this one, and they have the resources to maintain
> it. I waited and watched for a while, but I think the time has come to
> make a decision on this.
> 
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.
> 
> Who's with me?
> 
> -- Tim Starling
> 
> [1] https://github.com/facebook/hiphop-php/wiki/
> 
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 
> 

-- 
View this message in context: 
http://old.nabble.com/HipHop-tp31253551p31254438.html
Sent from the Wikipedia Developers mailing list archive at Nabble.com.




Re: [Wikitech-l] HipHop

2011-03-27 Thread Roan Kattouw
2011/3/28 Tim Starling :
> Who's with me?
>
I don't really have a good idea of what would need to change to
support HipHop, but if the changes aren't too intrusive I'm all for it.

If we decide to do this, we should also decide when to start and when
we want to have HPHP support working (1.18? 1.19?).

Roan Kattouw (Catrope)



[Wikitech-l] HipHop

2011-03-27 Thread Tim Starling
I think we should migrate MediaWiki to target HipHop [1] as its
primary high-performance platform. I think we should continue to
support Zend, for the benefit of small installations. But we should
additionally support HipHop, use it on Wikimedia, and optimise our
algorithms for it.

In cases where an algorithm optimised for HipHop would be excessively
slow when running under Zend, we can split the implementations by
subclassing.
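
A sketch of what that subclassing split might look like (class names and the HipHop-detection constant are assumptions for illustration, not actual MediaWiki code):

```php
<?php
// Engine-specific implementations behind one interface; a factory picks
// the variant at runtime. Using HPHP_VERSION as the detection constant
// is an assumption here.
abstract class TextDiffer {
    abstract public function diff( $old, $new );

    public static function create() {
        return defined( 'HPHP_VERSION' )
            ? new HipHopTextDiffer()   // algorithm tuned for HipHop
            : new ZendTextDiffer();    // algorithm tuned for Zend
    }
}

class ZendTextDiffer extends TextDiffer {
    public function diff( $old, $new ) { /* Zend-friendly algorithm */ }
}

class HipHopTextDiffer extends TextDiffer {
    public function diff( $old, $new ) { /* HipHop-friendly algorithm */ }
}
```

Callers would only ever see `TextDiffer::create()`, keeping the engine split out of the rest of the codebase.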

I was skeptical about HipHop at first, since the road is littered with
the bodies of dead PHP compilers. But it looks like Facebook is pretty
well committed to this one, and they have the resources to maintain
it. I waited and watched for a while, but I think the time has come to
make a decision on this.

Facebook now write their PHP code to target HipHop exclusively, so by
trying to write code that works on both platforms, we'll be in new
territory, to some degree. Maybe that's scary, but I think it can work.

Who's with me?

-- Tim Starling

[1] https://github.com/facebook/hiphop-php/wiki/




Re: [Wikitech-l] hiphop progress?

2010-03-29 Thread Ryan Biesemeyer
Thank you, Ævar. I've requested access as you suggested.



Re: [Wikitech-l] hiphop progress?

2010-03-26 Thread Ævar Arnfjörð Bjarmason
On Fri, Mar 26, 2010 at 20:00, Ryan Bies  wrote:
> [...]

I don't know the answer to your question but can we please get this
hiphop work into SVN and out of various patchsets floating around?
Let's just create a hiphop branch for it so we can all experiment with
it.

It looks like you don't have SVN commit access. Please ask for an
account to commit this stuff.



[Wikitech-l] hiphop progress?

2010-03-26 Thread Ryan Bies
Hello all.

For those of you who have tried to get mediawiki running under hiphop, I've
run into a block.

After applying the patches from Domas and converting a few e-flagged
preg_replace functions to preg_replace_callback, I have it running
successfully, hitting the database as many times as it needs to, invoking no
errors, and generally being very fast. I get a 200 response on my requests
(very quickly), no errors raised, but no matter the request, the body is
completely blank and the content-length header is 3. I suspect that the skin
is not getting loaded properly, but as I'm not deeply familiar with the
codebase I'm wondering if any of you can see anything obviously wrong. This
happens on api requests and requests to the index after a 301 to MainPage.

Thanks for your help!

Ryan

I have applied the following patch from Domas against svn rev 63062:
http://stafford.wikimedia.org/current-patch.txt

I replaced a couple e-flagged preg_replace functions with
preg_replace_callback and did some type-inference fixes:
http://projects.yaauie.com/hphp/mediawiki/hphp-yaauie.patch

I'm using the following file list:
http://projects.yaauie.com/hphp/mediawiki/hphp-files.list

My environment variables are set by source-ing this file:
http://projects.yaauie.com/hphp/mediawiki/hphp.env

I'm compiling with the following string:
$HPHP --input-list=files.list --force=1 --k 1 --log=3 --program=mediawiki

I'm running the resulting application with the following string:
sudo /tmp/hphp_abc123/mediawiki -m server -c hphp-runtime-config.hdf

The config file I'm loading is:
http://projects.yaauie.com/hphp/mediawiki/hphp-runtime-config.hdf


Re: [Wikitech-l] hiphop progress

2010-03-03 Thread Tim Starling
Domas Mituzas wrote:
> Jared,
> 
>> assert(hash('adler32', 'foo', true) === mhash(MHASH_ADLER32, 'foo'));
> 
> Thanks! Would get to that eventually, I guess. Still, there's xdiff and few 
> other things.

xdiff is only needed for recompression. For page views, there is a
pure-PHP port of the "patch" part.

-- Tim Starling




Re: [Wikitech-l] hiphop progress

2010-03-03 Thread Domas Mituzas
Jared,

> assert(hash('adler32', 'foo', true) === mhash(MHASH_ADLER32, 'foo'));

Thanks! Would get to that eventually, I guess. Still, there's xdiff and few 
other things.

Domas


Re: [Wikitech-l] hiphop progress

2010-03-03 Thread Jared Williams
 

> -Original Message-
> From: wikitech-l-boun...@lists.wikimedia.org 
> [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of 
> Domas Mituzas
> Sent: 01 March 2010 10:11
> To: Wikimedia developers
> Subject: [Wikitech-l] hiphop progress
> 
> Howdy,
> 
> > Most of the code in MediaWiki works just fine with it (since most of
> > it is mundane) but things like dynamically including certain files,
> > declaring classes, eval() and so on are all out.
> 
> There're two types of includes in MediaWiki, ones I fixed for 
> AutoLoader and ones I didn't - HPHP has all classes loaded, 
> so AutoLoader is redundant. 
> Generally, every include that just defines classes/functions 
> is fine with HPHP, it is just some of MediaWiki's startup 
> logic (Setup/WebStart) that depends on files included in 
> certain order, so we have to make sure HipHop understands 
> those includes.
> There was some different behavior with file including - in 
> Zend you can say require("File.php"), and it will try current 
> script's directory, but if you do require("../File.php") - it will 
> 
> We don't have any eval() at the moment, and actually there's 
> a mode when eval() works, people are just scared too much of it. 
> We had some double class definitions (depending on whether 
> certain components are available), as well as double function 
> definitions ( ProfilerStub vs Profiler )
> 
> One of major problems is simply still not complete function 
> set, that we'd need:
> 
> * session - though we could sure work around it by setting up 
> our own Session abstraction, team at facebook is already busy 
> implementing full support
> * xdiff, mhash - the only two calls to it are from 
> DiffHistoryBlob - so getting the feature to work is mandatory 
> for production, not needed for testing :)

Mhash has been obsoleted by the hash extension, and HipHop has the hash
extension (looking at the src).

I think mhash has been implemented as a wrapper around the hash extension
for a while. (http://svn.php.net/viewvc?view=revision&revision=269961)

assert(hash('adler32', 'foo', true) === mhash(MHASH_ADLER32, 'foo'));
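
Jared's wrapper observation can be sketched as a small compatibility shim for a build whose mhash is missing (the constant value and the mapping below are illustrative assumptions, guarded so a real mhash extension wins if present):

```php
<?php
// Sketch: route mhash() through the hash extension, which HipHop does
// ship. The MHASH_ADLER32 value here is an illustrative assumption.
if ( !defined( 'MHASH_ADLER32' ) ) {
    define( 'MHASH_ADLER32', 3 );
}
if ( !function_exists( 'mhash' ) ) {
    function mhash( $algo, $data ) {
        $map = array( MHASH_ADLER32 => 'adler32' );
        return hash( $map[$algo], $data, true );  // raw binary digest
    }
}

// Jared's sanity check still holds through the shim:
assert( hash( 'adler32', 'foo', true ) === mhash( MHASH_ADLER32, 'foo' ) );
```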

Jared




Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Marco Schuster
On Mon, Mar 1, 2010 at 3:26 PM, Daniel Kinzler  wrote:
> Marco Schuster schrieb:
>> The point of $IP is that you can use multisite environments by just
>> having index.php and Localsettings.php (and skin crap) in the
>> per-vhost directory, and have extensions and other stuff centralized
>> so you can update the extension once and all the wikis automatically
>> have it.
>
> That's a silly multi-host setup. Much easier to have a single copy of
> everything, and just use conditionals in localsettings, based on hostname or 
> path.
Downside of this: as a provider, *you* must make the change, not the
customer, as it is one central file.

Marco
-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de


Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Jared Williams
 

> -Original Message-
> From: wikitech-l-boun...@lists.wikimedia.org 
> [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of 
> Ævar Arnfjörð Bjarmason
> Sent: 01 March 2010 13:34
> To: Wikimedia developers
> Subject: Re: [Wikitech-l] hiphop progress
> 
> On Mon, Mar 1, 2010 at 10:10, Domas Mituzas 
>  wrote:
> > Howdy,
> >
> >> Most of the code in MediaWiki works just fine with it (since most of
> >> it is mundane) but things like dynamically including certain files,
> >> declaring classes, eval() and so on are all out.
> >
> > There're two types of includes in MediaWiki, ones I fixed for
> > AutoLoader and ones I didn't - HPHP has all classes loaded, so
> > AutoLoader is redundant.
> > Generally, every include that just defines classes/functions is fine
> > with HPHP, it is just some of MediaWiki's startup logic
> > (Setup/WebStart) that depends on files included in certain order, so
> > we have to make sure HipHop understands those includes.
> > There was some different behavior with file including - in Zend you
> > can say require("File.php"), and it will try current script's
> > directory, but if you do require("../File.php") - it will
> >
> > We don't have any eval() at the moment, and actually there's a mode
> > when eval() works, people are just scared too much of it.
> > We had some double class definitions (depending on whether certain
> > components are available), as well as double function definitions
> > ( ProfilerStub vs Profiler )
> >
> > One of major problems is simply still not complete function set, that
> > we'd need:
> >
> > * session - though we could sure work around it by setting up our own
> > Session abstraction, team at facebook is already busy implementing
> > full support
> > * xdiff, mhash - the only two calls to it are from DiffHistoryBlob -
> > so getting the feature to work is mandatory for production, not
> > needed for testing :)
> > * tidy - have to call the binary now
> >
> > function_exists() is somewhat crippled, as far as I understand, so I
> > had to work around certain issues there.
> > There're some other crippled functions, which we hit through the
> > testing...
> >
> > It is quite fun to hit all the various edge cases in PHP language
> > (e.g. interfaces may have constants) which are broken in hiphop.
> > Good thing is having developers carefully reading/looking at those.
> > Some things are still broken, some can be worked around in MediaWiki.
> >
> > Some of crashes I hit are quite difficult to reproduce - it is easier
> > to bypass that code for now, and come up with good reproduction cases
> > later.
> >
> >> Even if it wasn't, hotspots like the parser could still be compiled
> >> with hiphop and turned into a PECL extension.
> >
> > hiphop provides major boost for actual mediawiki initialization too -
> > while Zend has to reinitialize objects and data all the time, having
> > all that in core process image is quite efficient.
> >
> >> One other nice thing about hiphop is that the compiler output is
> >> relatively readable compared to most compilers. Meaning that if you
> >
> > That especially helps with debugging :)
> >
> >> need to optimize some particular function it's easy to take the
> >> generated .cpp output and replace the generated code with something
> >> more native to C++ that doesn't lose speed because it needs to
> >> manipulate everything as a php object.
> >
> > Well, that is not entirely true - if it manipulated everything as PHP
> > object (zval), it would be as slow and inefficient as PHP. The major
> > cost benefit here is that it does strict type inference, and falls
> > back to Variant only when it cannot come up with decent type.
> > And yes, one can find offending code that causes the expensive paths.
> > I don't see manual C++ code optimizations as way to go though -
> > because they'd be overwritten by next code build.
> 
> The case I had in mind is when you have say a function in the 
> parser that takes a $string and munges it. If that turns out 
> to be a bottleneck you could just get a char* out of that 
> $string and munge it at the C level instead of calling the 
> PHP wrappers for things like
> explode() and other php string/array munging.
> 
> That's some future project once it's working and those 
> bottlenecks are found though, I was just pleasantly surprised 
> that hphp makes this relatively easy.
> 

I would think that getting hiphop to compile the regular expressions in
preg_*() calls out to C++ (as re2c does) would be the idea.

Jared




Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Daniel Kinzler
Marco Schuster schrieb:
> The point of $IP is that you can use multisite environments by just
> having index.php and Localsettings.php (and skin crap) in the
> per-vhost directory, and have extensions and other stuff centralized
> so you can update the extension once and all the wikis automatically
> have it.

That's a silly multi-host setup. Much easier to have a single copy of
everything, and just use conditionals in localsettings, based on hostname or 
path.
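
The conditional approach Daniel describes might look like this in LocalSettings.php (hostnames and database names are invented for illustration):

```php
<?php
// Single shared codebase; per-wiki configuration keyed on the request
// host. All names below are hypothetical.
$host = isset( $_SERVER['SERVER_NAME'] ) ? $_SERVER['SERVER_NAME'] : '';
switch ( $host ) {
    case 'en.example.org':
        $wgSitename = 'English Wiki';
        $wgDBname   = 'enwiki';
        break;
    case 'de.example.org':
        $wgSitename = 'Deutsches Wiki';
        $wgDBname   = 'dewiki';
        break;
    default:
        $wgSitename = 'Test Wiki';
        $wgDBname   = 'testwiki';
}
```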

-- daniel



Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Marco Schuster
The point of $IP is that you can use multisite environments by just
having index.php and Localsettings.php (and skin crap) in the
per-vhost directory, and have extensions and other stuff centralized
so you can update the extension once and all the wikis automatically
have it.
However, the Installer could be patched to resolve $IP automatically
if the user wishes to run a HipHop environment.

Marco

On Mon, Mar 1, 2010 at 2:59 PM, Ævar Arnfjörð Bjarmason
 wrote:
> On Mon, Mar 1, 2010 at 13:35, Domas Mituzas  wrote:
>> Still, the decision to merge certain changes into MediaWiki codebase (e.g. 
>> relative includes, rather than $IP-based absolute ones) would be quite 
>> invasive.
>> Also, we'd have to enforce stricter policy on how some of the dynamic PHP 
>> features are used.
>
> I might be revealing my lack of knowledge about PHP here but why is
> that invasive and why do we use $IP in includes in the first place? I
> did some tests here:
>
>    http://gist.github.com/310380
>
> Which show that as long as you set_include_path() with $IP/includes/
> at the front PHP will make exactly the same stat(), read() etc. calls
> with relative paths that it does with absolute paths.
>
> Maybe that's only on recent versions, I tested on php 5.2.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de


Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Ævar Arnfjörð Bjarmason
On Mon, Mar 1, 2010 at 13:35, Domas Mituzas  wrote:
> Still, the decision to merge certain changes into MediaWiki codebase (e.g. 
> relative includes, rather than $IP-based absolute ones) would be quite 
> invasive.
> Also, we'd have to enforce stricter policy on how some of the dynamic PHP 
> features are used.

I might be revealing my lack of knowledge about PHP here but why is
that invasive and why do we use $IP in includes in the first place? I
did some tests here:

http://gist.github.com/310380

Which show that as long as you set_include_path() with $IP/includes/
at the front PHP will make exactly the same stat(), read() etc. calls
with relative paths that it does with absolute paths.

Maybe that's only on recent versions, I tested on php 5.2.
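
The experiment can be reproduced in a few lines (the directory and file names here are invented stand-ins):

```php
<?php
// With $IP/includes at the front of the include path, a relative
// require resolves to the same file as the $IP-based absolute form.
$IP = sys_get_temp_dir() . '/mw-include-test';   // stand-in for MW root
@mkdir( $IP . '/includes', 0777, true );
file_put_contents( $IP . '/includes/Defines.php', "<?php \$loaded = 'yes';\n" );

set_include_path( $IP . '/includes' . PATH_SEPARATOR . get_include_path() );

require 'Defines.php';   // relative: found via the include path
echo $loaded, "\n";      // prints "yes"
```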



Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Domas Mituzas
Howdy,

> Looks like a lot of fun :-)

Fun enough to spend my evenings and weekends on it :)

> this smells like something that can benefit from metadata.
> /* [return  integer] */  function getApparatusId($obj){
>  //body
> }

Indeed - type hints can be quite useful, though hiphop is smart enough to 
figure out it will be an integer return from code :)

It is quite interesting to see the enhancements to PHP that have been inside 
facebook and now are all released - XHP evolves PHP syntax to fit the web world 
( 
http://www.facebook.com/notes/facebook-engineering/xhp-a-new-way-to-write-php/294003943919
 ), the XBOX thing allows background/async execution of work without standing 
in the way of page rendering, etc. 

> What can we expect?  Will future versions of MediaWiki be "hiphop
> compatible"? Will there be a fork or a compatible snapshot?  The whole
> experiment looks like it will help to profile and enhance the engine;
> will it generate a MediaWiki.tar.gz file we (the users) will be able to
> install on our intranets?

Well, the build itself is quite portable (you'd have to have a single binary and 
LocalSettings.php ;-) 

Still, the decision to merge certain changes into MediaWiki codebase (e.g. 
relative includes, rather than $IP-based absolute ones) would be quite 
invasive. 
Also, we'd have to enforce stricter policy on how some of the dynamic PHP 
features are used. 

I have to deal here with three teams (wikimedia ops, mediawiki development 
community and hiphop developers) to make stuff possible. 
Do note, getting it to work for MediaWiki is quite a simple task, compared to 
getting it to work in the Wikimedia operations environment. 

What I'd like to see though as final result - MediaWiki that works fine with 
both Zend and HPHP, and Wikimedia using the latter. 
Unfortunately, I will not be able to visit the Berlin developer meeting to present 
this work to other developers, so I will try to arrange some separate discussions. 
You know, most of the work will be coming up with solutions that are acceptable 
to Tim :-) 

> Maybe a blog article about your findings could be nice. It may help
> "write fast PHP code". And will scare littel childrens and PHP
> programmers with a C++ background.

My findings are hectic at the moment, and I don't want to talk too much about 
them until I get a decently working MediaWiki.
BTW, Main_Page and Special:BlankPage were both served in ~12ms. Now I have to 
get complex parser test cases work, and such.

Domas


Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Ævar Arnfjörð Bjarmason
On Mon, Mar 1, 2010 at 10:10, Domas Mituzas  wrote:
> Howdy,
>
>> Most of the code in MediaWiki works just fine with it (since most of
>> it is mundane) but things like dynamically including certain files,
>> declaring classes, eval() and so on are all out.
>
> There're two types of includes in MediaWiki, ones I fixed for AutoLoader and 
> ones I didn't - HPHP has all classes loaded, so AutoLoader is redundant.
> Generally, every include that just defines classes/functions is fine with 
> HPHP, it is just some of MediaWiki's startup logic (Setup/WebStart) that 
> depends on files included in certain order, so we have to make sure HipHop 
> understands those includes.
> There was some different behavior with file including - in Zend you can say 
> require("File.php"), and it will try current script's directory, but if you 
> do require("../File.php") - it will
>
> We don't have any eval() at the moment, and actually there's a mode when 
> eval() works, people are just scared too much of it.
> We had some double class definitions (depending on whether certain components 
> are available), as well as double function definitions ( ProfilerStub vs 
> Profiler )
>
> One of major problems is simply still not complete function set, that we'd 
> need:
>
> * session - though we could sure work around it by setting up our own Session 
> abstraction, team at facebook is already busy implementing full support
> * xdiff, mhash - the only two calls to it are from DiffHistoryBlob - so 
> getting the feature to work is mandatory for production, not needed for 
> testing :)
> * tidy - have to call the binary now
>
> function_exists() is somewhat crippled, as far as I understand, so I had to 
> work around certain issues there.
> There're some other crippled functions, which we hit through the testing...
>
> It is quite fun to hit all the various edge cases in PHP language (e.g. 
> interfaces may have constants) which are broken in hiphop.
> Good thing is having developers carefully reading/looking at those. Some 
> things are still broken, some can be worked around in MediaWiki.
>
> Some of crashes I hit are quite difficult to reproduce - it is easier to 
> bypass that code for now, and come up with good reproduction cases later.
>
>> Even if it wasn't, hotspots like the parser could still be compiled
>> with hiphop and turned into a PECL extension.
>
> hiphop provides major boost for actual mediawiki initialization too - while 
> Zend has to reinitialize objects and data all the time, having all that in 
> core process image is quite efficient.
>
>> One other nice thing about hiphop is that the compiler output is
>> relatively readable compared to most compilers. Meaning that if you
>
> That especially helps with debugging :)
>
>> need to optimize some particular function it's easy to take the
>> generated .cpp output and replace the generated code with something
>> more native to C++ that doesn't lose speed because it needs to
>> manipulate everything as a php object.
>
> Well, that is not entirely true - if it manipulated everything as PHP object 
> (zval), it would be as slow and inefficient as PHP. The major cost benefit 
> here is that it does strict type inference, and falls back to Variant only 
> when it cannot come up with decent type.
> And yes, one can find offending code that causes the expensive paths. I don't 
> see manual C++ code optimizations as way to go though - because they'd be 
> overwritten by next code build.

The case I had in mind is when you have say a function in the parser
that takes a $string and munges it. If that turns out to be a
bottleneck you could just get a char* out of that $string and munge it
at the C level instead of calling the PHP wrappers for things like
explode() and other php string/array munging.

That's some future project once it's working and those bottlenecks are
found though, I was just pleasantly surprised that hphp makes this
relatively easy.

One large practical upshot of this, though, is that hacky things like
the parser, which are the way they are because that's how you optimize
this sort of thing in PHP, could be written in some babytalk version of
PHP that produces a real parse tree; it would be slower in pure PHP
but maybe hphp's speed could make up for it.

Then you could take that component & compile it to C++ (maybe with
some manual munging) and make libmediawiki-parse++, which would
be quite awesome :)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] hiphop progress

2010-03-01 Thread Tei
Looks like a lot of fun :-)

On 1 March 2010 11:10, Domas Mituzas  wrote:
...
>> Even if it wasn't hotspots like the parser could still be compiled
>> with hiphop and turned into a PECL extension.
>
> hiphop provides major boost for actual mediawiki initialization too - while 
> Zend has to reinitialize objects and data all the time, having all that in 
> core process image is quite efficient.
>
>> One other nice thing about hiphop is that the compiler output is
>> relatively readable compared to most compilers. Meaning that if you
>
> That especially helps with debugging :)
>
>> need to optimize some particular function it's easy to take the
>> generated .cpp output and replace the generated code with something
>> more native to C++ that doesn't lose speed because it needs to
>> manipulate everything as a php object.
>
> Well, that is not entirely true - if it manipulated everything as PHP object 
> (zval), it would be as slow and inefficient as PHP. The major cost benefit 
> here is that it does strict type inference, and falls back to Variant only 
> when it cannot come up with decent type.
> And yes, one can find offending code that causes the expensive paths. I don't 
> see manual C++ code optimizations as way to go though - because they'd be 
> overwritten by next code build.
>

This smells like something that could benefit from metadata:

/* [return integer] */ function getApparatusId($obj) {
    // body
}
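For comparison, phpDoc blocks already carry this sort of metadata in a form that existing tools parse; a type-inferring compiler could in principle read the @return tag as a hint. The function body here is invented for illustration:

```php
<?php
/**
 * Hypothetical accessor; the docblock is the machine-readable
 * metadata a compiler could use to pick a concrete C++ int return
 * type instead of falling back to Variant.
 *
 * @param  object $obj
 * @return int
 */
function getApparatusId($obj) {
    return (int) $obj->id;
}
```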

 - - -

User question follows:

What can we expect? Will future versions of MediaWiki be "hiphop
compatible"? Will there be a fork or a compatible snapshot? The whole
experiment looks like it will help profile and enhance the engine, but
will it produce a MediaWiki.tar.gz file that we (the users) will be
able to install on our intranets?

Maybe a blog article about your findings would be nice. It might help
people write fast PHP code. And it will scare little children and PHP
programmers with a C++ background.



--
End of the message.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] hiphop progress

2010-03-01 Thread Domas Mituzas
Howdy,

> Most of the code in MediaWiki works just fine with it (since most of
> it is mundane) but things like dynamically including certain files,
> declaring classes, eval() and so on are all out.

There're two types of includes in MediaWiki: ones I fixed for AutoLoader and 
ones I didn't - HPHP has all classes loaded, so AutoLoader is redundant. 
Generally, every include that just defines classes/functions is fine with HPHP; 
it is just some of MediaWiki's startup logic (Setup/WebStart) that depends on 
files being included in a certain order, so we have to make sure HipHop 
understands those includes.
There was some different behavior with file inclusion - in Zend you can say 
require("File.php") and it will try the current script's directory, but if you 
do require("../File.php") it will be resolved relative to the current working 
directory instead.

We don't have any eval() at the moment - and actually there's a mode in which 
eval() works, people are just too scared of it. 
We had some double class definitions (depending on whether certain components 
are available), as well as double function definitions (ProfilerStub vs 
Profiler).
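The double-definition problem can be sketched like this (a simplified, hypothetical version of the ProfilerStub-vs-Profiler situation, not the actual MediaWiki code); which class body exists depends on a runtime check, which an ahead-of-time compiler cannot resolve statically:

```php
<?php
// Simplified sketch: two competing definitions of the same class,
// chosen at runtime. Fine under Zend, awkward for HPHP, which wants
// every class known at compile time. MW_NO_PROFILE is a made-up flag.
if (defined('MW_NO_PROFILE')) {
    class Profiler {
        public function profileIn($fn) { /* no-op stub */ }
        public function callCount() { return 0; }
    }
} else {
    class Profiler {
        private $calls = array();
        public function profileIn($fn) { $this->calls[] = $fn; }
        public function callCount() { return count($this->calls); }
    }
}
```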

One of the major problems is simply the still-incomplete function set that we'd need:

* session - though we could surely work around it by setting up our own Session 
abstraction, the team at Facebook is already busy implementing full support
* xdiff, mhash - the only two calls to them are from DiffHistoryBlob, so getting 
that feature to work is mandatory for production, not needed for testing :) 
* tidy - have to call the binary now

function_exists() is somewhat crippled, as far as I understand, so I had to 
work around certain issues there.
There're some other crippled functions, which we hit through testing... 

It is quite fun to hit all the various edge cases in the PHP language (e.g. 
interfaces may have constants) which are broken in hiphop. 
The good thing is having developers carefully reading/looking at those. Some 
things are still broken; some can be worked around in MediaWiki. 

Some of the crashes I hit are quite difficult to reproduce - it is easier to 
bypass that code for now and come up with good reproduction cases later. 

> Even if it wasn't hotspots like the parser could still be compiled
> with hiphop and turned into a PECL extension.

hiphop provides major boost for actual mediawiki initialization too - while 
Zend has to reinitialize objects and data all the time, having all that in core 
process image is quite efficient. 

> One other nice thing about hiphop is that the compiler output is
> relatively readable compared to most compilers. Meaning that if you

That especially helps with debugging :) 

> need to optimize some particular function it's easy to take the
> generated .cpp output and replace the generated code with something
> more native to C++ that doesn't lose speed because it needs to
> manipulate everything as a php object.

Well, that is not entirely true - if it manipulated everything as a PHP object 
(zval), it would be as slow and inefficient as PHP. The major cost benefit here 
is that it does strict type inference, and falls back to Variant only when it 
cannot come up with a decent type. 
And yes, one can find the offending code that causes the expensive paths. I 
don't see manual C++ code optimizations as the way to go, though - because 
they'd be overwritten by the next code build.
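Domas's point about type inference can be illustrated with a small, invented pair of functions: in the first, every assignment keeps $sum an integer, so the compiler can emit a plain C++ int loop; in the second, the variable holds an int on one path and a string on the other, forcing the boxed Variant representation:

```php
<?php
// Inference-friendly: $sum and $i are always integers, so HPHP can
// (in principle) compile this down to a native int loop.
function sumTo($n) {
    $sum = 0;
    for ($i = 1; $i <= $n; $i++) {
        $sum += $i;
    }
    return $sum;
}

// Inference-hostile: $x is an int on one branch and a string on the
// other, so no single concrete type fits and Variant is the fallback.
function intOrString($flag) {
    if ($flag) {
        $x = 42;
    } else {
        $x = "forty-two";
    }
    return $x;
}
```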

Anyway, there're lots of interesting problems after we get MediaWiki working on 
it - that is, how would we deploy it, how would we maintain it, etc.
Building on a single box takes around 10 minutes, and the image has to be 
replaced by shutting down the old one and starting the new one, not just by 
overwriting the files. 

Domas


Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread Ævar Arnfjörð Bjarmason
On Sun, Feb 28, 2010 at 21:39, David Gerard  wrote:
> On 28 February 2010 21:33, Domas Mituzas  wrote:
>
>> these numbers seriously kick ass. I still can't believe I observe 2000 
>> mediawiki requests/s from a single box ;-)
>
>
> So ... how restricted is HipHop PHP, and what are the hotspots in
> MediaWiki that would most benefit from it?

Most of the code in MediaWiki works just fine with it (since most of
it is mundane), but things like dynamically including certain files,
declaring classes, eval() and so on are all out.

It should be possible to replace all that at the cost of code that's a
bit more verbose.

Even if it wasn't, hotspots like the parser could still be compiled
with hiphop and turned into a PECL extension.

One other nice thing about hiphop is that the compiler output is
relatively readable compared to most compilers. Meaning that if you
need to optimize some particular function it's easy to take the
generated .cpp output and replace the generated code with something
more native to C++ that doesn't lose speed because it needs to
manipulate everything as a php object.



Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread Ævar Arnfjörð Bjarmason
On Sun, Feb 28, 2010 at 21:33, Domas Mituzas  wrote:
>>
>> Nevertheless - a process isn't the same process when it's going at 10x
>> the speed. This'll be interesting.
>
> not 10x. I did concurrent benchmarks for API requests (e.g. opensearch) on 
> modern boxes, and saw:
>
> HipHop: Requests per second:    1975.39 [#/sec] (mean)
> Zend: Requests per second:    371.29 [#/sec] (mean)
>
> these numbers seriously kick ass. I still can't believe I observe 2000 
> mediawiki requests/s from a single box ;-)

Awesome. I did some tryouts with hiphop too before you started overtaking me.

Is this work in SVN yet? Maybe it would be nice to create a branch for
it so that other people can poke at it?


Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread William Pietri
On 02/28/2010 01:33 PM, Domas Mituzas wrote:
>
> not 10x. I did concurrent benchmarks for API requests (e.g. opensearch) on 
> modern boxes, and saw:
>
> HipHop: Requests per second:    1975.39 [#/sec] (mean)
> Zend: Requests per second:    371.29 [#/sec] (mean)
>
> these numbers seriously kick ass. I still can't believe I observe 2000 
> mediawiki requests/s from a single box ;-)
>

Bravo! That's fantastic. Thanks for both the work and the testing.

William



Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread Anthony
On Sun, Feb 28, 2010 at 4:33 PM, Domas Mituzas wrote:

> >
> > Nevertheless - a process isn't the same process when it's going at 10x
> > the speed. This'll be interesting.
>
> not 10x. I did concurrent benchmarks for API requests (e.g. opensearch) on
> modern boxes, and saw:
>
> HipHop: Requests per second:    1975.39 [#/sec] (mean)
> Zend: Requests per second:    371.29 [#/sec] (mean)
>
> these numbers seriously kick ass. I still can't believe I observe 2000
> mediawiki requests/s from a single box ;-)
>

Great job Domas.  It'll be exciting to see the final product.


Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread David Gerard
On 28 February 2010 21:33, Domas Mituzas  wrote:

> these numbers seriously kick ass. I still can't believe I observe 2000 
> mediawiki requests/s from a single box ;-)


So ... how restricted is HipHop PHP, and what are the hotspots in
MediaWiki that would most benefit from it?


- d.



Re: [Wikitech-l] hiphop! :)

2010-02-28 Thread Domas Mituzas
> 
> Nevertheless - a process isn't the same process when it's going at 10x
> the speed. This'll be interesting.

not 10x. I did concurrent benchmarks for API requests (e.g. opensearch) on 
modern boxes, and saw:

HipHop: Requests per second:    1975.39 [#/sec] (mean)
Zend: Requests per second:    371.29 [#/sec] (mean)

these numbers seriously kick ass. I still can't believe I observe 2000 
mediawiki requests/s from a single box ;-)

Domas


Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread Thomas Dalton
On 27 February 2010 16:41, David Gerard  wrote:
> (I'm sure the complexity of templates will go up to compensate, unless
> Tim's parser functions reaper is set down to match, muwahaha.)

Speeding up parsing will reveal a new bottleneck for the devs to fight
the enwiki community over, don't worry about that.



Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread David Gerard
On 28 February 2010 00:30, Roan Kattouw  wrote:
> 2010/2/27 David Gerard :

>> The parser typically takes 2-10 seconds on an uncached en:wp page, so
>> speeding that process up 1000x is, um, HOLY CRAP!

> You're comparing apples with oranges. Domas was testing a simple API
> page info query, which is much more lightweight than a full-blown
> parse involving enwiki's crazy templates.


So I saw from Domas's followup :-)

Nevertheless - a process isn't the same process when it's going at 10x
the speed. This'll be interesting.

(I'm sure the complexity of templates will go up to compensate, unless
Tim's parser functions reaper is set down to match, muwahaha.)


- d.



Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread Roan Kattouw
2010/2/27 David Gerard :
> The parser typically takes 2-10 seconds on an uncached en:wp page, so
> speeding that process up 1000x is, um, HOLY CRAP!
>
You're comparing apples with oranges. Domas was testing a simple API
page info query, which is much more lightweight than a full-blown
parse involving enwiki's crazy templates.

Roan Kattouw (Catrope)



Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread Domas Mituzas
Hi!

> For those of us not familiar with MediaWiki benchmarking, what kind of
> times were you getting without hiphop?

Zend: 

> Domas, how much hacking did you have to do to MediaWiki to get it to
> compile in Hiphop?

Lots. I'm trying to get basic functionality/prototypes to work.
Some changes had to be done to HipHop itself, some had to be done to the 
generated code, and some had to be done to MediaWiki. 

MediaWiki's "run wherever I can" dynamic adaptation to any environment isn't 
too helpful sometimes...

Domas


P.S. Zend: 

Concurrency Level:      1
Time taken for tests:   1.444158 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      138020 bytes
HTML transferred:       109600 bytes
Requests per second:    69.24 [#/sec] (mean)
Time per request:       14.442 [ms] (mean)
Time per request:       14.442 [ms] (mean, across all concurrent requests)
Transfer rate:          92.79 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:    14   14   0.0     14      14
Waiting:       10   12   1.7     14      14
Total:         14   14   0.0     14      14
WARNING: The median and mean for the waiting time are not within a normal deviation
         These results are probably not that reliable.

Percentage of the requests served within a certain time (ms)
  50%     14
  66%     14
  75%     14
  80%     14
  90%     14
  95%     14
  98%     14
  99%     14
 100%     14 (longest request)




Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread David Gerard
On 27 February 2010 19:58, Thomas Dalton  wrote:

> For those of us not familiar with MediaWiki benchmarking, what kind of
> times were you getting without hiphop?


The parser typically takes 2-10 seconds on an uncached en:wp page, so
speeding that process up 1000x is, um, HOLY CRAP!

(Hosting providers sell network bandwidth and disk space; I can see
them putting resources into Hiphopifying the common PHP crapware just
to use 1/1000 the CPU.)

Domas, how much hacking did you have to do to MediaWiki to get it to
compile in Hiphop?


- d.



Re: [Wikitech-l] hiphop! :)

2010-02-27 Thread Thomas Dalton
For those of us not familiar with MediaWiki benchmarking, what kind of
times were you getting without hiphop?

On 27 February 2010 11:37, Domas Mituzas  wrote:
> 
>
> r...@flack:/hiphop/web/phase3/includes# ab -n 100 -c 1 
> 'http://dom.as:8085/phase3/api.php?action=query&prop=info&titles=Main%20Page'
> This is ApacheBench, Version 2.3 <$Revision: 655654 $>
> Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
> Licensed to The Apache Software Foundation, http://www.apache.org/
>
> Benchmarking dom.as (be patient).done
>
>
> Server Software:
> Server Hostname:        dom.as
> Server Port:            8085
>
> Document Path:          
> /phase3/api.php?action=query&prop=info&titles=Main%20Page
> Document Length:        991 bytes
>
> Concurrency Level:      1
> Time taken for tests:   0.389 seconds
> Complete requests:      100
> Failed requests:        0
> Write errors:           0
> Total transferred:      116600 bytes
> HTML transferred:       99100 bytes
> Requests per second:    256.87 [#/sec] (mean)
> Time per request:       3.893 [ms] (mean)
> Time per request:       3.893 [ms] (mean, across all concurrent requests)
> Transfer rate:          292.49 [Kbytes/sec] received
>
> Connection Times (ms)
>              min  mean[+/-sd] median   max
> Connect:        0    0   0.0      0       0
> Processing:     3    4   0.2      4       4
> Waiting:        2    4   0.4      4       4
> Total:          3    4   0.2      4       4
>
> Percentage of the requests served within a certain time (ms)
>  50%      4
>  66%      4
>  75%      4
>  80%      4
>  90%      4
>  95%      4
>  98%      4
>  99%      4
>  100%      4 (longest request)


[Wikitech-l] hiphop! :)

2010-02-27 Thread Domas Mituzas


r...@flack:/hiphop/web/phase3/includes# ab -n 100 -c 1 
'http://dom.as:8085/phase3/api.php?action=query&prop=info&titles=Main%20Page'
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking dom.as (be patient).done


Server Software:
Server Hostname:        dom.as
Server Port:            8085

Document Path:          /phase3/api.php?action=query&prop=info&titles=Main%20Page
Document Length:        991 bytes

Concurrency Level:      1
Time taken for tests:   0.389 seconds
Complete requests:      100
Failed requests:        0
Write errors:           0
Total transferred:      116600 bytes
HTML transferred:       99100 bytes
Requests per second:    256.87 [#/sec] (mean)
Time per request:       3.893 [ms] (mean)
Time per request:       3.893 [ms] (mean, across all concurrent requests)
Transfer rate:          292.49 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     3    4   0.2      4       4
Waiting:        2    4   0.4      4       4
Total:          3    4   0.2      4       4

Percentage of the requests served within a certain time (ms)
  50%      4
  66%      4
  75%      4
  80%      4
  90%      4
  95%      4
  98%      4
  99%      4
 100%      4 (longest request)