On Fri, Aug 30, 2019 at 10:09 PM Krinkle wrote:
> For anything else, it doesn't really work in my experience because PHPUnit
> won't actually provide a valid implementation of the interface. It returns
> null for anything, which is usually not a valid implementation of the
> contract the class
On Thu, Aug 29, 2019 at 5:30 PM Daniel Kinzler wrote:
> But subclassing across module boundaries should be restricted to classes
> explicitly documented to act as extension points. If we could enforce this
> automatically, that would be excellent.
Well, for classes that aren't supposed to be
On Thu, Aug 29, 2019 at 1:02 AM Krinkle wrote:
> What did you want to assert in this test?
In a proper unit test, I want to completely replace all non-value
classes with mocks, so that they don't call the actual class' code.
This way I can test the class under test without making assumptions
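For readers who haven't used this pattern: a minimal sketch of such a test,
with invented stand-in names (PageStore, Page, and PageUpdater are not real
MediaWiki classes):

    use PHPUnit\Framework\TestCase;

    class PageUpdaterTest extends TestCase {
        public function testUpdatePersistsThePage() {
            // createMock() generates an implementation on the fly whose
            // methods all return null (or a configured stub value), so
            // none of PageStore's real code runs in this test.
            $store = $this->createMock( PageStore::class );
            $store->expects( $this->once() )
                ->method( 'save' )
                ->with( $this->isInstanceOf( Page::class ) );

            $updater = new PageUpdater( $store );
            $updater->update( new Page( 'Foo' ) );
        }
    }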
On Wed, Aug 28, 2019 at 7:24 PM Lucas Werkmeister wrote:
> As far as I can tell, it actually strips final tokens from *any* PHP file
> that’s read, including by application code.
Yes, but only if you turn it on, and we'd only turn it on for tests.
> It seems to override the
> standard PHP
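For the archives, the trick under discussion works roughly like this: a
minimal sketch of a file:// stream wrapper that drops T_FINAL tokens on
load, not the actual library's code.

    class StripFinalWrapper {
        public $context;
        private $handle;

        public function stream_open( $path, $mode, $options, &$openedPath ) {
            // Temporarily restore the real wrapper so we can read the file.
            stream_wrapper_restore( 'file' );
            $code = file_get_contents( $path );
            stream_wrapper_unregister( 'file' );
            stream_wrapper_register( 'file', self::class );
            if ( $code === false ) {
                return false;
            }
            // Rebuild the source with every 'final' keyword dropped.
            $stripped = '';
            foreach ( token_get_all( $code ) as $token ) {
                if ( is_array( $token ) && $token[0] === T_FINAL ) {
                    continue;
                }
                $stripped .= is_array( $token ) ? $token[1] : $token;
            }
            $this->handle = fopen( 'php://memory', 'r+' );
            fwrite( $this->handle, $stripped );
            rewind( $this->handle );
            return true;
        }

        public function stream_read( $count ) { return fread( $this->handle, $count ); }
        public function stream_eof() { return feof( $this->handle ); }
        public function stream_stat() { return fstat( $this->handle ); }
        // (A complete wrapper needs a few more methods; omitted here.)
    }

    // In the test bootstrap only:
    stream_wrapper_unregister( 'file' );
    stream_wrapper_register( 'file', StripFinalWrapper::class );

Since the override sits on the file:// wrapper itself, it really does affect
every PHP file loaded afterwards, which is the concern raised above.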
On Tue, Aug 27, 2019 at 11:53 PM Daimona wrote:
> Personally, I don't like these limitations in PHPUnit and the like. IMHO,
> they should never be a reason for changing good code.
I don't like these limitations either, but testing is an integral part
of development, and we need to code in a way
I see that in some classes, like WANObjectCache, most methods are declared
final. Why is this? Is it an attempt to optimize?
The problem is that PHPUnit mocks can't touch final methods. Any ->method()
calls that try to configure them silently do nothing. This makes
writing tests harder.
If
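To make the failure mode concrete, a hypothetical illustration (Cache is an
invented name; the behavior is as described above, in the PHPUnit versions
being discussed):

    class Cache {
        // 'final' prevents the generated mock subclass from overriding this.
        final public function get( $key ) {
            return 'real value';
        }
    }

    // Inside a PHPUnit test case:
    $mock = $this->createMock( Cache::class );
    // Silently ignored: the mock cannot override a final method, so the
    // real get() body runs no matter what is configured here.
    $mock->method( 'get' )->willReturn( 'fake value' );
    $this->assertSame( 'real value', $mock->get( 'k' ) ); // passes, surprisingly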
On Mon, Oct 8, 2018 at 9:04 AM Kunal Mehta wrote:
> In preparation for Wikimedia production switching to PHP 7.2, we need
> to get CI running using 7.2 (and for the rest of the MediaWiki world
> too!). But before we can do that, we'll need 7.1 to be passing first.
Did you mean 7.1 here instead
On Sun, Aug 12, 2018 at 4:48 AM, Brion Vibber wrote:
> While working on some maintenance scripts for TimedMediaHandler I've been
> trying to make it easier to do scripts that use multiple parallel processes
> to run through a large input set faster.
>
> My proposal is a ForkableMaintenance class,
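The underlying mechanism, independent of the proposed class, is plain pcntl
forking. A minimal sketch (process() and $items are invented placeholders):

    $procs = 4;
    $items = [ /* the large input set */ ];
    $pids = [];
    for ( $i = 0; $i < $procs; $i++ ) {
        $pid = pcntl_fork();
        if ( $pid === -1 ) {
            die( "fork failed\n" );
        } elseif ( $pid === 0 ) {
            // Child: handle every $procs-th item, offset by worker number.
            foreach ( $items as $n => $item ) {
                if ( $n % $procs === $i ) {
                    process( $item ); // hypothetical per-item work
                }
            }
            exit( 0 );
        }
        $pids[] = $pid; // parent tracks its children
    }
    foreach ( $pids as $pid ) {
        pcntl_waitpid( $pid, $status );
    }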
On Thu, Aug 9, 2018 at 2:13 AM, MZMcBride wrote:
> Are you sure about that? I think the Code of Conduct Committee _is_
> arguing that it's the use of the word "fuck" that was problematic here. If
> I had written "Why did you do that?!" instead of "What the fuck.", do you
> think I would have had
On Wed, Aug 8, 2018 at 11:29 PM, Amir Ladsgroup wrote:
> 3) Not being able to discuss cases clearly bothers me too, as I can't
> clarify points. But this secrecy is there for a reason. We have cases of
> sexual harassment in Wikimedia events, do you want us to communicate those
> too? And if
On Wed, Aug 8, 2018 at 4:01 PM, Alex Monk wrote:
> So are we supposed to be careful about using 'wtf' now?
I don't think saying "WTF" to someone, especially spelled out, is
usually conducive to eliciting a constructive response from them.
Something like "Could you please explain why you did
On Tue, Jul 31, 2018 at 3:46 PM, Stephan Gambke wrote:
> I agree that there is a trade-off to be made.
> I disagree about expecting code to be put where it is visible to core
> developers. I do appreciate that you go and look for where the
> functionality that you are working on is used outside
On Tue, Jul 31, 2018 at 10:43 AM, Stephan Gambke wrote:
> There are three "probability qualifiers" in that sentence (seems, probable,
> basically). You just don't know if somebody now has to fix their code.
Correct -- I can't know, because they didn't put their code in a place
where code search
On Mon, Jul 30, 2018 at 4:28 PM, Stephan Gambke via Wikitech-l wrote:
> There have been several proposals for removal without deprecation in the past
> few days, each with more impact on existing code - from "virtually certain
> nobody ever used" to "the fix is trivial".
>
> I know it is
Patch to remove: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/447647
Bug: https://phabricator.wikimedia.org/T200247
Existing uses:
https://codesearch.wmflabs.org/search/?q=MagicWord%3A%3AclearCache&i=nope&files=&repos=
I'm in the middle of creating a MagicWordFactory service to replace
the MagicWord
Patch to remove: https://gerrit.wikimedia.org/r/449022
Bug: https://phabricator.wikimedia.org/T200643
Commit message of patch copied here for your convenience:
Remove long-dead OutputPage methods set/getPageTitleActionText()
They were accidentally made non-functional in April 2009 by commit
Patch to remove: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/447629
In the course of writing more tests for OutputPage, I came across the two
methods addMetadataLink() and getMetadataAttribute() that didn't make sense
to me. After a bit of digging, I found they were added in 2004
On Mon, Jul 2, 2012 at 7:36 PM, Rob Lanphier ro...@wikimedia.org wrote:
That plan may be more conservative than we need to be, given it's been
enabled on mediawiki.org for so long. At the time Aryeh wrote that,
the feature hadn't been as well tested as it is now. That's not to
say that we
On Sat, Sep 3, 2011 at 12:33 AM, Rob Lanphier ro...@wikimedia.org wrote:
I generally suspect that a standard index is going to be a waste for
the most urgent uses of this. It will rarely be interesting to search
for common hashes between articles. The far more common case will be
to search
On Sat, Sep 3, 2011 at 1:45 PM, Jeroen De Dauw jeroended...@gmail.com wrote:
> What's the reason for this line? Why truncate after 500 chars?
SQL queries can be extremely, extremely long, running to many
kilobytes or even a megabyte plus. Platonides is right that one major
culprit would be saving
On Tue, Aug 16, 2011 at 7:56 PM, Brion Vibber br...@pobox.com wrote:
I'm not entirely convinced it's necessary at this stage though; HTML 5 draft
spec has some wishy-washy language about obsolete non-conforming features
that authors must not use[1], but I'd be a bit surprised if browsers are
On Tue, Aug 9, 2011 at 9:50 PM, John Elliot j...@jj5.net wrote:
> I wasn't sure if the empty ul was valid HTML5, or if the validator
> wasn't strict enough about it yet. In any event, I'm happy to take your
> word for it.
You don't have to:
"Content model: Zero or more li elements."
(Sorry if this winds up getting to the list twice, resending because I
think the first got lost in a moderation queue or something.)
On Thu, Aug 4, 2011 at 2:10 AM, Asher Feldman afeld...@wikimedia.org wrote:
From the orig post: "Recent Intel CPU has a feature called
On Fri, Jul 29, 2011 at 9:36 AM, Philip Tzou philip@gmail.com wrote:
> Here is the test case:
> http://zh.wikipedia.org/wiki/User:PhiLiP/sourcetest
> So, is it a bug or a feature? If it's a feature, how can I put source
> content into dd in every case?
Use explicit dl, dd, and dt. All the list
On Fri, Jul 29, 2011 at 1:10 PM, Philip Tzou philip@gmail.com wrote:
> But without Tidy support, the source tag would break the dd tag, leave
> the dd unclosed, and ruin everything.
source is a magic tag that's interpreted during parsing and
converted to HTML. Tidy only sees the resulting HTML, and
On Sat, Jul 9, 2011 at 3:49 AM, Niklas Laxström
niklas.laxst...@gmail.com wrote:
> It exists and it's called the data-sort-value attribute. And it's already
> live on Wikipedia as far as I can see.
data-* attributes are only valid in HTML5 and will not work until
$wgHtml5 is set to true. As has been
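At the time, that meant a one-line change in LocalSettings.php (the setting
has since been removed from MediaWiki, which now always emits HTML5):

    # LocalSettings.php, on a MediaWiki of that era:
    $wgHtml5 = true; # switch output to the HTML5 doctype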
On Wed, Jul 6, 2011 at 11:27 PM, Alexander alx...@gmail.com wrote:
> IIRC, all modern browsers support hash linking to any element with an id
> attribute.
Where "modern" means something like "since IE5", yes. All browsers we
care about even slightly support linking to ids, and most browsers we
don't
On Wed, Jun 15, 2011 at 8:46 AM, Alec Conroy alecmcon...@gmail.com wrote:
> We could directly ask them to tell us, but upon reflection, the
> information is already hidden in our database. A multilingual user is
> one that actively edits two projects of different languages.
That doesn't follow.
On Wed, Jun 15, 2011 at 10:34 AM, Alec Conroy alecmcon...@gmail.com wrote:
> Is there an easy way to run this:
> For each of the 86,000 'active users':
> Store a list for their edit counts on each project they've edited
That's actually a fairly small dataset, and it would get us all the
data
On Fri, Jun 10, 2011 at 6:16 AM, Robert Stojnic rainma...@gmail.com wrote:
Google is not aware of this either. It works for certain queries like
wikipedia (probably because many people misspell it in Cyrillic for
fun), but try a more general query (e.g. университз оф оџфорд), and it
won't
On Mon, Jun 6, 2011 at 12:09 AM, MZMcBride z...@mzmcbride.com wrote:
Aryeh mentioned that Wikimedia wikis still aren't using HTML5. I know it was
enabled a few times and is currently disabled. As far as I remember, the
biggest issue was third-party tools and such, which seemed solvable by
On Sun, Jun 5, 2011 at 6:22 PM, Platonides platoni...@gmail.com wrote:
If the user has a stub threshold different than 0, SELECT
page_namespace,page_title FROM pagelinks JOIN page ON ... WHERE
pl_from=$this->mArticleId AND page_len < $user->getStubThreshold();
For each of those stub pages: $a =
On Sun, Jun 5, 2011 at 2:25 PM, MZMcBride z...@mzmcbride.com wrote:
> I think he's talking about changing the default value of
> $wgDefaultUserOptions's stubthreshold on a per-wiki basis. It's currently
> 0 everywhere, as far as I know.
We could do that if we wanted to disable the parser cache for
On Sun, Jun 5, 2011 at 4:46 PM, Niklas Laxström
niklas.laxst...@gmail.com wrote:
> Why don't we just add the numbers to html and let JavaScript make
> sense out of them?
We can't put the numbers in any form in the parser cache, because
they'll become outdated as soon as the length of the linked
On Fri, Jun 3, 2011 at 4:02 PM, David Gerard dger...@gmail.com wrote:
http://schema.org/
An initiative by Google, Yahoo and Bing to make a tag language to make
things more findable in search engines.
Is there anything in this for us? schema.org tags in templates?
Presumably this would
On Thu, Jun 2, 2011 at 10:22 PM, Tim Starling tstarl...@wikimedia.org wrote:
> No, I think extension updates can be backported more regularly, as
> long as the changes are tested, and the potential impact is limited.
What about core updates that are tested and have limited potential
impact? Like,
2011/6/3 Jon Harald Søby jhs...@gmail.com:
> The only reason I can see for not allowing embedding is that
> embedding would be promoting YouTube
Embedding YouTube videos in Wikimedia content would send IP addresses
and other information about Wikimedia users to Google. This is
against Wikimedia's
On Wed, Jun 1, 2011 at 11:50 PM, MZMcBride z...@mzmcbride.com wrote:
I'm still trying to understand the nature of this problem. I think that's
what's bothering me the most at the moment. It's frustrating that I still
can't quite figure out exactly what the issue is [with code deployment being
On Wed, Jun 1, 2011 at 12:38 PM, Rob Lanphier ro...@wikimedia.org wrote:
The genesis of the 72-hour idea was our discussion about what is
stopping us from pre-commit review. The set of us in the office
discussing this felt it was pretty much just a tools issue; from a
policy point of view, we
On Wed, Jun 1, 2011 at 4:06 PM, Brion Vibber br...@pobox.com wrote:
> This isn't ready for core yet -- code looks a bit dense and we don't
> understand some of what it's doing yet. Can you help explain why you coded
> it this way and add some test cases?
IMO, this is totally fine as a reason to
On Wed, Jun 1, 2011 at 5:02 PM, Brion Vibber br...@pobox.com wrote:
> When someone looks at your commit within ~72 hours and reverts it because
> nobody's yet managed to figure out whether it works or not and it needs more
> research and investigation... what was the reason for the revert?
Because
On Fri, May 13, 2011 at 7:57 PM, Daniel Friesen
li...@nadir-seen-fire.com wrote:
Doesn't look that bad...
- Some arcane maintenance scripts.
- Some .js that can't interact with Title working with urls.
- The expected User, Title, Parser, file related, etc... core api stuff
that's easy to
On Fri, May 13, 2011 at 3:31 AM, M. Williamson node...@gmail.com wrote:
I still don't think page titles should be case sensitive. Last time I asked
how useful this really was, back in 2005 or so, I got a tersely-worded
response that we need it to disambiguate certain pages. OK, but how many
On Fri, May 6, 2011 at 10:55 AM, Bryan Tong Minh
bryan.tongm...@gmail.com wrote:
> Can we stop discussing this issue? I believe that most MediaWiki
> developers are in fact not interested in changing the status quo with
> regards to licensing, so there is no point in discussing it.
Agreed.
On Fri, May 6, 2011 at 5:33 PM, Heiko Nardmann
heiko.nardm...@itechnical.de wrote:
> Is it possible to have a version control system, e.g. Subversion or Git, as
> the backend for MediaWiki instead of a DBMS?
Unlikely.
> If not possible: is this due to data model mismatches or just missing man
On Wed, May 4, 2011 at 8:04 PM, Ryan Lane rlan...@gmail.com wrote:
> See the GNU FAQ on this:
> http://www.gnu.org/licenses/gpl-faq.html#LinkingWithGPL
> If you link, you must use a GPL-compatible license.
Yes, but that's not specific to linking. Nothing in the license
proper distinguishes
On Tue, May 3, 2011 at 7:45 PM, Ryan Lane rlan...@gmail.com wrote:
> You'd have an issue with a proprietary application using the wikitext
> parser as a library? You really find the LGPL completely unacceptable
> in this situation?
I prefer to license my own code under GPL instead of the LGPL,
On Tue, May 3, 2011 at 10:25 AM, Paul Houle p...@ontology2.com wrote:
Note that there is a PHP tokenizer built into PHP which makes it
straightforward to develop tools like this in PHP:
http://php.net/manual/en/book.tokenizer.php
A practical example can be found here
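A quick taste of that tokenizer API, for anyone who has not used it:

    $tokens = token_get_all( '<?php final class Foo {}' );
    foreach ( $tokens as $token ) {
        if ( is_array( $token ) ) {
            // Array tokens are [token id, source text, line number].
            echo token_name( $token[0] ) . ': ' . $token[1] . "\n";
        } else {
            echo $token . "\n"; // single-character tokens like '{'
        }
    }
    // Prints T_OPEN_TAG, T_FINAL, T_WHITESPACE, T_CLASS, and so on.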
On Tue, May 3, 2011 at 2:11 PM, Domas Mituzas midom.li...@gmail.com wrote:
> Which of course allows me to fork the thread and ask why does MediaWiki have
> to be GPL licensed.
Because all it takes is one developer with substantial contributions
who doesn't want to relicense, and then you have to
On Fri, Apr 29, 2011 at 9:05 AM, Philip Tzou philip@gmail.com wrote:
> It is just a generator of ZhConversion.php. Running makefile.py would
> generate a new ZhConversion.php.
So the only people who need the file would be developers, who will not
want to use packaged versions anyway, and so it
On Fri, Apr 15, 2011 at 9:41 PM, Brion Vibber br...@pobox.com wrote:
But it's also true that there were other bugs buried in code that had been
changed 8-9 months previously, making it harder to track them down -- and
much more difficult to revert them if a fix wasn't obvious. I certainly
On Wed, Apr 6, 2011 at 3:15 AM, Alex Brollo alex.bro...@gmail.com wrote:
I saved the HTML source of a typical Page: page from it.source, the
resulting txt file having ~ 28 kBy; then I saved the core html only, i.e.
the content of div class="pagetext", and this file has 2.1 kBy; so
there's a
On Tue, Apr 5, 2011 at 9:02 AM, Magnus Manske
magnusman...@googlemail.com wrote:
Yes, it doesn't do template/variable replacing, and it's probably full
of corner cases that break; OTOH, it's JavaScript running in a
browser, which should make it much slower than a dedicated server
setup running
On Mon, Apr 4, 2011 at 9:28 AM, Happy-melon happy-me...@live.com wrote:
If people think it would be
better as a special page we'd make
http://foo.example.com/w/index.php?title=Bar&action=edit a hard redirect to
Special:Edit/Bar; that has the significant advantage of being able to be
formed as
On Mon, Mar 28, 2011 at 10:47 PM, Ryan Kaldari rkald...@wikimedia.org wrote:
In case no one has mentioned this, changing the DOCTYPE has a pretty
huge effect on how CSS gets rendered. Wikimedia's current DOCTYPE (XHTML
transitional) maps to "almost standards" mode or "limited quirks" mode
in
On Mon, Mar 28, 2011 at 9:33 PM, Tim Starling tstarl...@wikimedia.org wrote:
Yes, that's true, and that's part of the reason I'm flagging this
change on the mailing list. Domas says that the HipHop team is working
on PHP 5.3 support, so maybe the issue won't come up. But yes, in
principle, I
On Tue, Mar 29, 2011 at 2:06 PM, Ryan Kaldari rkald...@wikimedia.org wrote:
Your analysis of the effects of the DOCTYPE change is not correct. As
Entlinkt tried to point out at the HTML5 page on mediawiki.org, inline
images, inline-blocks and inline-tables can also be affected (even outside
On Fri, Mar 25, 2011 at 11:56 PM, Mark A. Hershberger
mhershber...@wikimedia.org wrote:
As far as I can see, the main reason that people think code review
works better under GIT is because the committer is responsible for
getting xyr[2] code reviewed *before* it is merged. The committer is
On Sun, Mar 27, 2011 at 11:21 PM, Tim Starling tstarl...@wikimedia.org wrote:
> Facebook now write their PHP code to target HipHop exclusively, so by
> trying to write code that works on both platforms, we'll be in new
> territory, to some degree. Maybe that's scary, but I think it can work.
What
On Mon, Mar 28, 2011 at 10:47 AM, Tim Starling tstarl...@wikimedia.org wrote:
> We can use features from both, using function_exists(), like what we
> do now with PHP modules.
Well, yes, if there's some reasonable fallback. It doesn't work for
features that are useless if you have to write a
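The pattern being referred to, roughly as MediaWiki used it for the optional
fss string-replacement extension (a sketch from memory, not the exact code):

    // $pairs is a search => replacement map.
    if ( function_exists( 'fss_prep_replace' ) ) {
        // Fast path: the fss extension is loaded.
        $fss = fss_prep_replace( $pairs );
        $text = fss_exec_replace( $fss, $text );
    } else {
        // Portable fallback: strtr() works everywhere, just slower.
        $text = strtr( $text, $pairs );
    }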
On Mon, Mar 28, 2011 at 4:31 PM, Rob Lanphier ro...@wikimedia.org wrote:
Right now, we have a system whereby junior developers get to commit whatever
they want, whenever they want. Under the system you outline, the only
remedy we have to the problem of falling behind is to throw more senior
On Fri, Mar 25, 2011 at 3:19 PM, Neil Kandalgaonkar ne...@wikimedia.org wrote:
Long story short, we had this discussion in IRC... some people find the
concept of AJAX login really alarming from a security perspective, but I
think there could (COULD) be some ways to compromise there. There is a
On Tue, Mar 22, 2011 at 10:46 PM, Tim Starling tstarl...@wikimedia.org wrote:
The tone is quite different to one of the first things I read about
Mercurial:
Oops! Mercurial cut off your arm!
Don't randomly try stuff to see if it'll magically fix it. Remember
what you stand to lose, and set
On Thu, Mar 24, 2011 at 2:00 PM, Roan Kattouw roan.katt...@gmail.com wrote:
> 2) Resolving conflicts between patches is done by reviewers when they
> apply them instead of being conveniently outsourced to the
> author-committers
If there's a conflict, the reviewer can ask the patch submitter to
On Thu, Mar 24, 2011 at 9:22 AM, Joseph Roberts
roberts.jos...@ntlworld.com wrote:
> Ah, cool. If no one minds, shouldn't [[mw:HTML5]] be edited to
> reflect what's in the mainstream?
[[mw:HTML5]] only really covers the use of HTML5 markup, not other
HTML5 features. The idea was to discuss the
On Thu, Mar 24, 2011 at 9:27 PM, Happy-melon happy-me...@live.com wrote:
I think Roan hits it on the nose. Most of the problems Ashar and Neil raise
are flaws in our code review process, not flaws in the tools we use *to do*
code review. I actually think that CodeReview works quite well,
On Tue, Mar 22, 2011 at 12:05 PM, Joseph Roberts
roberts.jos...@ntlworld.com wrote:
> What is the current consensus on HTML5?
> http://www.mediawiki.org/wiki/HTML5
> Are we going to fully support it and use as many features as we can, or
> are we going to keep just using JavaScript alternatives?
We're
On Mon, Mar 21, 2011 at 8:21 AM, Daniel Kinzler dan...@brightbyte.de wrote:
In the hope I'm not clubbing a diseased donkey, I'd like to share an idea
I ran across: we could use the RC4-128 cypher for secure.wikimedia.org,
instead of AES256. RC4 is reportedly a lot faster (3 to 4 times the
On Wed, Mar 16, 2011 at 6:38 AM, Roan Kattouw roan.katt...@gmail.com wrote:
> Normal shell users can execute all but one of the steps required for
> wiki creation: root access is needed to create the DNS entry for the
> new subdomain.
Why doesn't Wikimedia just set up a wildcard domain for
On Thu, Mar 17, 2011 at 8:49 PM, Platonides platoni...@gmail.com wrote:
Aryeh Gregor wrote:
On Wed, Mar 16, 2011 at 6:38 AM, Roan Kattouw roan.katt...@gmail.com wrote:
Normal shell users can execute all but one of the steps required for
wiki creation: root access is needed to create the DNS
On Wed, Mar 16, 2011 at 8:59 PM, Daniel Friesen
li...@nadir-seen-fire.com wrote:
> Think again, ie8 is the next ie6... knew that before someone blogged
> about it:
> http://infrequently.org/2010/10/ie-8-is-the-new-ie-6/
IE8 will not survive anywhere close to as long as IE6 did, for two reasons:
1)
On Fri, Mar 11, 2011 at 5:20 AM, Mingli Yuan mingli.y...@gmail.com wrote:
I think I cannot enable the $wgAllowMicrodataAttributes setting in
LocalSettings.php, because my project is hosted on Chinese Wikipedia; I
think I have to wait for a decision from the Foundation.
For now, I will custom
On Fri, Mar 11, 2011 at 3:40 PM, David Gerard dger...@gmail.com wrote:
> Weren't *all* bots told a while ago to use the API or risk random
> arbitrary breakage? (Or am I thinking of something else?)
All bots have always been told that, but we're not serious enough
about enforcing it to actually
On Thu, Mar 10, 2011 at 6:04 AM, Mingli Yuan mingli.y...@gmail.com wrote:
I am trying to make a wiki template to render some metadata of an article,
and I want to use HTML5 Microdata to make this metadata machine-readable.
But the problem I found is that the XHTML Sanitizer in MediaWiki removes
On Thu, Mar 3, 2011 at 2:37 PM, William Allen Simpson
william.allen.simp...@gmail.com wrote:
> That's a terrible idea. I routinely edit (and do everything else)
> without javascript running. There's really no need for it.
With JavaScript, we could skip the interstitial page when you log in
and
On Sun, Feb 20, 2011 at 12:42 PM, Ryan Lane rlan...@gmail.com wrote:
> I don't think we should encourage people to run trunk in production.
I think we should encourage people to run trunk in production *if*
they're planning on keeping up with it, will report or fix problems,
and are aware that
On Wed, Feb 16, 2011 at 11:46 PM, MZMcBride z...@mzmcbride.com wrote:
> If there's a way to improve the general login workflow (AJAX, CORS,
> whatever), I'd like to see that implemented before this checkbox is ripped
> out.
Doesn't seem hard. Why don't we set an extra cookie when you log in,
let's
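A sketch of that idea (the cookie name and lifetime here are invented for
illustration):

    // On successful login over HTTPS, mark this browser as belonging
    // to an account holder.
    setcookie(
        'hasAccount',             // hypothetical cookie name
        '1',
        time() + 365 * 24 * 3600, // remember it for a year
        '/',
        '.wikipedia.org'
    );
    // Plain-HTTP page views can then check for the cookie and send the
    // user to the HTTPS login form before any password is typed.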
On Tue, Feb 15, 2011 at 12:58 PM, Q overlo...@gmail.com wrote:
On 2/15/2011 11:34 AM, Anthony Ventresque (Dr) wrote:
Wikipedia... is that a relevant answer to your remark?
There's about 284 of those, you'll have to be a bit more specific.
Anyone who says "Wikipedia", in English, in a context
On Mon, Feb 14, 2011 at 2:18 AM, Siebrand Mazeland s.mazel...@xs4all.nl wrote:
With regards to i18n support it is not clear to me how translatewiki staff
would deal with 100+1 commits to different repos every day if core and
extensions were each in individual repos. Can you please explain
On Sat, Feb 12, 2011 at 10:26 PM, Chad innocentkil...@gmail.com wrote:
Yeah, secure.wikimedia.org's URL scheme isn't really friendly
to outsiders. Historically, this is because SSL certificates are
expensive, and there just wasn't enough money in the budget
to get more of them for the
On Sun, Feb 13, 2011 at 8:11 PM, Mark A. Hershberger
mhershber...@wikimedia.org wrote:
This workflow is different from a DVCS. Take Linux, for example. Linus
pulls code from several lieutenants. Anyone can set up a branch of the
Linux source code and commit to it, but to get Linus to ship
On Thu, Feb 3, 2011 at 3:50 PM, George Herbert george.herb...@gmail.com wrote:
> We have a few months, but by the end of 2012, any major site needs to
> be serving IPv6.
Unlikely. ISPs are just going to start forcing users to use NAT more
aggressively, use tunnelling, etc. No residential client
On Thu, Feb 3, 2011 at 5:01 PM, George Herbert george.herb...@gmail.com wrote:
> You're making assumptions here that the residential ISPs in the US and
> Asia have stated aren't true...
I'm awfully sure the assumption customers will not pay for an
Internet connection that only connects to IPv6
On Thu, Feb 3, 2011 at 6:29 PM, George Herbert george.herb...@gmail.com wrote:
> There won't be much choice when the ISPs run out of IPv4 space to
> allocate new users.
> As I said - we'll see it in Asia soon enough, and then the US down the
> road a bit longer.
You mean, when they have so little
On Mon, Jan 31, 2011 at 4:55 PM, Trevor Parscal tpars...@wikimedia.org wrote:
> Adding yet another discrete parsing step is the reverse of what a lot of
> people hoping to clean up wikitext are heading towards.
What system do you propose that would retain the performance benefits
of this
On Thu, Jan 27, 2011 at 1:58 AM, Dmitriy Sintsov ques...@rambler.ru wrote:
Surely it should. In a very similar manner, I've had trouble with a
local MediaWiki installation (old 1.14, haven't checked with newer
ones): when I created user accounts and sent them out via email,
people were
On Mon, Jan 24, 2011 at 8:51 PM, Brion Vibber br...@pobox.com wrote:
So from the August 2009 survey on English Wikipedia, that leaves 18 email
addresses out of over 3 million listed as confirmed, of which a few *might*
be deliverable addresses that could not be fixed by the user tweaking them
On Wed, Jan 26, 2011 at 3:09 PM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
Firefox 4b9 seems not to implement the latest spec update, so
foo@localhost doesn't work, but that should be an easy fix. If
someone is willing to poke at their source code, I imagine it'd be
possible to get
On Sun, Jan 23, 2011 at 4:24 PM, Maury Markowitz
maury.markow...@gmail.com wrote:
I used to think that too. Then I looked at the examples on the wiki
page on the issue. Although I find TeX rather opaque, a much worse
issue is obscurity through verbosity, which not only makes the formula
On Wed, Jan 19, 2011 at 4:15 PM, Anthony wikim...@inbox.org wrote:
> No, the question is why the relevant code is totally unrelated.
Well, you might ask why we don't just (selectively) dump the page,
revision, and text tables instead of doing XML dumps -- it seems like
it would be much simpler --
On Wed, Jan 19, 2011 at 11:56 PM, Carl (CBM) cbm.wikipe...@gmail.com wrote:
The ideal solution for Wikipedia would be to move to a system in which
users with relatively modern browsers don't see images at all. There
is already a candidate for that system: MathJax. This has extensive
browser
On Fri, Jan 21, 2011 at 10:49 AM, Andrew Garrett agarr...@wikimedia.org wrote:
> This is really unnecessary and unhelpful on a public mailing list. I
> think we'd all be better off if snark like this were kept to private
> channels.
Agreed. Or better yet, not said at all. Since we evidently no
On Wed, Jan 19, 2011 at 3:59 AM, Anthony wikim...@inbox.org wrote:
> Why isn't this being used for the dumps?
Well, the relevant code is totally unrelated, so the question is sort
of a non sequitur. If you mean "Why don't we have incremental
dumps?", I guess Ariel is the person to ask. I'm
On Mon, Jan 17, 2011 at 9:12 PM, Roan Kattouw roan.katt...@gmail.com wrote:
Wikimedia doesn't technically use delta compression. It concatenates a
couple dozen adjacent revisions of the same page and compresses that
(with gzip?), achieving very good compression ratios because there is
a huge
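A sketch of why concatenating first wins: adjacent revisions are nearly
identical, so compressing them together removes most of the redundancy
(getAdjacentRevisions() is an invented helper):

    $revisions = getAdjacentRevisions( $pageId, 20 ); // hypothetical fetch
    $separate = 0;
    foreach ( $revisions as $rev ) {
        $separate += strlen( gzdeflate( $rev ) );
    }
    $together = strlen( gzdeflate( implode( "\0", $revisions ) ) );
    printf( "separately: %d bytes, concatenated: %d bytes\n",
        $separate, $together );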
On Sun, Jan 16, 2011 at 7:16 PM, Magnus Manske
magnusman...@googlemail.com wrote:
There is the question of what browsers/versions to test for. Should I
invest large amounts of time optimising performance in Firefox 3, when
FF4 will probably be released before WYSIFTW, and everyone and their
On Sun, Jan 16, 2011 at 7:12 PM, Happy-melon happy-me...@live.com wrote:
I don't entirely understand the point of this. The plan seems to be get
a large enough fraction of 'the internet' to make a change which breaks for
some people all at the same time, so that those people get angry with the
On Mon, Jan 17, 2011 at 5:55 AM, Alex Brollo alex.bro...@gmail.com wrote:
Before I dig a little more into wiki mysteries, I was absolutely sure that
wiki articles were stored into small pieces (paragraphs?) so that a small
edit into a long long page would take exactly the same disk space as a
On Wed, Jan 12, 2011 at 6:51 PM, Tim Starling tstarl...@wikimedia.org wrote:
I think this is an exaggeration.
When I optimise the parse time of particular pages, I don't even use
my sysadmin access. The best way to do it is to download the page with
all its templates using Special:Export, and
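For reference, such an export request looks something like this (parameter
names from memory, so treat them as an assumption):

    https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Some_page&templates=1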
On Tue, Jan 11, 2011 at 3:04 AM, Alex Brollo alex.bro...@gmail.com wrote:
Just another question about resources. I can get the same result with an
AJAX call or with a #lst (labeled section transclusion) call. Which one is
lighter for servers in your opinion? Or - are they more or less
On Tue, Jan 11, 2011 at 4:05 PM, David Gerard dger...@gmail.com wrote:
> http://blog.chromium.org/2011/01/html-video-codec-support-in-chrome.html
> Chromium will support only Theora and VP8, Chrome to follow. Same as Firefox
> 4.
Chromium has never supported H.264 (unless you hack the source code