Okay. Thanks for making the extra effort.
-Robert
On Thu, Oct 29, 2015 at 6:05 AM, MZMcBride wrote:
> Robert Rohde wrote:
> >Which, after substituting "display:none;" I think translates directly to
> >the regex search:
> >
> >insource:/style[ ]*=[ ]*\&
I am happy to agree that searching the XML should be better than the local
search tool, but I still find these numbers hard to reconcile.
-Robert Rohde
[1] https://github.com/mzmcbride/dump-reports/blob/a8dbbcb3/xmldumpreader.py
[2] https://en.wikipedia.org/w/index.php?title=Special%3ASearch&profile=advanced
[4] https://en.wikipedia.org/w/index.php?title=Special:Search&search=insource%3A%22style%3D%5C%22display%3A%20none%3B%5C%22%22&fulltext=Search&profile=all
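For anyone repeating this check offline against dump wikitext, the (truncated) insource: pattern quoted above corresponds roughly to a regex like the following. This is my approximation, not the exact query from the thread; the IGNORECASE flag and the function name are my additions:

```python
import re

# Approximation of the insource: pattern quoted above: an inline style
# attribute whose quoted value contains "display:none" (whitespace-tolerant).
HIDDEN_RE = re.compile(r'style[ ]*=[ ]*"[^"]*display[ ]*:[ ]*none', re.IGNORECASE)

def count_hidden_styles(wikitext):
    """Count inline display:none style attributes in one page's wikitext."""
    return len(HIDDEN_RE.findall(wikitext))
```

Run over the pages-articles XML dump, this gives per-page counts that can be compared against the insource: totals.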
On Tue, Oct 27, 2015 at 2:32 PM, MZMcBride wrote:
> Robert Rohde wrote:
> >On Mon, Oct 26, 2015
I'm not sure what your bug is, but those counts are way too high to be
accurate reflections of the wikitext in the main namespace on enwiki.
-Robert Rohde
On Mon, Oct 26, 2015 at 2:13 AM, MZMcBride wrote:
> Hi.
>
> I was curious about the prevalence and types of inline st
erent in execution, I would suggest that one needs a significant test
suite of complex pages in order to judge how bad the collateral damage is
likely to be, and ideally some set of tools to help editors fix it.
-Robert Rohde
On Thu, Aug 13, 2015 at 7:51 AM, Brian Wolff wrote:
> On 8/12
On Tue, Apr 28, 2015 at 10:31 PM, Jon Robson wrote:
>
> Any community members interested in helping out here? I'm very sad the
> increase in errors wasn't picked up sooner... :-/
>
What does event_action = 'error' actually mean?
If the action is stopped by the AbuseFilter, is that counted as an
unt)
Assuming the above numbers are the correct totals, then it would seem that:
* All edits up 54%
* Edits from logged in users down 20%
* Errors up 510%
* First edits by logged in users down 43%
* New account creation up by 92%
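The bullet-point figures above are plain relative percent changes; as a sanity-check aid (illustrative only, since the underlying counts are truncated in this excerpt), the computation is just:

```python
def percent_change(old, new):
    """Relative change from old to new, as a rounded whole percent."""
    return round(100.0 * (new - old) / old)
```

For example, a count rising from 100 to 154 gives +54, matching the "all edits up 54%" line.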
-Robert Rohde
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
ld be automatically detected and blocked.
-Robert Rohde
atterns that might capture most of the bots but affect far fewer
legitimate editors.
-Robert Rohde
On Wed, Dec 3, 2014 at 8:47 PM, Daniel Friesen
wrote:
> On 2014-12-03 8:35 PM, Robert Rohde wrote:
> > However, captchas might be useful if used in conjunction with simple
> > behavioral analysis, such as rate limiters. For example, if an IP is
> > creating a lot of acc
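The behavioral heuristic sketched in the quote above (challenge an IP only once it trips a rate threshold) can be illustrated with a sliding-window counter. All thresholds and names here are invented for illustration, not drawn from MediaWiki's actual ping limiter:

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Flag an IP once it performs too many actions within a time window."""
    def __init__(self, max_actions=5, window_seconds=3600):
        self.max_actions = max_actions
        self.window = window_seconds
        self.events = defaultdict(deque)

    def record(self, ip, now=None):
        """Record one action; return True if the IP should be challenged."""
        now = time.time() if now is None else now
        q = self.events[ip]
        q.append(now)
        # Drop events that have fallen outside the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_actions
```

Only IPs that exceed the budget would then be shown a captcha, leaving ordinary editors untouched.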
, but that is already true. Though reducing the prevalence of the
captcha may increase the volume of spam by some small measure, I think it
is more important that we stop erecting so many hurdles to new editors.
-Robert Rohde
On Wed, Dec 3, 2014 at 3:08 PM, Ryan Kaldari wrote:
> The main re
improvement towards the
usability of our largest pages.
-Robert Rohde
On Wed, Dec 3, 2014 at 3:23 PM, Tomasz Finc wrote:
> This is fantastic. Great job team and do put up a blog post about this.
>
> --tomasz
>
> On Wed, Dec 3, 2014 at 9:03 AM, Giuseppe Lavagetto
> wrote:
ide a pretty visual interface. One also has to
find ways to make that interface efficient and useful across a wide
spectrum of different user needs.
-Robert Rohde
ly still
causes higher levels of accidental harm when operated by unfamiliar
users than I would personally be comfortable with. And of course,
there is still a long way to go in terms of feature completeness and
usability before we can really discuss Erik's dream of having VE be
perceived as better
-07-2013, Mon, at 20:17 -0700, Robert Rohde
> wrote:
>> Various parts of Mediawiki will apply tags to specific edits in recent
>> changes and histories.
>>
>> For example, the recently introduced Visual Editor is adding Tag:
>> VisualEditor to all of i
any
mention of Tags, and I don't recall noticing them during any of the
times I've worked with dump files in the past.
-Robert Rohde
On Wed, Jun 5, 2013 at 8:22 PM, MZMcBride wrote:
> Is there a better way to write/debug Lua modules? Any help writing these
> modules (or simply getting a working sort module on Meta-Wiki) would be
> appreciated.
I edited your code to make the Sort Module do what I think you
intended it to do.
ts use. While obviously a bit
late in the game to be starting now, I think many people would welcome a
discussion on wiki of what best practices for the use of wikidata ought to
look like, and I'm sure your input could be valuable to that discussion
e
overhead associated with launching each #invoke instance.
-Robert Rohde
save / preview.
This does not seem to affect old equations, where the caching
continues to work; only the caching of recently added equations is
broken.
Filed as bugzilla 45973.
-Robert Rohde
On Sat, Mar 9, 2013 at 5:21 PM, Steve Summit wrote:
> David Gerard wrote:
>> This pag
f others. And there is work underway on a
number of the more complex overhauls (e.g. {{cite}}, {{convert}}).
However, it would be nice to identify problematic templates that may
be less obvious.
-Robert Rohde
aka Dragons_flight
the full-size resolution.
That said, I agree that finding a way to expire old thumbs, or rarely
accessed thumbs, is definitely a good idea.
-Robert Rohde
s, it is pretty easy to
open more simultaneous downloads than the limit will allow.
I suspect that is what happened here.
-Robert Rohde
On Sat, Sep 17, 2011 at 4:56 PM, Anthony wrote:
> On Sat, Sep 17, 2011 at 6:46 PM, Robert Rohde wrote:
>> Is there a good reason to prefer SHA-1?
>>
>> Both have weaknesses allowing one to construct a collision (with
>> considerable effort)
>
> Considerable effor
25% faster to compute.
Personally I've tended to view MD5 as more than good enough in offline analyses.
-Robert Rohde
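For offline dump analyses like those discussed above, both digests are one-liners via Python's hashlib; the choice is a speed/robustness trade-off rather than an API difference (this snippet is my illustration, not from the thread):

```python
import hashlib

def revision_digests(text):
    """Return (md5, sha1) hex digests of a revision's text."""
    data = text.encode("utf-8")
    return hashlib.md5(data).hexdigest(), hashlib.sha1(data).hexdigest()
```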
though, the
trade-off is that you have to look at many diffs to reconstruct the
page's content. Given that hard disks are cheap, the biggest
advantage is probably for people whose main object of study is the
diffs themselves.
-Robert Rohde
On Tue, Oct 26, 2010 at 8:25 AM, Ariel T. Glenn wrote:
> On 26-10-2010, Tue, at 16:25 +0200, Platonides wrote:
>> Robert Rohde wrote:
>> > Many of the things done for the statistical analysis of database dumps
>> > should be suitable for parallelizatio
ssible and completable in a month.
-Robert Rohde
[1] http://www.mediawiki.org/wiki/Manual:CompressOld.php
On Sun, Oct 24, 2010 at 5:42 PM, Aryeh Gregor
wrote:
> This term I'm taking a course in high-performance computing
> <http://cs.nyu.edu/courses/fall10/G22.2945-001/index.html>, and I
ecessary anyway if we plan to reprocess the existing logs that don't
follow the suggested convention. (I'm assuming we don't want to
simply throw out three weeks of logs.)
-Robert Rohde
gnore this
special case. Or is there a reason that doing so is not possible?
-Robert Rohde
On Tue, Oct 19, 2010 at 1:15 PM, Rob Lanphier wrote:
> Hi all,
>
> In diving into a problem with logging[1], we discovered that we were
> unintentionally treating several special page
variety of ways that
this information is used and managed.
-Robert Rohde
ymmetric
cipher is provably immune to all known quantum computing attacks:
http://www.technologyreview.com/blog/arxiv/25629/
-Robert Rohde
z]. Even if password strength testing
algorithms were disabled on Wikipedia sites, it would still be a nice
addition to have in the Mediawiki codebase in general.
-Robert Rohde
h the traffic logs, then I apologize for my confusion.
-Robert Rohde
[1] http://stats.grok.se/
[2] http://dammit.lt/wikistats/
On Mon, Aug 9, 2010 at 10:16 PM, Rob Lanphier wrote:
> Hi everyone,
>
> We're in the process of figuring out how we fix some of the issues in
> our
I'd rather we did move the 5500 Wikimedia-only images to
Meta. Making Commons 100% free content would be clearer to reusers
and make more general sense than having a site that is 99.92% free
content. Of course, such a move would take considerable effort for
relatively small gain, and
a in
taking a hard line with third-party software contributors over the
licensing of their extensions.
[1] http://en.wikipedia.org/wiki/GPL
-Robert Rohde
g
resource limits (e.g. memory or storage space). The newer models tend
to avoid it by devoting more resources to browsing.
I would guess that the Javascript associated with Vector is using more
memory than the Monobook version did and this is causing an error for
people that used to ha
etting the dump
system fully working is a great milestone.
-Robert Rohde
characteristic like UA strings to try and find problems. (Not
that IP monitoring alone is sufficient either.)
-Robert Rohde
n this kind of response really
starts to send the wrong message.
-Robert Rohde
llow people to store "private" content on Wikipedia. Of course both
issues could be dealt with if we just said unsaved drafts expired
after a month or something.
-Robert Rohde
d message boxes using CSS. I've played around with code for
wiki sites I manage to add that sort of functionality (and other
related options) as toggle buttons on the sidebar. I don't know if
putting it in the sidebar for everyone would be okay for enwiki, etc.,
but I
PI for grabbing files and that seems like a good idea. Ultimately
the best answer may well be to take multiple approaches to accommodate
both people like you who want everything as well as people that want
only more modest collections.
-Robert Rohde
rying to make is that if we think about what
people really want, and how the files are likely to be used, then
there may be better delivery approaches than trying to create huge
image dumps.
-Robert Rohde
t more languages
> of Wikipedia have that gadget installed?
Local admins control the installation of gadgets. On Enwiki the process is at:
http://en.wikipedia.org/wiki/Wikipedia:Gadget
-Robert Rohde
uld try to figure out what
kinds of subsets and the best way to handle them.
-Robert Rohde
On Fri, Jan 8, 2010 at 8:24 AM, Gregory Maxwell wrote:
> s/terabyte/several terabytes/ My copy is not up to date, but it's not
> smaller than 4.
The current (topmost) versions of Commons files total about 4.9 TB;
files on enwiki but not on Commons add another 200 GB or so.
-R
uments for why closed source
would be necessary in this case.
-Robert Rohde
rce.
However, my recollection is based on discussions years ago. On
searching, I couldn't find any policy forbidding closed source
software (is there one?). So, it is possible that closed source might
be looked on as a more acceptable possib
low of the rest of the parser
operations, there no longer is a point where the parser has done only
strip tag removal, and hence there is no point for a ParserAfterStrip
to attach itself to.
-Robert Rohde
Include only the current
revision" box at Special:Export if you want to get additional
revisions from the online form.
-Robert Rohde
what the criteria actually are, but I recall encountering
a dump entry where the editor's name had been suppressed (missing in
the revision) but where the revision text itself was present. (I had
an analysis script choke on this, since up to that time I had assumed
every revision would have valid
ge to
check for bugs and regressions, yes? Is that the gist of how Selenium is
designed to operate?
-Robert Rohde
On Fri, Oct 23, 2009 at 7:04 AM, Roan Kattouw wrote:
> 2009/10/23 Robert Rohde :
>> Given the fairly obvious utility for data mining, it might make sense
>> for someone to extend the Mediawiki API to generate a list of template
>> calls and the parameters sent in eac
Given the fairly obvious utility for data mining, it might make sense
for someone to extend the Mediawiki API to generate a list of template
calls and the parameters sent in each case.
-Robert Rohde
dvertised technical forum to
be set up somewhere (mediawiki.org, meta) to discuss feature requests
from the community and help communicate community desires and
priorities to the developer community. Bugzilla sort of serves that
purpose now, but I know that many people are intimidated by that
forma
to regard many existing
extensions in SVN as contribs that have never been studied in detail.
-Robert Rohde
ces in rendered page
appearance, could catch code with poor performance or unexpected
behavior earlier and without requiring a manual review by the experts.
Obviously expert review will always be part of the process, but a lot
of things can be done to make their lives easier. And that in turn
#x27;s
> a problem.) Similarly, try something like this:
>
> http://en.wikipedia.org/&;
>
> I assume this kind of thing is what causes those responses.
Actually wget isn't blocked for either pageviews or action=edit based
on a test a minute ago.
> On Sun, Oct 11, 2009 a
ome clear quickly.
> D (minor)
> Are TCP/000 indeed (invalid) UDP messages ?
No idea.
-Robert Rohde
s this happen? Is it something with the version
> of PHP?
PHP 4 and PHP 5 handle objects as function parameters differently.
Mediawiki now considers PHP 5.0 or later as a prerequisite (5.1 or
later recommended). The warning you quote is a plausible consequence
of using PHP 4 under certain
browser session would sometimes become corrupted and show this
behavior. In those cases it would be fixed by restarting the browser.
-Robert Rohde
n
This is called whenever an internal / interwiki link is generated and
allows one to modify the text / destination, apply CSS styles, and/or
replace the link with something else entirely.
If you are munging external links (rather than internal / int
es.
Unless expressly forbidden by the software implementation, I would
assume that "helpful" Wikipedians would very quickly write template
description templates to replace the XML and give it a "nice" wiki
interface. So, in practice, the question of whether XML is hard
Are the XML specifications intended as:
A) A required addition to current and future templates
OR
B) An optional addition to aid / facilitate the functioning of some
advanced tools
The latter case seems far more achievable than the former case.
-Robert Rohde
On Fri, Sep 25, 2009 at 12:49 PM
to www.wiki.wikipedia.org/wiki/urban_heat_island.htm - did
> this www.wiki.wikipedia.org server really exist?
Issues like this do suggest that it would be nice if the noarticletext
message had a way of making "did you mean" style guesses for common
problems like this.
-Robert Rohde
nswer Aryeh, yes, I paid attention to handling vandalism
reversions, and yes anons were tracked as if they were users.
I went into it expecting a result like that described in the blog
post, and came out with the opposite conclusion.
-Robert Rohde
PS. A full write-up of this analysis has been on m
edited by hand,
and rather opaque for people who have no experience with it.
-Robert Rohde
On Sun, Sep 20, 2009 at 8:14 PM, Tim Starling wrote:
> Robert Rohde wrote:
>> I am looking at bug 1310, which involves parser behavior such that
>> when given nested tag extensions, i.e.:
>>
>>
>> AAA
>> BBB
>> CCC
>>
>>
>> The par
On Sun, Sep 20, 2009 at 10:04 AM, David Gerard wrote:
> 2009/9/20 Robert Rohde :
>
>> However, since this is parser behavior going back to the dawn of time
>> (first reported in MW 1.4), I wanted to ask if there are known use
>> cases where the current behavior is actual
good look. For
the record, my particular interest is related to nested refs.
-Robert Rohde
te easy.
For English this is obviously true, but Erik writes scripts intended
to be language agnostic and work with all WMF projects. While it is
certainly possible to teach it about namespaces in the general sense,
it would take rather a bit of effort to ca
On Thu, Sep 17, 2009 at 8:58 PM, Brian wrote:
> On Thu, Sep 17, 2009 at 9:55 PM, Robert Rohde wrote:
>
>> On Thu, Sep 17, 2009 at 8:25 PM, Steve Bennett
>> wrote:
>> > On Fri, Sep 18, 2009 at 12:20 PM, Robert Rohde
>> wrote:
>> >> That particular re
On Thu, Sep 17, 2009 at 8:27 PM, Steve Bennett wrote:
> On Fri, Sep 18, 2009 at 12:41 PM, Robert Rohde wrote:
>> In practice it is very rare to have a ref be placed inside the content
>> of another ref, so the problem of nested refs will almost never come
>> up, but it is
On Thu, Sep 17, 2009 at 8:25 PM, Steve Bennett wrote:
> On Fri, Sep 18, 2009 at 12:20 PM, Robert Rohde wrote:
>> That particular result is unpublished. I could make you a list of
>> infrequently viewed articles, but it would be quite long.
>
> Could you make a list of the
e right answer is
probably to build in generic support for nested references but that
would require significant changes to Cite's data stack and probably a
couple modifications and / or new hooks in the parser itself. If I
reach a point of having more free time, this is an issue I've been
On Thu, Sep 17, 2009 at 6:24 PM, Steve Bennett wrote:
> 2009/9/18 Robert Rohde :
>> Careful, a recent analysis I did suggested that 15% of all page
>> requests for articles on Wikipedia are for topics requested less than
>> once per hour. There are a very large number of
for articles on Wikipedia are for topics requested less than
once per hour. There are a very large number of pages that rarely see
hits, but collectively the traffic to such topics is important. You
could end up biasing certain kinds of analysis if you always exclude
the rar
ever.
> The Vector skin is in MW core, and will be part of the 1.16 release.
Is there a road map somewhere for features you plan to include but
haven't gotten to yet?
-Robert Rohde
is implemented using
> PHP, but can easily ported to C using YACC. I hope it will be ready soon.
An interpreter for what language?
-Robert Rohde
the largest free content image repository,
though that's not the only way Flickr is used).
-Robert Rohde
(It will compress to 12 GB or so.)
Keep in mind that these estimates don't include any images, which
would eat up massive amounts of space if you include them.
-Robert Rohde
d. The magic word caching hints could also be used to help
decide how long the post-transformed version is likely to be good for.
More importantly, it avoids the pitfalls of trying to reintroduce
parser logic in transformMsg or some other preliminary step.
-Robert Rohde
ted on [[Special:WantedTemplates]] [1] and
> possibly the [[Mediawiki:Common.css]] and [[Mediawiki:Monobook]] from
> the en.wikipedia[2][3].
Speaking of which, the export interface could really benefit from an
option to include CSS and JS customizations during export.
-Robert Rohde
and deliberately refresh the
cache), but such changes are so rare that adding some lag on update
might be acceptable.
-Robert Rohde
form
of centralized search interface?
If someone is feeling really ambitious, one might even look at
replacing the pipermail archive with something more stable (links can
break when the index gets rebuilt) and easier to manage with respect
to things like removing private info. (There might even be worka
On Sun, Aug 2, 2009 at 9:38 AM, Andrew Garrett wrote:
>
> I can possibly poke this tomorrow, must have slipped through my
> fingers on Bug Friday :)
Well it was assigned to you and you unassigned it from yourself...
-Robert Rohde
On Mon, Jul 27, 2009 at 11:12 AM, Brion Vibber wrote:
> On 7/27/09 10:03 AM, Robert Rohde wrote:
>> Forgive me, but that seems like you'd be asking the community to do a
>> huge amount of work (moving images and updating [[File:]] calls) in
>> order to address a probl
On Mon, Jul 27, 2009 at 10:09 AM, Aryeh
Gregor wrote:
> On Mon, Jul 27, 2009 at 1:03 PM, Robert Rohde wrote:
>> Forgive me, but that seems like you'd be asking the community to do a
>> huge amount of work (moving images and updating [[File:]] calls) in
>> order to addre
ent nomenclature be preserved
but some addition system of naming, minus the confusing extensions, be
placed on top as the default.
-Robert Rohde
On Sat, Jul 18, 2009 at 6:55 AM, David Gerard wrote:
> 2009/7/18 Robert Rohde :
>> On Sat, Jul 18, 2009 at 6:20 AM, David Gerard wrote:
>
>>> It'd actually be better if Google properly indexed text pages whose
>>> name ends in .jpg or whatever ... but they'
t actually forbidden by our robots.txt?
>
> It'd actually be better if Google properly indexed text pages whose
> name ends in .jpg or whatever ... but they're aware we'd like that, so
> it's up to them.
Which is why my personal wiki is patched to translate the "
CentralNotice was disabled on all sites except Meta on the 11th as
part of the fight with ms1. Is there a likely timeline for when that
will be restored?
-Robert Rohde
rm what the behavior is with Firefox 3.5, Safari 3.1/4.0,
> and Opera 10 betas; unless they're very smart about only loading what
> they need, we'll probably need to devise an extension to import
> particular fonts for a given page.
According to what I've read, Firefo
emplate and why changing a single
template should have that large an effect?
-Robert Rohde
ach
> foreach/while/for loop. If it reaches 200 (or whatever), execution is
> stopped.
Really, the ideal solution is to say the user is allowed X number of
basic operations, Y amount of memory, and Z amount of execution time,
and write an interpreter that is agnostic about how those resources
ar
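A resource-agnostic limit of the sort described above can be illustrated by charging every evaluation step against an operation budget, independent of how the loops are written. A toy sketch (all names and the expression format are hypothetical):

```python
class BudgetExceeded(Exception):
    pass

class Interpreter:
    """Toy evaluator that charges each operation against a fixed budget."""
    def __init__(self, max_ops):
        self.ops_left = max_ops

    def charge(self):
        self.ops_left -= 1
        if self.ops_left < 0:
            raise BudgetExceeded("operation limit reached")

    def eval(self, expr):
        # Expressions are nested tuples like ("+", a, b), or plain numbers.
        self.charge()
        if isinstance(expr, (int, float)):
            return expr
        op, a, b = expr
        left, right = self.eval(a), self.eval(b)
        if op == "+":
            return left + right
        if op == "*":
            return left * right
        raise ValueError("unknown op: %r" % (op,))
```

The same charge() hook could decrement memory or wall-clock budgets; the caller never needs to know which construct consumed the resources.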
> var1=foo;
> var2=bar;
>
>
> or:
>
>
>
> which is a tad more verbose but more explicit.
Makes it awfully ugly to pass the result of one template to another
template if your syntax is:
" var2="bar"/>
-Robert Rohde
On Tue, Jun 30, 2009 at 4:01 PM, Brian wrote:
> On Tue, Jun 30, 2009 at 11:20 AM, Robert Rohde wrote:
>>
>> However,
>> given the nastiness of template syntax, I would expect no end of wiki
>> authors willing to help convert the commonly used stuff.
>>
>> -
call other
template pages. One of the few virtues of the current template code
is that it is relatively modular, with more complex templates being
built out of less complex ones. If this programming language is meant
to replace that then it would also need to be able to reference the
results of other templ