[Wikitech-l] Re: Page security extension disclaimer

2021-05-13 Thread j
This looks good.

I edited it a bit for grammar, and also revised two of the sentences to say 
the following.

> Neither the developers nor the Wikimedia Foundation are responsible for 
> anything being leaked. By using this extension, you agree to indemnify the 
> aforementioned parties and hold potential negligence by developers harmless.
> This message is added automatically to all extensions of this nature and may 
> not reflect the actual security status of this extension.

I am not a lawyer, but as far as I am aware, an explicit waiver of negligence 
is required to mount a defense against a negligence claim in U.S. courts. In 
fact, stating the word "negligence" explicitly will make any defense much 
stronger. Perhaps this should be checked with WMF Legal. The sentence was 
added to protect the developers of the extension.
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org


Re: [Wikitech-l] [Wikimedia-l] Wikidata now officially has more total edits than English language Wikipedia

2019-03-20 Thread Emilio J . Rodríguez-Posada
On Wed, 20 Mar 2019 at 07:48, Ariel Glenn WMF ()
wrote:

> Only 45 minutes later, the gap is already over 2000 revisions:
>
> [ariel@bigtrouble wikidata-huge]$ python3 ./compare_sizes.py
> Last enwiki revid is 888606979 and last wikidata revid is 888629401
> 2019-03-20 06:46:03: diff is 22422
>
>
This is escape velocity; I think Wikipedia will never surpass Wikidata again.

The singularity is near.
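For the curious, the kind of check Ariel's script performs can be sketched against the MediaWiki API. This is a hypothetical reconstruction, not the actual compare_sizes.py; the function names and the use of list=recentchanges are my own assumptions:

```python
import json
import urllib.request


def latest_revid(api_url):
    """Fetch the newest revision ID on a wiki via the MediaWiki
    recent-changes API (list=recentchanges, rcprop=ids)."""
    url = (api_url + "?action=query&list=recentchanges"
           "&rcprop=ids&rclimit=1&format=json")
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["query"]["recentchanges"][0]["revid"]


def revid_gap(enwiki_revid, wikidata_revid):
    """Difference between the two wikis' newest revision IDs --
    a rough proxy for the gap in total edit counts, since revision
    IDs are per-wiki monotonic counters."""
    return wikidata_revid - enwiki_revid


# With the numbers from Ariel's mail:
# revid_gap(888606979, 888629401) gives the quoted diff of 22422.
```

Usage would be `revid_gap(latest_revid("https://en.wikipedia.org/w/api.php"), latest_revid("https://www.wikidata.org/w/api.php"))`.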

Re: [Wikitech-l] Download a wiki?

2018-05-20 Thread Emilio J . Rodríguez-Posada
WikiTeam bat signal.

Dump delivered.

2018-05-19 6:52 GMT+02:00 Bart Humphries :

> Great, thanks!
>
> I have a convention this weekend, so it'll probably be Monday
> evening/Tuesday before I can really do anything else with that dump.
>
> On Fri, May 18, 2018, 3:32 PM Federico Leva (Nemo) 
> wrote:
>
> > You're in luck: just now I was looking for test cases for the new
> > version of dumpgenerator.py:
> > https://github.com/WikiTeam/wikiteam/issues/311
> >
> > I've made a couple of changes, and by tomorrow you should see a new XML
> > dump at https://archive.org/download/wiki-meritbadgeorg_wiki
> > (there's also https://archive.org/download/wiki-meritbadgeorg-20151017,
> > not mine )
> >
> > Federico
> >

Re: [Wikitech-l] Wikitech-l Digest, Vol 166, Issue 18

2017-05-10 Thread Adithyan J
I would like to unsubscribe.


On Wed, May 10, 2017 at 1:06 PM, 
wrote:

> Send Wikitech-l mailing list submissions to
> wikitech-l@lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> or, via email, send a message with subject or body 'help' to
> wikitech-l-requ...@lists.wikimedia.org
>
> You can reach the person managing the list at
> wikitech-l-ow...@lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikitech-l digest..."
>
>
> Today's Topics:
>
>1. logstash down on deployment-prep (Guillaume Lederrey)
>2. Re: logstash down on deployment-prep (Guillaume Lederrey)
>3. Re: Tech Talk: Wikimedia Foundation Technology and Product
>   Q&A Session #2 (Srishti Sethi)
>4. RFC discussion on Wednesday: Compacting the Revision  table
>   (Daniel Kinzler)
>5. Re: Tech Talk: Wikimedia Foundation Technology and Product
>   Q&A Session #2 (Srishti Sethi)
>6. Global language preference (Petr Bena)
>7. Re: Global language preference (Martin Urbanec)
>8. Re: Global language preference (Amir E. Aharoni)
>
>
> --
>
> Message: 1
> Date: Tue, 9 May 2017 16:49:12 +0200
> From: Guillaume Lederrey 
> To: Wikimedia developers 
> Subject: [Wikitech-l] logstash down on deployment-prep
> Content-Type: text/plain; charset=UTF-8
>
> Hello all!
>
> While aligning elasticsearch / logstash / kibana versions on
> deployment-logstash2, I am running into some issues. Logstash does not
> want to restart...
>
> This is tracked on phabricator [1], I'll let you know as soon as I
> know what is going on...
>
> [1] https://phabricator.wikimedia.org/T163709
>
> Have fun!
>
>   Guillaume
>
> --
> Guillaume Lederrey
> Operations Engineer, Discovery
> Wikimedia Foundation
> UTC+2 / CEST
>
>
>
> --
>
> Message: 2
> Date: Tue, 9 May 2017 17:26:58 +0200
> From: Guillaume Lederrey 
> To: Wikimedia developers 
> Subject: Re: [Wikitech-l] logstash down on deployment-prep
> Content-Type: text/plain; charset=UTF-8
>
> logstash is up and running again on deployment-prep. For those who are
> interested, the logstash plugins that we used were deployed correctly,
> present on disk, but not reloaded by logstash. The reloading part is
> triggered by puppet [1], but that code does not seem to be robust
> enough.
>
> Logstash was down and logs were not collected between ~12:40 and 15:15 UTC.
>
> My apologies for the inconvenience.
>
> [1] https://github.com/wikimedia/puppet/blob/production/modules/logstash/manifests/init.pp#L35-L42
>
> On Tue, May 9, 2017 at 4:49 PM, Guillaume Lederrey
>  wrote:
> > Hello all!
> >
> > While aligning elasticsearch / logstash / kibana versions on
> > deployment-logstash2, I am running into some issues. Logstash does not
> > want to restart...
> >
> > This is tracked on phabricator [1], I'll let you know as soon as I
> > know what is going on...
> >
> > [1] https://phabricator.wikimedia.org/T163709
> >
> > Have fun!
> >
> >   Guillaume
> >
> > --
> > Guillaume Lederrey
> > Operations Engineer, Discovery
> > Wikimedia Foundation
> > UTC+2 / CEST
>
>
>
> --
> Guillaume Lederrey
> Operations Engineer, Discovery
> Wikimedia Foundation
> UTC+2 / CEST
>
>
>
> --
>
> Message: 3
> Date: Tue, 9 May 2017 09:05:10 -0700
> From: Srishti Sethi 
> To: Wikimedia developers ,  Wikimedia
> Mailing List 
> Cc: Megan Neisler 
> Subject: Re: [Wikitech-l] Tech Talk: Wikimedia Foundation Technology
> and Product Q&A Session #2
> Content-Type: text/plain; charset=UTF-8
>
> REMINDER: This talk starts in 1 hour.
>
> On Mon, May 1, 2017 at 3:54 PM, Srishti Sethi 
> wrote:
>
> > Hello everyone,
> >
> > Please join us for the Wikimedia Foundation Technology and Product Q&A
> > Session #2 by Victoria Coleman (CTO) and Toby Negrin (Interim VP of
> > Product) on May 9, 2017, at 17:00 UTC via YouTube live.
> >
> > Link to live YouTube stream: https://www.youtube.com/watch?v=Q4kfgU9SZcg
> >
> > IRC channel for questions/discussion: #wikimedia-office
> >
> > More details:
> >
> > This talk is a follow-up to the Wikimedia Developer Summit session and will
> > address the next set of questions gathered via a voting survey for the
> > summit:
> >
> >
> >    - For WMF dev teams, what is the right balance between pushing their
> >      own work versus seeking and supporting volunteer contributors?
> >
> >    - Do we have a plan to bring our developer documentation to the level
> >      of a top Internet website, a major free software project?
> >
> >    - How can volunteers bring ideas and influence t

Re: [Wikitech-l] renaming Wikimedia domains

2015-08-26 Thread J. Zedlik
Thank you for raising the discussion!

On Wed, Aug 26, 2015 at 1:41 PM, This, that and the other <
at.li...@live.com.au> wrote:
> I also think many of the communities, especially if small, would view a
brief period of downtime as an acceptable tradeoff
> for having their domain name corrected. Especially "be-x-old", I've
always thought that one was pretty ugly, and wouldn't
> be surprised if the Taraškievica Belarusian Wikipedia community felt the
same way.

On behalf of the be-x-old community, I can assure you that such a rename is
in real demand and would find great support, even if it causes hours or days
of downtime. If a formal community discussion on this topic is required, it
can definitely be arranged.

By the way, back in 2007, be-x-old was already renamed once, from be to
be-x-old. As far as I remember, this caused periods of downtime and read-only
mode; however, eventually all the links, interwikis and templates were
corrected manually or semi-automatically, so it added some work but was not a
big problem back then.

Cheers,
zedlik


On Wed, Aug 26, 2015 at 1:41 PM, This, that and the other <
at.li...@live.com.au> wrote:

> Thanks for bringing this up, Amir.
>
> I would point out that since there are such a lot of wikis waiting to be
> renamed, there is an opportunity for economy of scale here. If all the
> departments/people you list were able to set aside a couple of days to sit
> down together and rename the 15+ wikis waiting to be renamed, having
> figured out a process and renamed a trial wiki beforehand, I think it could
> be made worthwhile.
>
> I also think many of the communities, especially if small, would view a
> brief period of downtime as an acceptable tradeoff for having their domain
> name corrected. Especially "be-x-old", I've always thought that one was
> pretty ugly, and wouldn't be surprised if the Taraškievica Belarusian
> Wikipedia community felt the same way.
>
> I agree that getting community engagement/community liaisons involved and
> talking with relevant developers/ops folks and the affected editing
> communities would be a good next step.
>
> TTO
>
> (Sorry for not replying inline: my news client is pretty dumb, as you can
> probably guess from the header line below.)
>
> --
> "Jaime Crespo"  wrote in message news:CABaSSrL0hm1T9sHd-qh1npK8J-OPUPttBr6_
> mudan3c5bwu...@mail.gmail.com...
>
>
> (this is not an official response, just my opinion after some research on
> the topic)
>
> Due to the internal (and growing) complexity of the MediaWiki software and
> of the WMF installation (with its numerous plugins and services/servers),
> this is a non-trivial task. It also involves many moving pieces and many
> people: network admins (DNS), general operations (load control/downtime),
> DBAs (import/export), services, deployment engineers, and developers
> (MediaWiki configuration changes, patches).
>
> What's worse is that it would almost certainly create downtime for the wikis
> involved (not being able to edit), especially given that it is not a common
> operation. Some of them are smaller communities, and I would be worried
> about annoying them or discouraging editing on those wikis (when we want the
> opposite!).
>
> It would be great to have someone in contact with the community so that we
> can identify which sites have a strong consensus about renaming the wiki,
> are fully informed about the potential problems, and are still OK to go
> forward. Maybe someone in Community Engagement can evaluate risks vs.
> return?
>
> On Wed, Aug 26, 2015 at 9:53 AM, Antoine Musso  wrote:
>
> Le 26/08/2015 07:20, Amir E. Aharoni a écrit :
>> > In the past when requests to rename such domains were raised, the usual
>> > replies were along the lines of "it's impossible" or "it's not worth the
>> > technical effort", but I don't know the details.
>> >
>> > Is this still correct in 2015?
>>
>> As pointed out: https://phabricator.wikimedia.org/T21986
>>
>> For what it is worth, in 2011 JeLuF wrote a list of actions needed to
>> rename a wiki.  It is outdated nowadays, but it is sufficient to show that
>> renaming a wiki is a non-trivial task:
>> https://wikitech.wikimedia.org/wiki/Rename_a_wiki
>>
>> It would surely consume a lot of engineering time to come up with a
>> proper migration plan and actually carry it out.  I am not sure it is
>> worth the time and money, unfortunately.
>>
>> --
>> Antoine "hashar" Musso
>>
>>
>
>
> --
> Jaime Crespo
> 

Re: [Wikitech-l] Wikitech-l Digest, Vol 142, Issue 9

2015-05-06 Thread Mr. Donald J. Fortier II
Why are empty digests being mailed and how do I make it so I only get one a day 
and only if not empty?

Sent from Yahoo Mail on Android

From:"wikitech-l-requ...@lists.wikimedia.org" 

Date:Wed, May 6, 2015 at 8:02
Subject:Wikitech-l Digest, Vol 142, Issue 9

Send Wikitech-l mailing list submissions to
    wikitech-l@lists.wikimedia.org

To subscribe or unsubscribe via the World Wide Web, visit
    https://lists.wikimedia.org/mailman/listinfo/wikitech-l
or, via email, send a message with subject or body 'help' to
    wikitech-l-requ...@lists.wikimedia.org

You can reach the person managing the list at
    wikitech-l-ow...@lists.wikimedia.org

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Wikitech-l digest..."


[Wikitech-l] CFP: 2nd Workshop on Linked Data Quality at ESWC #LDQ2015

2015-03-05 Thread Amrapali J Zaveri

LDQ 2015 CALL FOR PAPERS

2nd Workshop on Linked Data Quality
co-located with ESWC 2015, Portorož, Slovenia
June 1, 2015
http://ldq.semanticmultimedia.org/ 

News flash: invited talk by Prof. Dr. Felix Naumann on "Brave new data,
revisited":
http://ldq.semanticmultimedia.org/program/keynote_felix_naumann



*Important Dates*
 * Submission of research papers: March 16, 2015
 * Notification of paper acceptance: April 9, 2015
 * Submission of camera-ready papers: April 30, 2015

Since the start of the Linked Open Data (LOD) Cloud, we have seen an 
unprecedented volume of structured data published on the web, in most 
cases as RDF and Linked (Open) Data. The integration across this LOD 
Cloud, however, is hampered by the ‘publish first, refine later’ 
philosophy. This is due to various quality problems existing in the 
published data such as incompleteness, inconsistency, 
incomprehensibility, etc. These problems affect every application 
domain, be it scientific (e.g., life science, environment), 
governmental, or industrial applications.


We see linked datasets originating from crowdsourced content like 
Wikipedia and OpenStreetMap such as DBpedia and LinkedGeoData and also 
from highly curated sources, e.g. from the library domain. Quality is defined 
as “fitness for use”; thus DBpedia may currently be appropriate for a simple 
end-user application but could never be used in the medical domain for 
treatment decisions. Quality is key to the success of the data web, and poor 
quality is a major barrier to further industry adoption.


Despite the quality in Linked Data being an essential concept, few 
efforts are currently available to standardize how data quality tracking 
and assurance should be implemented. Particularly in Linked Data, 
ensuring data quality is a challenge as it involves a set of 
autonomously evolving data sources. Additionally, detecting the quality 
of datasets available and making the information explicit is yet another 
challenge. This includes the (semi-)automatic identification of 
problems. Moreover, none of the current approaches uses the assessment 
to ultimately improve the quality of the underlying dataset.


The goal of the Workshop on Linked Data Quality is to raise the 
awareness of quality issues in Linked Data and to promote approaches to 
assess, monitor, maintain and improve Linked Data quality.


The workshop topics include, but are not limited to:
 * Concepts
   - Quality modeling vocabularies
 * Quality assessment
   - Methodologies
   - Frameworks for quality testing and evaluation
   - Inconsistency detection
   - Tools/Data validators
 * Quality improvement
   - Refinement techniques for Linked Datasets
   - Linked Data cleansing
   - Error correction
   - Tools
 * Quality of ontologies
 * Reputation and trustworthiness of web resources
 * Best practices for Linked Data management
 * User experience, empirical studies

*Submission guidelines*
We seek novel technical research papers in the context of Linked Data 
Quality with a length of up to 8 pages (long) and 4 pages (short) 
papers. Papers should be submitted in PDF format. Other supplementary 
formats (e.g. html) are also accepted but a pdf version is required. 
Paper submissions must be formatted in the style of the Springer 
Publications format for Lecture Notes in Computer Science (LNCS). Please 
submit your paper via EasyChair at 
https://easychair.org/conferences/?conf=ldq2015. Submissions that do not 
comply with the formatting of LNCS or that exceed the page limit will be 
rejected without review. We note that the author list does not 
need to be anonymized, as we do not have a double-blind review process 
in place. Submissions will be peer reviewed by three independent 
reviewers. Accepted papers have to be presented at the workshop.


*Organizing Committee*
 * Anisa Rula – University of Milano-Bicocca, IT
 * Amrapali Zaveri – AKSW, University of Leipzig, DE
 * Magnus Knuth – Hasso Plattner Institute, University of Potsdam, DE
 * Dimitris Kontokostas – AKSW, University of Leipzig, DE

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-22 Thread Emilio J . Rodríguez-Posada
2015-02-21 16:21 GMT+01:00 MZMcBride :

> Emilio J. Rodríguez-Posada wrote:
> >It seems so. In my case, years ago I created a lot of redirects to my
> >English userpage on many Wikipedia language editions, and now I have to
> >request deletion of all of them. Not very useful.
>
> "Not very useful" is a slightly rude comment to make, in my opinion. You
> specifically and intentionally created local user pages on various
> Wikipedias. I imagine you and others would be rightfully upset if
> someone came along and simply overwrote your local user pages with a
> global user page without your knowledge or consent.
>
> >Can we get a special bot task in meta to request userpage deletion in
> >batches?
>
> There's discussion on Meta-Wiki about Synchbot deleting local user pages
> on a per-user, opt-in basis. I'm personally of the view that users seeking
> to un-spam the dozens or hundreds of wikis where they have created a local
> user page and done nothing more ought to clean up the "mess" themselves.
>
>
I edited, added images, and managed bots (generating edit rankings and other
things) on many Wikipedias. I didn't spam anything; I just had to create the
redirect userpages years ago because MediaWiki didn't offer anything better.

If you don't know about the case, shut up.


> Instead of deletion, blanking the user page might be a neat way of
> triggering the global user page to re-appear (a version of pure wiki
> deletion). Though, of course, some users might want a 0-byte user page.
>
> MZMcBride
>
>
>

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-21 Thread Emilio J . Rodríguez-Posada
2015-02-21 8:45 GMT+01:00 Pine W :

> Is it necessary to request deletion of a local user page in order to get
> the global page to be automatically transcluded?
>
> Pine
>
>
It seems so. In my case, years ago I created a lot of redirects to my English
userpage on many Wikipedia language editions, and now I have to request
deletion of all of them. Not very useful.

Can we get a special bot task in meta to request userpage deletion in
batches?


> *This is an Encyclopedia* 
>
>
>
>
>
>
> *One gateway to the wide garden of knowledge, where lies The deep rock of
> our past, in which we must delve The well of our future,The clear water we
> must leave untainted for those who come after us,The fertile earth, in
> which truth may grow in bright places, tended by many hands,And the broad
> fall of sunshine, warming our first steps toward knowing how much we do not
> know.*
>
> *—Catherine Munro*
>
> On Fri, Feb 20, 2015 at 11:17 PM, MZMcBride  wrote:
>
> > Hi.
> >
> > Erwin Dokter wrote:
> > >On 20-02-2015 18:22, Dan Garry wrote:
> > >> The feature is currently deployed and working. Simply set up your
> > >>userpage
> > >> on Meta, and it'll display on all other wikis! :-)
> > >
> > >After having played with it a bit, I must conclude there is one major
> > >shortcoming.
> > >
> > >I like to list my subpages locally, but that is not possible with a
> > >global page.
> >
> > I think what you're saying here is that if your global user page contains
> > "{{Special:PrefixIndex/User:Example}}", this transclusion will expand in
> > the context of the global wiki, not in the context of the local wiki.
> >
> > >The most annoying thing is that once you create the local
> > >user page, the global one is gone forever... until you can get a local
> > >admin to delete the local copy again.
> > >
> > >It would be much more practical if this worked like Commons description
> > >pages, where one can *add* content to the local description pages in
> > >addition to the trancluded page.
> >
> > Gone forever seems a bit hyperbolic. :-)  The use-case being solved here
> > most directly is "I don't want to create my user page or a pointer to my
> > user page on over 800 wikis." I think the append model is interesting to
> > consider, but I think it would likely need to be opt-in, perhaps via
> > interwiki transclusion.
> >
> > >I also don't know why the system is so inflexible in that only one wiki
> > >can act as the global home wiki. I know there are issues with the home
> > >wiki flag, but another approach could be in the form of using
> > >{{meta:user:Edokter}}, which could point to any project.
> >
> > Right, you're basically suggesting interwiki transclusion here. This is
> > definitely a hard problem to solve, for the context reason alone.
> >
> > In discussing global user pages, someone privately criticized the
> > implementation with basically the same theme of what you're saying here.
> > Namely, that global user pages are only solving a narrow use-case and
> that
> > the more generalized problem of easy content distribution/re-use still
> > needs to be addressed. I definitely agree, but here's why I pushed this
> > project forward and why I'm happy with where we're headed:
> >
> > 1. Perfect is the enemy of the done. We have global user pages today. If
> a
> >better approach for global user pages comes along in the future, we
> can
> >switch to using that instead, for sure.
> >
> > 2. We're working on a more generalized solution:
> >    <https://www.mediawiki.org/wiki/Requests_for_comment/Shadow_namespaces>.
> >    Nemo also pointed me toward  which may interest you.
> >
> > Please share your thoughts and feedback on the wiki or in Phabricator or
> > on this mailing list. I think there's consensus that we have a pattern of
> > a problem that we want to solve and any help poking and prodding at ideas
> > for solutions to this problem would be most welcome.
> >
> > MZMcBride
> >
> >
> >

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-20 Thread Emilio J . Rodríguez-Posada
Hello, thanks for this, it is a good feature.

Does it work for Wiktionary, Wikisource, etc.? Does it show the same userpage
on every sister project? In that case, would a trick to show different
content using a switch on the {{SITENAME}} magic word work?

2015-02-19 2:06 GMT+01:00 Legoktm :

> Hello!
>
> Global user pages have now been deployed to all public wikis for users
> with CentralAuth accounts. Documentation on the feature is available at
> mediawiki.org[1], and if you notice any bugs please file them in
> Phabricator[2].
>
> Thanks to all the people who helped with the creation and deployment
> (incomplete, and in no particular order): Jack Phoenix & ShoutWiki, Isarra,
> MZMcBride, Nemo, Quiddity, Aaron S, Matt F, James F, and everyone who
> helped with testing it while it was in beta.
>
> [1] https://www.mediawiki.org/wiki/Help:Extension:GlobalUserPage
> [2] https://phabricator.wikimedia.org/maniphest/task/create/?projects=PHID-PROJ-j536clyie42uptgjkft7
>
>

[Wikitech-l] Abandoned Labs tools

2015-02-12 Thread Mr. Donald J. Fortier II
Please comment on
https://meta.wikimedia.org/wiki/Requests_for_comment/Abandoned_Labs_tools,
which I created per https://phabricator.wikimedia.org/T87730, to create a
process for usurping or otherwise being added as a maintainer to existing
tools which appear to be abandoned. As Coren points out, the following three
questions need some consideration:
   - Under what criteria are tools considered "abandoned" and subject to
     being handed to someone else?
   - To whom are abandoned tools handed over?
   - What happens to credentials a tool may hold (for instance, bot
     accounts, OAuth secrets, etc.)?
I've also requested that a one-liner about this topic be added to the next
Tech News:
https://meta.wikimedia.org/w/index.php?title=User_talk:Guillaume_%28WMF%29&diff=11249286&oldid=10912536
 

[Wikitech-l] Upgrading to 1.23

2014-06-12 Thread Beebe, Mary J
We have several internal wikis that we maintain, and we write extensions for
these wikis. We are trying to convince management to upgrade our MediaWiki
version to 1.23.x. At the same time we will upgrade our PHP version from
5.2.8 to 5.4.x; we have kept PHP at 5.2 because of the old MediaWiki version.

We are currently on MediaWiki 1.15.3. As developers we know that we are way
overdue for an upgrade, but no one has ever wanted to pay for it.

Some of the obvious points are:

1.   Both the MediaWiki version and the PHP version are no longer supported.

2.   We do not have access to the most recent extensions.

3.   There is limited documentation for the old versions.

4.   General security vulnerabilities. (I would love to have any specifics
here.)

Does anyone have any other points I could add that would make management say
yes? We have been reading about performance boosts. Any specifics?

I would also take any links that may be helpful.

Thanks,

Mary


Re: [Wikitech-l] How to show the page AND section for CirrusSearch search results‏

2014-05-21 Thread J jollylittlebottom
Ok, there might be a problem when a searched term appears in a section header 
and in other parts of a page.

But we have another problem: we have long pages in our MediaWiki. If a user
searches for a term, they receive just a link to the page, where they then
have to search for the term again with the browser's find tool to locate the
right place. We want to make this easier by always presenting an additional
link to the section where the term is first found.

I implemented this by adding the complete HTML text of all pages to the
Elasticsearch index. As a search result I receive the highlighted text
together with the complete HTML text. After extracting the highlighted term,
I extract the section header for that term from the HTML text and set
sectionSnippet and sectionTitle in the class CirrusSearch\Result.
This works, but it is not really smart and not suitable for bigger wikis.
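The header-extraction step described above can be sketched roughly as follows. This is a hypothetical helper in Python rather than the extension's PHP, using a naive regex over the stored HTML; the function name is my own:

```python
import re


def section_for_term(html, term):
    """Return the text of the nearest heading (h1-h6) preceding the
    first occurrence of `term` in the page HTML, or None if the term
    is missing or appears before any heading."""
    hit = html.find(term)
    if hit == -1:
        return None
    best = None
    # Headings are scanned in document order; keep the last one that
    # starts before the position where the term was found.
    for m in re.finditer(r"<h([1-6])[^>]*>(.*?)</h\1>", html, re.S | re.I):
        if m.start() >= hit:
            break
        best = re.sub(r"<[^>]+>", "", m.group(2)).strip()
    return best
```

A production version would more likely walk a parsed DOM, and could also collect the whole chain of ancestor headings (h1, h2, ...) that the question below asks about.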

Is there a better way?
Is it possible to show all section links up the hierarchy (h1, h2, ...) too?

Many Thanks!
Henry  


  

[Wikitech-l] How to show the page AND section for CirrusSearch search results‏

2014-05-09 Thread J jollylittlebottom
If I search on http://www.mediawiki.org for "Search Weighting", I get as a
result the line "Search (section Search Weighting Ideas)", with links to the
page and to the section.

This section contains the word "GeoLoc".
But if I search for "GeoLoc" I get just the page link.

I want to show this section link as a search result too.

Is there an easy way or is it a planned feature?

What do I have to change in the CirrusSearch extension?

Many Thanks!
Henry 

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2014-01-13 Thread Emilio J . Rodríguez-Posada
Just create a page editable by everybody (in the way user talk pages are
editable by blocked users):

* [[Wikipedia:Edit suggestions by TOR users]]

Redirect to it, with a notice, when a TOR user clicks an edit tab. Later, any
user can add the suggestions to the articles if they are OK.

Anyway, TOR doesn't seem 100% secure, so perhaps a notice that users can
still be tracked would be nice.

2014/1/13 Gryllida 

> On Tue, 14 Jan 2014, at 3:32, Zack Weinberg wrote:
> > On Sun, Jan 12, 2014 at 11:46 PM, Gryllida  wrote:
> > > On Mon, 13 Jan 2014, at 15:29, Gregory Maxwell wrote:
> > >> What freenode does is not functionally useful for Tor users. In my
> > >> first hand experience it manages to enable abusive activity while
> > >> simultaneously eliminating Tor's usefulness at protecting its users.
> > >
> > > The "register at real IP, then only use TOR through an account" flow
> > > implies trust in some entity (such as freenode irc network opers or
> > > Wikipedia CheckUsers). I currently believe that requiring such trust
> > > doesn't "eliminate TOR's usefullness at protecting its users".
> >
> > I rather think it does.  Assume a person under continual surveillance.
> >  If they have to reveal their true IP address to Wikipedia in order to
> > register their editor account, the adversary will learn it as well,
> > and can then attribute all subsequent edits by that handle to that
> > person *whether or not* those edits are routed over an anonymity
> > network.
>
> Doesn't it get solved if, despite the "surveillance", the trust entity
> ("freenode opers" or "wikipedia checkusers") reveals the user's IP only
> under a court order?
>
> >
> > To satisfy Applebaum's request, there needs to be a mechanism whereby
> > someone can edit even if *all of their communications with Wikipedia,
> > including the initial contact* are coming over Tor or equivalent.
>
> Rubbish. This makes a vandal inherently untrackable and unblockable.
>

Re: [Wikitech-l] [wikiteam-discuss:699] "Tarballs" of all 2004-2012 Commons files now available at archive.org

2013-10-13 Thread Emilio J . Rodríguez-Posada
Nice work Nemo!


2013/10/13 Federico Leva (Nemo) 

> WikiTeam has just finished archiving all Wikimedia Commons files up to
> 2012 (and some more) on the Internet Archive:
> https://archive.org/details/wikimediacommons
> So far it's about 24 TB of archives and there are also a hundred torrents
> you can help seed, ranging from a few hundred MB to over a TB, most around
> 400 GB.
> Everything is documented at
> https://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Media_tarballs
> and if you want, here are some ideas to help WikiTeam with coding:
> https://code.google.com/p/wikiteam/issues/list
>
> Nemo
>
> --
> You received this message because you are subscribed to the Google Groups
> "wikiteam-discuss" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to wikiteam-discuss+unsubscribe@googlegroups.com.
> For more options, visit https://groups.google.com/groups/opt_out.
>

Re: [Wikitech-l] Wikitech-l Digest, Vol 122, Issue 65

2013-09-26 Thread Mr. Donald J. Fortier II
A little additional information on topic 1 -- the pull request containing the
changes that this contributor made to the script can be viewed at
https://github.com/WPAFC/afch/pull/54/files if it is of any interest.  Thanks.




 From: "wikitech-l-requ...@lists.wikimedia.org" 

To: wikitech-l@lists.wikimedia.org 
Sent: Thursday, September 26, 2013 8:01 AM
Subject: Wikitech-l Digest, Vol 122, Issue 65
 

Send Wikitech-l mailing list submissions to
    wikitech-l@lists.wikimedia.org

To subscribe or unsubscribe via the World Wide Web, visit
    https://lists.wikimedia.org/mailman/listinfo/wikitech-l
or, via email, send a message with subject or body 'help' to
    wikitech-l-requ...@lists.wikimedia.org

You can reach the person managing the list at
    wikitech-l-ow...@lists.wikimedia.org

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Wikitech-l digest..."


Today's Topics:

   1. Fwd: afch licensing question (Steven Walling)
   2. Re: Killing 1.XXwmfYY branches -- another idea? (Antoine Musso)
   3. Re: MediaWiki-Vagrant: new features (Željko Filipin)
   4. Re: Fwd: afch licensing question (Denny Vrandečić)
   5. Job posting: frontend dev (JavaScript) at Wikimedia
      Deutschland (Lydia Pintscher)
   6. Re: MediaWiki-Vagrant: new features (Ori Livneh)


--

Message: 1
Date: Wed, 25 Sep 2013 20:36:43 -0700
From: Steven Walling 
To: Wikimedia developers 
Subject: [Wikitech-l] Fwd: afch licensing question
Message-ID:
    
Content-Type: text/plain; charset=UTF-8

Forwarding, with permission.

For background, the "AFC Helper" script is one that assists English
Wikipedians reviewing pages in the Articles for Creation queue, which
currently is severely backlogged.

Any thoughts on the licensing issue, from folks with experience on the
question of gadget/userscript licensing?

-- Forwarded message --
From: Mr. Donald J. Fortier II 
Date: Wed, Sep 25, 2013 at 8:30 PM
Subject: afch licensing question
To: "swall...@wikimedia.org" 


Per the discussion on https://github.com/WPAFC/afch/issues/61 one of the
previous contributors to the project refuses to agree to relicense AFCH
under the MIT license. Right now, the script is licensed under
CC-BY-SA/GFDL, as it was originally coded on-wiki (per
[[Wikipedia:Copyright]] -- all text-based contributions).  This came about
due to some confusion that can be seen at
https://github.com/WPAFC/afch/issues/60.  So, we are unsure where to
go from here.  If we replace any code contributed to the project by the
person that refuses to agree, can we dissolve any requirements to get him
to agree?  Is there enough contribution from him to actually worry about it
as he hasn't actually written any functions, just converted some stuff from
old school JavaScript to jQuery?  Any advice/assistance on this would be
appreciated.



-- 
Steven Walling,
Product Manager
https://wikimediafoundation.org/


--

Message: 2
Date: Thu, 26 Sep 2013 10:09:30 +0200
From: Antoine Musso 
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Killing 1.XXwmfYY branches -- another idea?
Message-ID: 
Content-Type: text/plain; charset=UTF-8

Le 26/09/13 00:46, Chad a écrit :
>> >
>> > What's actually the problem with expanding branches?
>> >
> To me at least, it makes it harder to see what's actually deployed at
> a given time. Try `git branch -r` on core. We're at 86 branches now...that's
> 172 if you've got two remotes. It's only going to get worse and it'll be
> progressively harder to spot what's important.

We could simply delete the old branches after a few deployment cycles.
If we want to keep the reference for history purposes, would tagging the
tip of it be enough?

If you find yourself struggling to find out the last 2 wmf branches, you
can write a tiny script to sort and filter the branches, keeping only
the last two.

-- 
Antoine "hashar" Musso




--

Message: 3
Date: Thu, 26 Sep 2013 10:52:38 +0200
From: Željko Filipin 
To: Wikimedia developers 
Subject: Re: [Wikitech-l] MediaWiki-Vagrant: new features
Message-ID:
    
Content-Type: text/plain; charset=UTF-8

Hi Ori,

Is there a reason for having Guest Additions Version 4.1.12 in the VM
instead of 4.2?

$ vagrant up
(...)
[default] VM booted and ready for use!
[default] The guest additions on this VM do not match the installed version
of
VirtualBox! In most cases this is fine, but in rare cases it can
cause things such as shared folders to not work properly. If you see
shared folder errors, please update the guest additions within the
virtual machine and reload your VM.

Guest Additions Version: 4.1.12
VirtualBox Version: 4.2
[default] Setting hostname...
(...)

I will file a bug

Re: [Wikitech-l] GMail sending lots of WIkimedia mail to spam again

2013-08-06 Thread Emilio J . Rodríguez-Posada
Meanwhile you can fix the filters and add the option "do not send to spam
folder". I did it for all my wiki filters...


2013/8/6 Bináris 

> 2013/8/5 Risker 
>
> >
> > Really?  I've not once had that message.  As best I can tell, it is
> > affecting EVERY lists.wikimedia.org mailing list, with the possible
> > exception of the checkuser mailing list.
>
>
> CU-list is also deeply involved. :-(

Re: [Wikitech-l] GMail sending lots of WIkimedia mail to spam again

2013-08-05 Thread Emilio J . Rodríguez-Posada
What is the explanation for this? My spam folder is full of emails from
wiki mailing lists too.

Perhaps many users don't know how to unsubscribe, so they mark the messages
as spam, and Google's filter has learned from that?


2013/8/5 Mathieu Stumpf 

> Le lundi 05 août 2013 à 23:01 +0530, Yuvi Panda a écrit :
> > All emails to labs-l always end up in spam for me (I've a special rule
> > that picks them out of spam, and GMail still warns me).
> >
> > /end-data-point
> >
> >
> Bad mail provider, change mail provider. ;)
>

Re: [Wikitech-l] How's the SSL thing going?

2013-07-31 Thread Emilio J . Rodríguez-Posada
It was so obvious that intelligence agencies were doing that. It was discussed
in past threads on this mailing list too.

Also, I have read that SSL is not secure either. So, bleh...


2013/7/31 David Gerard 

> Jimmy just tweeted this:
>
> https://twitter.com/jimmy_wales/status/362626509648834560
>
> I think that's the first time I've seen him say "fuck" in a public
> communication ...
>
> Anyway, I expect people will ask us how the move to all-SSL is
> progressing. So, how is it going?
>
> (I've been telling people it's slowly moving along, we totally want
> this, it's just technical resources. But more details would be most
> useful!)
>
>
> - d.
>

Re: [Wikitech-l] Listing missing words of wiktionnaries

2013-07-23 Thread Emilio J . Rodríguez-Posada
http://storage.googleapis.com/books/ngrams/books/datasetsv2.html


2013/7/23 Bináris 

> Once you have a "list of words which are used on the web" (this must be
> obtained from an outside source; it cannot be built from within Wiktionary),
> the easiest way is to run a bot, e.g. Pywikipedia.
>
> 2013/7/23 Mathieu Stumpf 
>
> > Hello,
> >
> > Here is what I would like to do: generate reports which give, for a
> > given language, a list of words which are used on the web, with a number
> > estimating their occurrences, but which are not in a given wiktionary.
> >
> > How would you recommend implementing that within the Wikimedia
> > infrastructure?
> >
> > --
> > Association Culture-Libre
> > http://www.culture-libre.org/
> >
>
>
>
>
> --
> Bináris

Re: [Wikitech-l] Adding to the skin toolbox

2013-07-12 Thread Beebe, Mary J
I am using the BaseTemplateToolbox hook.  If I print_r($toolbox) at the end of my
hook, this is what gets added to the toolbox:

[proplinkhere] => Array
(
[text] => Prop Links Here
[href] => 
/acc_arswiki/index.php?title=Special%3AProplinkhere/Amikacin
[id] => t-proplinkhere
)

Which is what I added.  But it does not show up in the toolbox.  Do I have to 
add "t-proplinkhere" anywhere?  
 
Do I have to do anything with BaseTemplate::makeListItem?
 
I did add this to the i18n message file:

'accesskey-t-proplinkhere' => '',
'tooltip-t-proplinkhere'   => 'List all pages that have properties that 
link to this page. ',   

Thanks,

Mary

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Daniel Friesen
Sent: Thursday, July 11, 2013 5:53 PM
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Adding to the skin toolbox

BaseTemplateToolbox is the cleanest hook to use. There's also 
SkinTemplateToolboxEnd which is hideous but existed before BaseTemplate.  
So ideally use BaseTemplateToolbox, unless you happen to have an ancient 
pre-BaseTemplate skin that's still hardcoding the toolbox.

https://www.mediawiki.org/wiki/Manual:Hooks/BaseTemplateToolbox
https://www.mediawiki.org/wiki/Manual:Hooks/SkinTemplateToolboxEnd
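For reference, a minimal handler along those lines might look like the sketch below. This is not code from the thread: it assumes MediaWiki ~1.18, and the special page name ('Proplinkhere'), link text, and function name are illustrative, modelled on the array quoted above.

```php
<?php
// Hedged sketch of a BaseTemplateToolbox handler (assumed MW ~1.18).
// 'Proplinkhere', the link text, and the function name are illustrative.
$wgHooks['BaseTemplateToolbox'][] = 'efProplinkhereToolbox';

function efProplinkhereToolbox( $baseTemplate, array &$toolbox ) {
	global $wgTitle; // old-school, but available in this hook's context

	$special = SpecialPage::getTitleFor( 'Proplinkhere', $wgTitle->getPrefixedText() );
	$toolbox['proplinkhere'] = array(
		'text' => 'Prop Links Here',
		'href' => $special->getLocalURL(),
		'id'   => 't-proplinkhere',
	);
	return true; // allow other toolbox hooks to run
}
```

In Vector, toolbox entries added this way should be rendered through BaseTemplate::makeListItem for you, so the array alone ought to be enough; no separate makeListItem call should be needed.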

On Thu, 11 Jul 2013 14:39:42 -0700, Beebe, Mary J 
wrote:

> I am using the Vector skin.  I have an extension very similar to 
> whatlinkshere.  It needs to take the current page as the parameter 
> just like whatlinkshere.
>
> I would like to add it to the toolbox, but actually I do not care 
> where it is in the navigation but the toolbox seems the most logical.
>
> We have done this before with other older skins that had something 
> like this section:
>  
> I noticed that getToolBox() is within BaseTemplate.  Is there a way to 
> add this method to the tool box?
>
> Thanks,
>
> Mary Beebe
> Battelle - Charlottesville, VA
> Office: 434- 951-2149
>
> **Confidentiality Notice**
> This message is intended only for the use of the individual or entity 
> to which it is addressed, and may contain information that is 
> privileged, confidential and/or otherwise exempt from disclosure under 
> applicable law. If the reader of this message is not the intended 
> recipient or the employee or agent responsible for delivering the 
> message to the intended recipient, any disclosure, dissemination, 
> distribution, copying or other use of this communication or its 
> substance is prohibited. If you have received this communication in 
> error, please return to the sender and delete from your computer system.
>


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] Adding to the skin toolbox

2013-07-11 Thread Beebe, Mary J
That makes sense.  I was trying to do it the hard way.  

Thanks Daniel,

Mary

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Daniel Friesen
Sent: Thursday, July 11, 2013 5:53 PM
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Adding to the skin toolbox

BaseTemplateToolbox is the cleanest hook to use. There's also 
SkinTemplateToolboxEnd which is hideous but existed before BaseTemplate.  
So ideally use BaseTemplateToolbox, unless you happen to have an ancient 
pre-BaseTemplate skin that's still hardcoding the toolbox.

https://www.mediawiki.org/wiki/Manual:Hooks/BaseTemplateToolbox
https://www.mediawiki.org/wiki/Manual:Hooks/SkinTemplateToolboxEnd

On Thu, 11 Jul 2013 14:39:42 -0700, Beebe, Mary J 
wrote:

> I am using the Vector skin.  I have an extension very similar to 
> whatlinkshere.  It needs to take the current page as the parameter 
> just like whatlinkshere.
>
> I would like to add it to the toolbox, but actually I do not care 
> where it is in the navigation but the toolbox seems the most logical.
>
> We have done this before with other older skins that had something 
> like this section:
>  
> I noticed that getToolBox() is within BaseTemplate.  Is there a way to 
> add this method to the tool box?
>
> Thanks,
>
> Mary Beebe
> Battelle - Charlottesville, VA
> Office: 434- 951-2149
>
> **Confidentiality Notice**
> This message is intended only for the use of the individual or entity 
> to which it is addressed, and may contain information that is 
> privileged, confidential and/or otherwise exempt from disclosure under 
> applicable law. If the reader of this message is not the intended 
> recipient or the employee or agent responsible for delivering the 
> message to the intended recipient, any disclosure, dissemination, 
> distribution, copying or other use of this communication or its 
> substance is prohibited. If you have received this communication in 
> error, please return to the sender and delete from your computer system.
>


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



[Wikitech-l] Adding to the skin toolbox

2013-07-11 Thread Beebe, Mary J
I am using the Vector skin.  I have an extension very similar to whatlinkshere. 
 It needs to take the current page as the parameter just like whatlinkshere.

I would like to add it to the toolbox, but actually I do not care where it is 
in the navigation but the toolbox seems the most logical.

We have done this before with other older skins that had something like this 
section:
 
I noticed that getToolBox() is within BaseTemplate.  Is there a way to add this 
method to the tool box?

Thanks,

Mary Beebe
Battelle - Charlottesville, VA
Office: 434- 951-2149

**Confidentiality Notice**
This message is intended only for the use of the individual or entity to which 
it is addressed, and may contain information that is privileged, confidential 
and/or otherwise exempt from disclosure under applicable law. If the reader of 
this message is not the intended recipient or the employee or agent responsible 
for delivering the message to the intended recipient, any disclosure, 
dissemination, distribution, copying or other use of this communication or its 
substance is prohibited. If you have received this communication in error, 
please return to the sender and delete from your computer system.


[Wikitech-l] Moving a page to a category page.

2013-07-11 Thread Beebe, Mary J
We would like to make pages into category pages.  If we try to move a page to a
category namespace, it will not let us.  Is there a configuration variable to
do that?

Thanks,
Mary



Re: [Wikitech-l] Switching database from mysql to Oracle

2013-05-09 Thread Beebe, Mary J
Thanks for your very quick response. 

What about MSSQL? 

Thanks,

Mary Beebe
Battelle - Charlottesville, VA
Office: 434- 951-2149


-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Freako F. 
Freakolowsky
Sent: Thursday, May 09, 2013 11:43 AM
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Switching database from mysql to Oracle

Well supported? ... i do try my best :D

I'm running a few farms on 1.17 and some standalones on 1.19 (i think?!), but 
i'm in the process of switching all of them to 1.21 once it's officially out 
and i manage to port some of our in-house extensions.
Oracle versions on all of them have just been upgraded to 11g R2
(11.2.0.3) SE,  but they used to run on most versions from 10g R2
(10.2.0.2) onward.

I'd suggest you go with the 1.20 or better yet wait for 1.21. You'll also have 
to up PHP to 5.3+.


LP, Jure


On 09. 05. 2013 17:30, Beebe, Mary J wrote:
> We have to switch our database from mysql to Oracle or msSql.  We notice that 
> there is a  DatabaseOracle.php in the db folder.  Is Oracle or MsSQL 
> supported well with mediaWiki?
>
> Below are our current versions:
>
> MediaWiki (http://www.mediawiki.org/): 1.15.3
> PHP (http://www.php.net/): 5.2.8 (cgi-fcgi)
> MySQL (http://www.mysql.com/): 5.5.19
>
>
> We know we are not current with our mediaWiki version.  From the little we 
> read the older versions may work better than the newer versions.  We are 
> willing to switch to whichever version of the software that works the best 
> with Oracle or MsSQL.
>
> Could you also point me to documentation on the databases?  At this time we 
> are not concerned about the 3rd party extensions that we have installed.  We 
> are more concerned about the core code.
>
> Thanks,
> Mary Beebe
> Battelle - Charlottesville, VA
> Office: 434- 951-2149
>
>



[Wikitech-l] Switching database from mysql to Oracle

2013-05-09 Thread Beebe, Mary J
We have to switch our database from MySQL to Oracle or MSSQL.  We notice that
there is a DatabaseOracle.php in the db folder.  Is Oracle or MSSQL supported
well with MediaWiki?

Below are our current versions:

MediaWiki: 1.15.3
PHP: 5.2.8 (cgi-fcgi)
MySQL: 5.5.19


We know we are not current with our MediaWiki version.  From the little we read,
the older versions may work better than the newer versions.  We are
willing to switch to whichever version of the software works best
with Oracle or MSSQL.

Could you also point me to documentation on the databases?  At this time we are 
not concerned about the 3rd party extensions that we have installed.  We are 
more concerned about the core code.

Thanks,
Mary Beebe
Battelle - Charlottesville, VA
Office: 434- 951-2149



Re: [Wikitech-l] Chunked uploading API documentation; help wanted

2012-04-18 Thread j
On 04/17/2012 10:37 PM, Brion Vibber wrote:
> I've started adding some documentation on chunked uploading via the API on
> mediawiki.org:
> 
> https://www.mediawiki.org/wiki/API:Upload#Chunked_uploading
looks good to me

> for instance -- is it
> possible to do chunked upload without using the stash? Or are they required
> together?
Chunked upload requires the stash: while the upload is happening, the chunks
are stored in the stash, and they should never show up on the page.



[Wikitech-l] Opening up an internal wiki - login maintenance without a mail server.

2012-02-10 Thread Beebe, Mary J
I have been working with internal wikis for a while.  We have several wikis
that we edit within our company and then give to the client.  The client
just searches the information and does not do any further edits.

We now have a wiki that we want to make external so I need to make sure I am 
addressing everything.

As far as user groups, I have it set so that everyone can read the wiki and all
logged-in users can edit within the wiki.  Only administrators can create a new
user account.  We have provided an email address for people to request
authoring privileges.  The server we are using for the wiki does not have a
mail server.  I am assuming that we would need a mail server on that machine for
the wiki to email passwords to new users.  Is that true, or can we set it to
use another mail server?
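For what it's worth, MediaWiki does not require a local mail server: it can relay through an external SMTP host configured in LocalSettings.php. A minimal sketch, where the host, domain, and credentials are all placeholders:

```php
<?php
// LocalSettings.php — hedged sketch: send wiki mail through an external
// SMTP server instead of a local MTA. All values are placeholders.
$wgEnableEmail      = true;
$wgEmergencyContact = "wiki-admin@example.org";
$wgPasswordSender   = "wiki-admin@example.org";

$wgSMTP = array(
	'host'     => 'mail.example.org', // your provider's SMTP server
	'IDHost'   => 'example.org',      // domain used in Message-ID headers
	'port'     => 25,
	'auth'     => true,
	'username' => 'wikimailer',
	'password' => 'secret',
);
```

With $wgSMTP unset, MediaWiki falls back to PHP's mail(), which is the part that needs a local mail server.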

Thanks,

Mary Beebe
Battelle Charlottesville, VA
434-951-2149
This message is intended only for the use of the individual or entity to which 
it is addressed, and may contain information that is privileged, confidential 
and/or otherwise exempt from disclosure under applicable law. If the reader of 
this message is not the intended recipient or the employee or agent responsible 
for delivering the message to the intended recipient, any disclosure, 
dissemination, distribution, copying or other use of this communication or its 
substance is prohibited. If you have received this communication in error, 
please return to the sender and delete from your computer system.



[Wikitech-l] Logging & messages

2012-02-10 Thread Beebe, Mary J
I was trying to use this old extension: 
http://www.mediawiki.org/wiki/Extension:UserLoginLog to log login attempts.  It 
used $wgMessageCache; therefore it worked fine until we went to a 1.18 wiki.  I 
tried to modernize it by creating a UserLoginLog.i18n file to keep the messages.



I have $wgExtensionMessagesFiles['userLoginLog'] = dirname( __FILE__ ) . 
"/UserLoginLog.i18n.php";  at setup.



A sample method is:

function wfUserLoginLogSuccess( &$user ) {
    wfLoadExtensionMessages( 'userLoginLog' );
    $log = new LogPage( 'userlogin', false );
    $log->addEntry( 'success', $user->getUserPage(), wfGetIP() );
    return true;
}



The statement 'wfLoadExtensionMessages( 'userLoginLog' );' does not seem to do
anything, because the log shows the raw message key in angle brackets instead of
the message text.

I am not sure how to pass in the log name message or how to make it global now
that $wgMessageCache is no longer available.  Is there a document on how to use
the logging feature in the wiki?
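For what it's worth: since 1.16, messages registered through $wgExtensionMessagesFiles are loaded on demand, so the wfLoadExtensionMessages() call should no longer be needed. What a custom log usually needs instead is an explicit log-type registration in the setup file. A hedged sketch follows; the message keys are illustrative and must exist in the extension's i18n file:

```php
<?php
// Setup file — hedged sketch of registering a custom 'userlogin' log
// type in MediaWiki ~1.18. The message keys are illustrative; each one
// must be defined in the i18n file set in $wgExtensionMessagesFiles.
$wgLogTypes[]              = 'userlogin';
$wgLogNames['userlogin']   = 'userloginlog-name';   // name of Special:Log/userlogin
$wgLogHeaders['userlogin'] = 'userloginlog-header'; // text shown above the log
$wgLogActions['userlogin/success'] = 'userloginlog-success-entry'; // per-entry text
```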

Thanks,

Mary Beebe



Re: [Wikitech-l] Northern Soto Wikipedia

2011-11-05 Thread J Alexandr Ledbury-Romanov
Firefox 7.01 on Windows 7 gives the same lines of reported errors. However,
using the same laptop but IE 9.0, the site works fine.

Alex



2011/11/5 Diederik van Liere 

> I am running Firefox 7.01 on OSX Leopard 10.6.8 and it gives a backtrace
> error.
> Diederik
>
> On Sat, Nov 5, 2011 at 10:15 AM, Ole Palnatoke Andersen <
> palnat...@gmail.com
> > wrote:
>
> > On Sat, Nov 5, 2011 at 2:56 PM, Amir E. Aharoni
> >  wrote:
> > > 2011/11/5 Andre Engels :
> > >> There seems to be a Northern Soto Wikipedia at
> http://nso.wikipedia.org,
> > at
> > >> least that's what http://incubator.wikimedia.org/wiki/Wp/nso claims.
> > >> However, when I go to that site I see the following text:
> > >>
> > >> Unstub loop detected on call of $wgLang->getCode from
> MessageCache::get
> > >>
> > >> Backtrace:
> > >> ...
> > >
> > > It works for me. Can you try again?
> > >
> >
> > Windows Vista:
> >
> > Chrome 15.0.874.106: Same experience as Andre.
> > Firefox 6.0.1, Opera 11.50, Safari 5.0.5, IE8: Same as Amir.
> >
> > - Ole
> >
> >
> > --
> > http://palnatoke.org * @palnatoke * +4522934588
> >
> >
>
>
>
> --
> http://about.me/diederik";>Check out my about.me profile!
>


Re: [Wikitech-l] Map not displaying

2011-06-15 Thread Beebe, Mary J
We found out that the JavaScript file was not being included.  I needed to
change $egMapsScriptPath because we did not put the extension right under the
extensions folder.  After that it worked fine.

Sorry, I should have checked for JavaScript errors first.
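For anyone hitting the same thing, here is a hedged sketch of what that ends up looking like in LocalSettings.php; the "custom/Maps" subdirectory is purely illustrative:

```php
<?php
// LocalSettings.php — hedged sketch: Maps installed somewhere other
// than the default extensions/Maps location, so its script path is set
// explicitly. "custom/Maps" is an illustrative location.
require_once( "$IP/extensions/custom/Maps/Maps.php" );
$egMapsScriptPath = "$wgScriptPath/extensions/custom/Maps";
```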

Thanks, 

Mary Beebe
 

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Jeroen De Dauw
Sent: Tuesday, June 14, 2011 4:41 PM
To: Wikimedia developers
Subject: Re: [Wikitech-l] Map not displaying

Hey,

> Is there something else I need to do.

No, it ought to work. Which it does for me, using the same version of Maps,
see http://mapping.referata.com/wiki/User:Jeroen_De_Dauw/displaymaptest

You did add the api key AFTER the inclusion of Maps right? If so, can you
link to the failing map, so I can determine what's going on?

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--


[Wikitech-l] Map not displaying

2011-06-14 Thread Beebe, Mary J
I just installed the maps extension. - Version 0.7.6.1
Mediawiki version 1.15.3
PHP version 5.2.8 (isapi)
MySQL 5.1.40-community

I have {{#display_map:
30° 44'14 N, 76° 47' 14E | service=googlemaps}}

I received a google api key and added that to localSettings after including the 
extension.  It tries to draw the map but stops at Loading Map ...

Is there something else I need to do.

Thanks,

Mary Beebe



[Wikitech-l] WOM Extension

2011-06-09 Thread Beebe, Mary J
I am looking at the WOM extension, but I am not sure if it is the correct
extension for us.  What we want is an API call that takes a wiki page, either
parses or removes the wiki markup, and then sends back an XML version of the
page content.  The regular wiki API gives us pages with wiki markup in them.

We tried something like this:
  api.php?action=womget&page=Somepage&xpath=//*

It did not seem to do what we expected.

Do you mind letting me know if we are looking at the correct extension?

We have a Java application that works with the page content that we get back.
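In case it helps, the core API's action=parse already returns a page's rendered HTML (no wikitext) wrapped in the API's XML envelope, which may cover this without WOM. A hedged sketch of calling it from PHP; the wiki URL is a placeholder:

```php
<?php
// Hedged sketch: fetch a page's rendered HTML (no wikitext) through the
// core MediaWiki API rather than WOM. The wiki URL is a placeholder.
$page = 'Somepage';
$url  = 'http://wiki.example.org/w/api.php?action=parse'
      . '&page=' . urlencode( $page ) . '&format=xml';

$xml = simplexml_load_string( file_get_contents( $url ) );
// The rendered HTML lives at <api><parse><text>...</text></parse>.
echo (string)$xml->parse->text;
```

A Java client could hit the same URL and parse the XML directly, which may fit the workflow described above.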

Thanks,

Mary Beebe



Re: [Wikitech-l] Image.php is deprecated need to replace with something else

2011-01-11 Thread Beebe, Mary J

getLinksTo() seems to be returning no results even though there are image pages
with links to them.  It seems to be a problem with the select statement within
the File class.  I looked at the query, and if I run it within MySQL it
works once I remove the extra quotes.

Are other people having trouble with this method?  

Mary Beebe

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Tim Starling
Sent: Monday, January 10, 2011 6:32 PM
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] Image.php is deprecated need to replace with 
something else


On 11/01/11 08:56, Beebe, Mary J wrote:
> Within one of our older internal extensions, we have these 2
> lines: $targetAsImage = Image::newFromTitle($onePage); 
> $allPagesLinkedToTarget = $targetAsImage->getLinksTo();
> 
> We were trying to get a list of wiki titles that link to an image.
> This does not seem to work anymore  with media wiki 1.15.  Now that
> it is truly deprecated, what should we replace this with?
> 

$targetAsImage = wfFindFile( $onePage );
if ( $targetAsImage ) {
$allPagesLinkedToTarget = $targetAsImage->getLinksTo();
}

wfFindFile() will return false if the image does not exist, so you
need to guard against that case to avoid a fatal error.




[Wikitech-l] Image.php is deprecated need to replace with something else

2011-01-10 Thread Beebe, Mary J
Within one of our older internal extensions, we have these 2 lines:
$targetAsImage = Image::newFromTitle($onePage);
$allPagesLinkedToTarget = $targetAsImage->getLinksTo();

We were trying to get a list of wiki titles that link to an image.  This does 
not seem to work anymore  with media wiki 1.15.  Now that it is truly 
deprecated, what should we replace this with?

We are using the following:
MediaWiki 1.15.3
PHP 5.2.8 (cgi-fcgi)
MySQL 5.1.40-community

Thanks,
Mary Beebe



Re: [Wikitech-l] A Shorturl Service for Wikipedia Projects by Node.js

2010-08-03 Thread j
On 08/03/2010 05:59 PM, Mingli Yuan wrote:
> It uses API call to get pageId by the title, and then convert pageId by base
> 36 to the short url. It is quite simple.
> To reduce the frequency of API call, a simple cache was used.  So far, only
> Chinese and English were supported.

you should consider base32
http://www.crockford.com/wrmg/base32.html
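Either way, the ID-to-token mapping itself is a one-liner; for illustration in PHP (the shortener discussed in this thread is written in Node.js):

```php
<?php
// Illustrative round trip of a page ID through base 36 for a short-URL
// token; the actual service in this thread is Node.js — this just shows
// the idea.
$pageId = 12345678;
$token  = base_convert( (string)$pageId, 10, 36 ); // "7clzi"
assert( intval( $token, 36 ) === $pageId );
```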

j



Re: [Wikitech-l] Broken videos

2010-03-16 Thread j
On 03/16/2010 10:29 AM, Lars Aronsson wrote:
> This past weekend, at the SXSW conference, a new initiative
> was launched to "get video on Wikipedia",
> http://videoonwikipedia.org/
> 
> That sounds like a great idea.
> 
> (I wasn't there, but I was told.)
> 
> But among the first videos to be uploaded since the
> announcement are two that show some construction
> equipment and both break my browser every time I try
> to watch them. How can this be possible with a fully
> updated Mozilla Firefox 3.5.8 on Ubuntu Linux?
> 
> I suppose something went wrong in the OGG encoding,
> but still, browsers should not be fooled by this,
> and/or Wikimedia Commons needs to make sure videos
> are correctly encoded so they can be safely watched.
> 
> I have asked that these two broken videos be removed,
> http://commons.wikimedia.org/wiki/File:6hpPowerTrowel.ogv
> http://commons.wikimedia.org/wiki/File:13hpBoren.ogv
> 

It does not "break" my browser here. What do you mean by "breaks your browser"?

j



Re: [Wikitech-l] Automate uploading files

2009-07-10 Thread Beebe, Mary J
Thank you Chad, this was very helpful.  

If others want to do the same, just look at the method initializeFromUpload()
within includes/specials/SpecialUpload.php. Make sure all of those variables
are set correctly before running processUpload().

Thanks for your help,  

Mary Beebe
 
-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Chad
Sent: Monday, June 29, 2009 1:37 PM
To: Wikimedia developers
Subject: Re: [Wikitech-l] Automate uploading files

On Mon, Jun 29, 2009 at 12:38 PM, Beebe, Mary J wrote:
> I am trying to automate uploading images (or files).  We are currently using 
> MediaWiki version 1.13.2.
>
> I am using the following code:
>            $data=array( 'wpIgnoreWarning'=>'1',
>                'wpDestFile'=>"$bulkImageDirectory/$fname",
>                'wpReUpload'=>'1',
>                'wpUpload'=>'1',
>                'wpUploadFile'=>$fname,
>            );
>            $webrequest=new FauxRequest( $data, true );
>            $uploader=new UploadForm( $webrequest );
>            $uploader->mUploadSize=$zipfile['size'];
>            $uploader->mUploadTempName="$bulkImageDirectory/$fname";
>            $uploader->mOname=$fname;
>            $uploader->mSessionKey='1';
>                        $uploader->processUpload();
>
>  $fname is a relative file name, while "$bulkImageDirectory/$fname" is the
> absolute file name.  The file is located in wiki/images/tmp/bulkImages/.
>
> I receive the error: "The file you uploaded seems to be empty. This might be 
> due to a typo in the file name. Please check whether you really want to 
> upload this file." And the upload form displays on the screen with wpDestFile 
> filled in.
>
> Three questions:
> Do you know what the problem is?

Yes, you need to set mFileSize (not mUploadSize, there's no such thing) and
mSrcName (not sure what to, offhand...). I'd suggest taking a look at
internalProcessUpload() and its various return states for more info.

> Is there another way to automatically upload a file within an extension?

What you're doing is the only way (right now) to really do this. It sucks,
yes. Hopefully the new-upload branch will make this sort of thing easier
and less prone to error.

> Is there documentation on this?
>

Not really :\

-Chad



Re: [Wikitech-l] Proposal: switch to HTML 5

2009-07-08 Thread j
David Gerard wrote:
> 
> A method that doesn't say "your browser sucks" but shows it:
> 
> "You are using Safari without XiphQT. Install the Ogg codecs _here_
> for a greatly improved Wikimedia experience."
> "You are using Internet Explorer. Install the Ogg codecs _here_ for a
> greatly improved Wikimedia experience."
> 
> The first linking to XiphQT, the second to Ogg DirectShow.
> 
Internet Explorer does not support the video tag; installing Ogg
DirectShow filters does not help there.

j



[Wikitech-l] Automate uploading files

2009-06-29 Thread Beebe, Mary J
I am trying to automate uploading images (or files).  We are currently using
MediaWiki version 1.13.2.

I am using the following code:
$data = array(
    'wpIgnoreWarning' => '1',
    'wpDestFile' => "$bulkImageDirectory/$fname",
    'wpReUpload' => '1',
    'wpUpload' => '1',
    'wpUploadFile' => $fname,
);
$webrequest = new FauxRequest( $data, true );
$uploader = new UploadForm( $webrequest );
$uploader->mUploadSize = $zipfile['size'];
$uploader->mUploadTempName = "$bulkImageDirectory/$fname";
$uploader->mOname = $fname;
$uploader->mSessionKey = '1';
$uploader->processUpload();

 $fname is a relative file name, while "$bulkImageDirectory/$fname" is the
absolute file name.  The file is located in wiki/images/tmp/bulkImages/.

I receive the error: "The file you uploaded seems to be empty. This might be 
due to a typo in the file name. Please check whether you really want to upload 
this file." And the upload form displays on the screen with wpDestFile filled 
in.

Three questions:
Do you know what the problem is?
Is there another way to automatically upload a file within an extension?
Is there documentation on this?

Thanks,

Mary Beebe



Re: [Wikitech-l] Subpage titles

2009-01-04 Thread Michael J. Walsh

On 4 Jan 2009, at 19:03, Aryeh Gregor wrote:

> On Sun, Jan 4, 2009 at 11:02 AM, Michael J. Walsh
>  wrote:
>> This really depends. Books normally have the book title in small
>> print in the corner with the chapter title in bigger print in the
>> centre of the page at the beginning of each chapter.
>
> The nature of books is such that you almost inevitably see the cover
> of the book (which contains the title) before you ever look at the
> inside, so you don't necessarily need reminders of the title.  On the
> other hand, web pages can expect many if not most visitors to reach
> any given page by direct links, skipping any material that might have
> given them an earlier clue to the title.  So I'm not sure print is a
> good analogy for what we should do here.

OK, I have better examples.

In any hierarchical file system there is a certain logic to having the
filename, "Part 1", displayed in a (slightly) bigger font than the path,
"Principia Mathematica".

Other internet directories typically display the last part in bigger
print; the Open Directory does this:
http://www.google.com/Top/Regional/Europe/Italy/





Re: [Wikitech-l] Subpage titles

2009-01-04 Thread Michael J. Walsh
> I don't think the last part is necessarily what we want to emphasize,
> anyway.  If you have a title like "Principia Mathematica/Part 1", you
> want to emphasize the first part of the title more, surely.  Having
> the name of the work in small print and the section in large print
> seems backwards to me.

This really depends. Books normally have the book title in small
print in the corner, with the chapter title in bigger print in the
centre of the page at the beginning of each chapter. Not a great
example, I know. I've written a (very rough and ready) JavaScript to
demonstrate how titles would look under my proposal here:

http://en.wikisource.org/wiki/User:Blue-Haired_Lawyer/hierarchy.js



Re: [Wikitech-l] Subpage titles

2008-12-30 Thread Michael J. Walsh

On 30 Dec 2008, at 22:13, Aryeh Gregor wrote:

> On Tue, Dec 30, 2008 at 4:33 PM, Platonides   
> wrote:
>> What about providing " » " as an image with alt="/" ?
>
> 1) That's just horribly ugly when the other solution would be  
> simple enough.
>
> 2) Can we rely on the fact that images are copy-pasted as their alt
> text in all browsers

What about programming the software to replace " » " in links with
"/" before submitting to the database or previewing the page?


[Wikitech-l] Subpage titles

2008-12-29 Thread Michael J. Walsh
This is a proposal to change how the MediaWiki software displays
subpages. This isn't really an issue on Wikipedia, because subpages
in the main namespace are disabled there, but using subpages at
Wikisource is a standard way of dividing up works, leaving only a
table of contents at the root article.

The problem is that this results in page titles like this:

United States Code/Title 35/Chapter 14/Section 151

or even

Nicene and Post-Nicene Fathers: Series II/Volume I/Constantine/The  
Life of Constantine/Book II/Chapter 23

which IMHO looks more like a file system than a user-friendly  
website. I would suggest

United States Code » Title 35 » Chapter 14 » Section 151

or my own favourite

United States Code » Title 35 » Chapter 14 »
Section 151

(With "Section 151" in bigger font.)

This would effectively involve moving the subpages div above the  
title and changing the title from the entire path to just the subpage  
name.
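The split this proposal needs is a one-liner; a sketch (the function name is my own, and the small-print/large-print rendering itself would live in the skin):

```python
def split_subpage_title(title):
    """Split a subpage path into (breadcrumb trail, page name).

    The trail is meant for the small-print line above the heading;
    the final component becomes the large page title.
    """
    *ancestors, leaf = title.split("/")
    return " » ".join(ancestors), leaf

trail, leaf = split_subpage_title(
    "United States Code/Title 35/Chapter 14/Section 151")
# trail == "United States Code » Title 35 » Chapter 14"
# leaf  == "Section 151"
```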


See:
http://en.wikisource.org/wiki/Wikisource:Scriptorium#Subpage_formatting
for the discussion I started on Wikisource and, further down the same
page,
http://en.wikisource.org/wiki/Wikisource:Scriptorium#Subpage_formatting:_some_more_examples
for some formatted versions of the above examples

and
http://en.wikisource.org/wiki/Nicene_and_Post-Nicene_Fathers:_Series_II/Volume_I/Constantine/The_Life_of_Constantine/Book_II/Chapter_23
and
http://en.wikisource.org/wiki/Treaty_on_European_Union/Protocol_on_the_convergence_criteria_referred_to_in_Article_109j_of_the_Treaty_establishing_the_European_Community

for examples of how ugly the current setup can be.

Michael