[Wikitech-l] Templatestyles in refs

2024-05-07 Thread Strainu
Hi folks,

I'm trying to group 2 named references (which are identical as far as
humans are concerned) together. I'm doing it on rowp, but the code is very
close to enwp. One reference is generated from a template via a module
calling Module:Citation/CS1, the other is generated by the same code, but
the calling template is substituted.

The Lua code generating the reference tag is:

frame:extensionTag("ref", refText, { name = citationHash })

(refText is returned by Module:Citation/CS1)

I isolated the problem to the <templatestyles> tag added by
Modul:Citation/CS1 - moving it from inside to outside the <ref>
part solves the issue. More precisely, the parser seems to generate
different strip markers for each invocation, even if the <templatestyles>
content is identical. This problem seems related to the one outlined in
[1].

I have 2 questions related to this:

1. How should I change the Lua code to allow for templatestyles? According
to the Lua reference manual [2], extensionTag is equivalent to a call to
frame:callParserFunction()
<https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#frame:callParserFunction>
with function name '#tag', which suggests to me this should be the correct
invocation with respect to [1].

2. Will a ref with a call to {{citation}} (which is simply a pass-through
to the module) with the exact same parameters as in the module also work?
The use-case is to generate a ref containing the template on substitution
instead of the ton of metadata generated by the module.

Thanks,
   Strainu

[1]
https://www.mediawiki.org/wiki/Help:Cite#Substitution_and_embedded_parser_functions
[2]
https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#frame:extensionTag
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikimedia-l] Re: Outcomes from the March Meeting for the Wikimedia Foundation Board of Trustees

2024-03-27 Thread Strainu
On Wed, 27 Mar 2024 at 12:08, Philip Kopetzky wrote:

> Hi Asaf!
>
> Yes, thanks for pointing this out! Hopefully with regional networks in
> place this will also create a tighter net to catch these kind of questions
> and make affiliate leaders not feel left alone.
> And maybe the movement charter draft even includes an affiliates model
> that also outlines one possible direction of organisational development for
> affiliates? Guess we'll find out soon!
>

That cannot function independently from the recent Board proposals [2],
though...

[2]
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Affiliates_Strategy/Review

Strainu


>
> Best, Philip
>
> On Mon, 25 Mar 2024 at 16:27, Asaf Bartov  wrote:
>
>> On Fri, Mar 22, 2024 at 5:05 PM Philip Kopetzky <
>> philip.kopet...@gmail.com> wrote:
>>
>>> ** We still have no resources to onboard new affiliates. Once a new one
>>> is approved, they are left alone to fend for themselves.
>>>
>>
>> This is not *quite* true. New affiliates are pointed to this page
>> <https://meta.wikimedia.org/wiki/Resources_for_New_Affiliates>[1], which
>> includes a whole set of links to advice and guidelines for new and smaller
>> affiliates.
>>
>> In my observation, however, few new affiliate leaders take the time to
>> avail themselves of these resources, and given the low traction of these
>> written, on-wiki resources, I can easily see how the impression is that new
>> affiliates are "left alone to fend for themselves", with the next point of
>> contact with the Foundation being if and when they apply for a grant.
>>
>> One reasonable conclusion may be that a more engaging format (an online,
>> video-based course?) might achieve more traction in guiding new affiliates
>> after their recognition.
>>
>>A.
>>
>> [1] https://meta.wikimedia.org/wiki/Resources_for_New_Affiliates
>>
>> Asaf Bartov (he/him/his)
>>
>> Lead Program Officer, Community Development Communities
>>
>> Wikimedia Foundation <https://wikimediafoundation.org/>
>>
>> Imagine a world in which every single human being can freely share in the
>> sum of all knowledge. Help us make it a reality!
>> https://donate.wikimedia.org
>> ___
>> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines
>> at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
>> https://meta.wikimedia.org/wiki/Wikimedia-l
>> Public archives at
>> https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/5K3JO2JFHD2EVCDWVLTJBYKV5AG2KBFH/
>> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
>
> ___
> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines
> at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
> https://meta.wikimedia.org/wiki/Wikimedia-l
> Public archives at
> https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/3KRZTAEJ27XDO2IJ3ANKBYSXSQ4A6ET5/
> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/TY2P23QY7OZ2EJGPCPZH354ZEV4BRBER/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Wikimedians of Romania and Moldova annual report

2024-01-26 Thread Strainu
Hey folks,

Here is the annual report for WMROMD. Looking forward to any feedback you
have.

https://meta.wikimedia.org/wiki/Wikimedians_of_Romania_and_Moldova_User_Group/2023

BR,
  Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/X5ORF6DQ6XCVUQKAOMKLNRZGLM2VRUDK/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikitech-l] "Known languages" or similar?

2024-01-07 Thread Strainu
Hi folks,

I'm trying to add a "translate" link to [[:ro:Template:Ill-wd]] (which
indicates a subject by it's wikidata id) and I need to determine the
original language. Is there a way to determine if the current user
prefers/knows some languages except the wiki's own language? I know I can
use the interface language, but for the vast majority of users that's
identical to the content language.

I vaguely remember that Content Translation asked me at some point about
what languages it should use to provide suggestions, but I can't find that
setting now. Does it still exist and is it available somehow from outside
CX?

Are there any other data sources I can use (and which don't ruin caching,
either)?

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Pywikipedia-bugs] [Maniphest] [Closed] T353086: Create a PagePile generator

2023-12-09 Thread Strainu
Strainu closed this task as "Resolved".

TASK DETAIL
  https://phabricator.wikimedia.org/T353086

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, mevo, PotsdamLamb, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, Xqt, jayvdb, 
Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T353086: Create a PagePile generator

2023-12-08 Thread Strainu
Strainu added a comment.


  Already working on this, will put up a patch for review shortly.

TASK DETAIL
  https://phabricator.wikimedia.org/T353086

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, mevo, PotsdamLamb, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, Xqt, jayvdb, 
Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Created] T353086: Create a PagePile generator

2023-12-08 Thread Strainu
Strainu created this task.
Strainu added projects: Pywikibot, Pywikibot-pagegenerators.py.
Restricted Application added subscribers: pywikibot-bugs-list, Aklapper.

TASK DESCRIPTION
  **Feature summary** (what you would like to be able to do and where):
  
  Create a generator (and equivalent CLI option) that, given a PagePile 
<https://pagepile.toolforge.org/> id, can obtain the corresponding Page objects.
  
  **Use case(s)** (list the steps that you performed to discover that problem, 
and describe the actual underlying problem which you want to solve. Do not 
describe only a solution):
  
  There are situations when I process PagePiles without knowing the site 
upfront. In this case, the `-url` option is not enough, as the text output of 
PagePile does not contain Site information.
  
  **Benefits** (why should this be implemented?): Much shorter to give an id 
rather than a full URL

TASK DETAIL
  https://phabricator.wikimedia.org/T353086

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, mevo, PotsdamLamb, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, Xqt, jayvdb, 
Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T352482: APIError: Petscan: No result for source categories

2023-12-03 Thread Strainu
Strainu added a comment.


  Probably a duplicate of https://github.com/magnusmanske/petscan_rs/issues/106

TASK DETAIL
  https://phabricator.wikimedia.org/T352482

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Strainu, Sn1per, Magnus, Xqt, Mpaa, mevo, Aklapper, pywikibot-bugs-list, 
JJMC89, PotsdamLamb, Jyoo1011, JohnsonLee01, SHEKH, Dijkstra, Khutuck, 
Zkhalido, Viztor, Wenyi, Tbscho, MayS, Mdupont, Dvorapa, Altostratus, binbot, 
Avicennasis, mys_721tx, jayvdb, Masti, Alchimista, Krenair
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Wikitech-l] Re: ORES To Lift Wing Migration

2023-09-23 Thread Strainu
Hi folks,

So glad to see the old and new ML teams have an open discussion about this
subject.

I understand that the team might prefer to have several tickets for
different issues, but the discussion about the general approach to the
different models is of interest to many people and is more easily digested
on email. I would suggest to continue discussing the merits of the current
strategy (and not necessarily of a model or another) on email.

* One model per wiki or overall
This is a tough one. :) As a user, I remember how hard it was for Romanian
speakers to complete the training data for damaging/goodfaith and would
prefer to not have to do it again.

However, I'm also worried that some specificities of larger wikis would
creep into the output, leading to reverts that would normally not happen on
my wiki. For instance, articles on smaller settlements are not accepted on enwp, while
they are accepted on rowp. I don't know how to test it myself, and I
haven't seen anything about it in the research.

Another problem is that I'm not sure how the revert-risk score should be
matched against custom damaging/goodfaith thresholds. Are there any
guidelines on this other than "test"?

* Multiple criteria VS a single score
I think the discussion has been very much about reverts, but as Sj said,
each of these scores is a slightly different facet. Is there data
available on the prevalence of other use-cases or is everyone just writing
revert bots?

In the long run, I believe a single model that is good enough can be developed
for revert bots. However, it would be great if there were clear quality
criteria that the community can verify, and if the old models were maintained
for a wiki until we are sure the new model passes those criteria on that
wiki.

A change in hosting should not be the guiding force behind any team's roadmap;
the needs of its users should be.

Have a good weekend,
 Strainu




On Saturday, 23 September 2023, Luca Toscano wrote:
>
>
> On Fri, Sep 22, 2023 at 11:34 PM Aaron Halfaker 
wrote:
>>
>> All fine points.  As you can see, I've filed some phab tasks where I saw
a clear opportunity to do so.
>
> Thanks a lot! We are going to review them next week and decide the next
steps, but we'd like to proceed anyway to migrate ores to ores-legacy on
Monday (this will allow us to free some old nodes that need to be decommed
etc..). Adding features later on to the models on Lift Wing should be
doable, and our goal is to transition away from ores-legacy in a few months
(to avoid maintaining too many systems). The timeline is not yet set in
stone, we'll update this mailing list when the time comes (and we'll follow
up with the remaining users of ores-legacy as well). To summarize: we start
with Ores -> Ores Legacy on Monday, and we'll do Ores Legacy -> Lift Wing
in a second step.
>>
>> >  as mentioned before all the models that currently run on ORES are
available in both ores-legacy and Lift Wing.
>>
>> I thought I read that damaging and goodfaith models are going to be
replaced.  Should I instead read that they are likely to remain available
for the foreseeable future?   When I asked about a community discussion
about the transition from damaging/goodfaith to revertrisk, I was imagining
that many people who use those predictions might have an opinion about them
going away.  E.g. people who use the relevant filters in RecentChanges.
Maybe I missed the discussions about that.
>
> This is a good point, I'll clarify the documentation on Wikitech. Until
models are used we'll not remove them from Lift Wing, but we'll propose to
use Revert Risk where it is suited since it is a model family on which we
decided to invest time and efforts. Basic maintenance will be performed on
the goodfaith/damaging/articlequality/etc.. models on Lift Wing, but we
don't have (at the moment) any bandwidth to guarantee retraining or more
complex workflows on them. This is why we used the term "deprecated" on
Wikitech, but we need to specify what we mean to avoid confusion. Thanks
for the feedback :)
>
>>
>> I haven't seen a mention of the article quality or article topic models
in the docs.  Are those also going to remain available?  I have some user
scripts that use these models and are relatively widely used.  I didn't
notice anyone reaching out. ... So I checked and setting a User-Agent on my
user scripts doesn't actually change the User-Agent.  I've read that you
need to set "Api-User-Agent" instead, but that causes a CORS error when
querying ORES.  I'll file a bug.
>
> Will update the docs as well, as mentioned above we'll keep the current
ORES models available on Lift Wing. Eventually new models will be proposed
by Research and other teams (like Revert Risk), and at that point we (as ML
team) will decide what recommendation to give. Nothing will be removed from
Lift Wing if there are active users on it, but we'll certainly try to
reduce t

[Wikitech-l] Re: ORES To Lift Wing Migration

2023-08-04 Thread Strainu
Hi Chris & ML team,

Good to see LiftWing is finally becoming a reality. There are a few things
in the documentation that I would like to clarify.

1. In [1], the bot owner is encouraged to move to the revertrisk score.
However, in [2], it's explicitly mentioned that the model should not be
used for "Auto-removing edits that a user makes without another editor in
the loop". So, should bot owners currently reverting based on goodfaith and
damaging scores explore the new models? If so, do you have any suggestions
on how to automatically match thresholds between the old and new models?
2. I could not find any reference regarding the ores scores exposed through
other APIs (specifically the RC API [3]). Will those be available going
forward? Under which names?
3. Will it still be possible to (re-)train existing and new models for a
specific wiki? How and when?
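
For illustration, here is a trimmed-down version of the query in [3] (the patrol-related parameters are left out because they need an account with patrol rights); the part I care about is the oresscores property:

```
import requests

params = {
    "action": "query",
    "format": "json",
    "list": "recentchanges",
    "rcprop": "title|timestamp|ids|oresscores|tags",
    "rclimit": 50,
    "rctype": "edit|new",
}
resp = requests.get("https://ro.wikipedia.org/w/api.php", params=params, timeout=30)
for change in resp.json()["query"]["recentchanges"]:
    print(change["title"], change.get("oresscores", {}))
```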

Thanks,
  Strainu

[1]
https://wikitech.wikimedia.org/wiki/ORES#Example:_migrating_a_Bot_from_ORES_to_Lift_Wing
[2]
https://meta.wikimedia.org/wiki/Machine_learning_models/Proposed/Language-agnostic_revert_risk#Users_and_uses
[3]
https://ro.wikipedia.org/w/api.php?action=query&format=json&list=recentchanges&rcnamespace=0%7C4%7C6%7C8%7C10&rcprop=title%7Ctimestamp%7Cids%7Coresscores%7Ctags%7Cpatrolled&rcshow=unpatrolled&rclimit=50&rctype=edit%7Cnew%7Ccategorize

On Thu, 3 Aug 2023 at 17:16, Chris Albon wrote:

> Hi everybody,
>
> TL;DR We would like users of ORES models to migrate to our new open source
> ML infrastructure, Lift Wing, within the next five months. We are available
> to help you do that, from advice to making code commits. It is important to
> note: All ML models currently accessible on ORES are also currently
> accessible on Lift Wing.
>
> As part of the Machine Learning Modernization Project (
> https://www.mediawiki.org/wiki/Machine_Learning/Modernization), the
> Machine Learning team has deployed a Wikimedia’s new machine learning
> inference infrastructure, called Lift Wing (
> https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing). Lift Wing
> brings a lot of new features such as support for GPU-based models, open
> source LLM hosting, auto-scaling, stability, and ability to host a larger
> number of models.
>
> With the creation of Lift Wing, the team is turning its attention to
> deprecating the current machine learning infrastructure, ORES. ORES served
> us really well over the years, it was a successful project but it came
> before radical changes in technology like Docker, Kubernetes and more
> recently MLOps. The servers that run ORES are at the end of their planned
> lifespan and so to save cost we are going to shut them down in early 2024.
>
> We have outlined a deprecation path on Wikitech (
> https://wikitech.wikimedia.org/wiki/ORES), please read the page if you
> are a maintainer of a tool or code that uses the ORES endpoint
> https://ores.wikimedia.org/). If you have any doubt or if you need
> assistance in migrating to Lift Wing, feel free to contact the ML team via:
>
> - Email: m...@wikimedia.org
> - Phabricator: #Machine-Learning-Team tag
> - IRC (Libera): #wikimedia-ml
>
> The Machine Learning team is available to help projects migrate, from
> offering advice to making code commits. We want to make this as easy as
> possible for folks.
>
> High Level timeline:
>
> **By September 30th 2023: *Infrastructure powering the ORES API endpoint
> will be migrated from ORES to Lift Wing. For users, the API endpoint will
> remain the same, and most users won’t notice any change. Rather just the
> backend services powering the endpoint will change.
>
> Details: We'd like to add a DNS CNAME that points ores.wikimedia.org to
> ores-legacy.wikimedia.org, a new endpoint that offers a almost complete
> replacement of the ORES API calling Lift Wing behind the scenes. In an
> ideal world we'd migrate all tools to Lift Wing before decommissioning the
> infrastructure behind ores.wikimedia.org, but it turned out to be really
> challenging so to avoid disrupting users we chose to implement a transition
> layer/API.
>
> To summarize, if you don't have time to migrate before September to Lift
> Wing, your code/tool should work just fine on ores-legacy.wikimedia.org
> and you'll not have to change a line in your code thanks to the DNS CNAME.
> The ores-legacy endpoint is not a 100% replacement for ores, we removed
> some very old and not used features, so we highly recommend at least test
> the new endpoint for your use case to avoid surprises when we'll make the
> switch. In case you find anything weird, please report it to us using the
> aforementioned channels.
>
> **September to January: *We will be reaching out to every user of ORES we
> can identify and working with them to make the migration process as easy as
> possible.
>
> **By January 2024

[Wikitech-l] Maps as article image?

2023-05-26 Thread Strainu
Hey folks,

The maps and article image have been good additions to the multimedia
capabilities of Wikipedia over the last decade, and are widely used on my home wiki.
For now, in Wikipedia the maps are also rendered as images and become
interactive only when clicked on. This makes them potential candidates for
being displayed as article images in articles without a photo.

I would like to find out how complicated it would technically be to make
the article image extension also capable of using map images. I know this
is probably nowhere on the roadmap; I'm only interested in the technical
part of the idea.

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: VisualEditor inserting `<br>`

2023-03-13 Thread Strainu
Hi Robert,

While waiting for VE devs to respond, there are a few things you could do
to narrow down the issue:
1. Check what the diff window shows (click on Publish changes, then on the
popup "Review your changes" in the lower-left corner).
2. Check if there was actually one or more newline(s) (\n) inserted in
wikitext.In this case, the parser probably tries to simplify the wikitext
unless a hooman decided otherwise (this seems in line with the
"alienInline" you see).

Regards,
  Strainu


On Mon, 13 Mar 2023 at 09:37, Robert Vogel via Wikitech-l <wikitech-l@lists.wikimedia.org> wrote:

> Hi everyone!
> Inspired by
> https://www.mediawiki.org/wiki/VisualEditor/Gadgets#Implementing_a_custom_command
> I was trying to add a `<br>` into the VE using this command:
>
> ```
> ve.init.target.getSurface().getModel().getFragment().insertContent( [ {
> type: 'break' }, { type: '/break' } ] );
> ```
>
> While it actually inserted the line break in visual edit mode, there was
> no `<br>` in the wikitext after saving the page or switching to wikitext
> mode within the edit session.
> I also tried to implement the whole "command/tool" in an extension, but
> the behavior was the same. The odd thing is that a `<br>` inserted in
> wikitext mode survives the round trip. The linear data model shows it as
> "alienInline" node then.
>
> Any ideas why the official example didn't work for me?
>
> Greetings,
> Robert
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[pywikibot] Re: Text between two comments?

2023-02-03 Thread Strainu
I had a similar issue with updating templates in WLM template lists. This
function might be good inspiration:
https://github.com/rowiki/wikiro/blob/master/robots/python/pwb/monumente/corroborate_monument_data.py#L153
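
If you'd rather stay within the parser, here is a small self-contained sketch (the marker strings passed in are just placeholders):

```
import mwparserfromhell


def text_between_comments(wikitext, start_marker, end_marker):
    """Return the raw text between two HTML comments, matched by their contents."""
    code = mwparserfromhell.parse(wikitext)
    comments = code.filter_comments()
    start = next(c for c in comments if start_marker in c.contents)
    end = next(c for c in comments if end_marker in c.contents)
    text = str(code)
    return text[text.index(str(start)) + len(str(start)) : text.index(str(end))]
```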

Strainu

On Friday, 3 February 2023, Roy Smith wrote:
> Thanks.
> Sadly, I think treating this as flat text will end up being the most
straight-forward way to do it.
>
>
> On Feb 2, 2023, at 7:03 PM, JJMC89  wrote:
> For similar cases, I have used a regex to find the part marked by
comments and then parse the part between.
> START_END = re.compile(r"^(.*?<!-- start -->)(.*?)(<!-- end -->.*)$",
flags=re.I | re.S)
> m = START_END.search(page.text)
> wikicode = mwparserfromhell.parse(m.group(2))
> # do stuff with wikicode
>
> You may be able to do it with the parser.
> # assume start and end represent comment objects you found from
wikicode.filter_comments()
> start_index = wikicode.index(start)
> end_index = wikicode.index(end)
> inside = wikicode.nodes[start_index:end_index]
>
> On Thu, Feb 2, 2023 at 3:39 PM Roy Smith  wrote:
>>
>> I'm trying to parse DYK prep area templates, for example Template:Did
you know/Preparation area 3.  Unfortunately, these are more like flat text
files than any kind of nicely structured data.  The stuff of interest is
everything between two HTML comments:
>>
>> <!--Hooks-->
>> {{main page image/DYK|image=Melissa Ong.webp|caption=Selfie of Ong,
commonly replicated by the Step Chickens}}
>> * ... that "Step Chickens" on TikTok replace their profile pictures with
an image ''(shown)'' of '''[[Melissa Ong]]''', whom they call "Mother Hen"?
>> * ... that '''[[interfaith greetings in Indonesia]]''' include phrases
from Islam, Christianity, Hinduism, Buddhism, and Confucianism?
>> * ... that '''[[Kimmo Leinonen]]''' helped establish both the [[Finnish
Hockey Hall of Fame]] and the [[IIHF Hall of Fame]]?
>> * ... that the [[Pulitzer Prize for Fiction|Pulitzer Prize]]-winning
novel '[[All the Light We Cannot See]]' contains a sympathetic
[[Nazism|Nazi]]?
>> * ... that a {{Convert|10|ft|m|adj=mid|-tall|0}} '''[[Lady
Rainier|statue of a woman]]''' in [[Seattle]] was commissioned by a local
brewery in 1903?
>> * ... that ...
>> * ... that prior to entering politics, '''[[Herbert Salvatierra]]''' led
a troupe of [[carnival]] ''[[comparsa]]s''?
>> * ... that [[Winston Churchill]] published '''[[Are There Men on the
Moon?|an essay on extraterrestrial life]]''' during the Second World War?
>> <!--HooksEnd-->
>>
>> I can find the comments with Wikicode.filter_comments().  But once I've
found the two delimiting comments, how do I grab the text between them?  Or
is the parser the wrong tool?  Would I do better to treat the content of
the page as flat text and just iterate over it line by line, teasing it
apart with regexes?
>>
>> ___
>> pywikibot mailing list -- pywikibot@lists.wikimedia.org
>> Public archives at
https://lists.wikimedia.org/hyperkitty/list/pywikibot@lists.wikimedia.org/message/XA2Y2ZFSFSLRG5TWHIV5G3QRMAK27H56/
>> To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org
>
> ___
> pywikibot mailing list -- pywikibot@lists.wikimedia.org
> Public archives at
https://lists.wikimedia.org/hyperkitty/list/pywikibot@lists.wikimedia.org/message/4ABOPXJMDIQ7WRBUTI7KTYYE7MKQ6W2U/
> To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org
>
>
___
pywikibot mailing list -- pywikibot@lists.wikimedia.org
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/pywikibot@lists.wikimedia.org/message/A5OPN22WDIBXZ3CHKJMIEXWGTNGZTL4Q/
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org


[Wikimedia-l] Wikimedians of Romania and Moldovan annual report (2022)

2023-01-10 Thread Strainu
Hi folks,

The annual report of the Wikimedians of Romania and Moldova User Group
for the last year is available at
https://meta.wikimedia.org/wiki/Wikimedians_of_Romania_and_Moldova_User_Group/2022

As usual, you can leave feedback on the list or in the report talk page.

Best regards,
  Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/NVPNEZMRZ6QF6EWUEUFUL3IIIGOHHUN2/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org


[Wikimedia-l] Re: Recent press around December Office Action

2023-01-08 Thread Strainu
On Sunday, 8 January 2023, Amir Sarabadani wrote:
> Maybe I'm missing something obvious. Feel free to correct me if I'm
wrong, privately or publicly.

I would be very surprised (and worried, mind you!) to find out that the WMF
has the data needed to *reliably* link users and organizations for
state-sponsored entities. The WMF is actively minimizing the amount of data
it gathers on visitors and limits it to their own sites, while commercial
companies are very creative in finding new ways to track their users
without their consent between various media and domains.

Lacking that reliable identification, making statements related to state
affiliation is almost certainly exposing the foundation to serious legal
liability.

That doesn't mean that I disagree with the idea that this has been poorly
communicated; nitpicking on jargon is not helping send the right message
out. At the very least internal communication should explain that state
"infiltration" is always a risk, regardless of country and project
(remember the case on French Wikipedia a few years ago).

Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/4QLXGY3FLLJR5YSSNE4XTWV6DFTIHCG6/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Pywikipedia-bugs] [Maniphest] [Commented On] T325473: Add support for Extension:PageViewInfo

2022-12-20 Thread Strainu
Strainu added a comment.


  Let's clarify the terminology and separate the concerns here:
  
  - I suggested that, based on the use case, pageview data might be better 
retrieved from the **Wikimedia.org**  REST API ( 
https://wikimedia.org/api/rest_v1/#/ ) which is **different** from the REST API 
on the other sites (see here 
<https://www.mediawiki.org/wiki/Wikimedia_REST_API>). As usual, extensions can 
add their own endpoints (see for instance `/pages/talk` at 
https://en.wikipedia.org/api/rest_v1/, which is missing from MediaWiki).
Confusing, I know.
  - Pageviews are only part of the Wikimedia API, so it is not exposed through 
the `PageViewInfo` extension. From a design point of view, they're totally 
separated and could potentially return different data at least occasionally 
(although I suppose they share a common data source)
  
  I think all three are valuable (local API, Wikimedia REST, MediaWiki REST) 
and should all have wrappers in PWB:
  
  1. PWB should have a `RestAPISite` class and potentially subclasses per 
family (i.e. `WikipediaRestApi`, `CommonsRestApi`,`WikidataRestApi` etc.) + a 
`WikimediaOrgRestApi`. Exact naming is TBD, of course.
  2. to expose pageviews data in the `Page` object the local API should be 
used, just like it's the case for `page_image()` or `coordinates()`. Use of 
REST API should be at most a backup, but I would personally leave the 
implementation of the backup mechanism to the bot builders.
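  
  To illustrate point 2, a minimal query against the local action API as exposed by Extension:PageViewInfo (the title is just an example):
  
```
import requests

params = {
    "action": "query",
    "format": "json",
    "prop": "pageviews",  # added by Extension:PageViewInfo
    "titles": "București",  # example title
}
resp = requests.get("https://ro.wikipedia.org/w/api.php", params=params, timeout=30)
page = next(iter(resp.json()["query"]["pages"].values()))
print(page.get("pageviews", {}))  # dict of per-day view counts
```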

TASK DETAIL
  https://phabricator.wikimedia.org/T325473

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Strainu, Aklapper, pywikibot-bugs-list, binbot, Xqt, PotsdamLamb, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[pywikibot] Re: Page views

2022-12-18 Thread Strainu
Hey,

You're talking about https://m.mediawiki.org/wiki/Extension:PageViewInfo

AFAIK pwb does not have a high level api for this, but I'm sure you can get
the data using the data API or requests directly.
https://doc.wikimedia.org/pywikibot/master/api_ref/pywikibot.data.html#module-data.api

One thing to keep in mind is that, depending on your use case, you might
prefer the Wikimedia pageviews API:
https://wikimedia.org/api/rest_v1/#/Pageviews%20data
This gives you some aggregations of the same data, versus the individual data
you get from the extension.
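
For example, a minimal per-article query against that REST API (the article and
date range are just placeholders):

```
import requests

url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "ro.wikipedia/all-access/all-agents/Bucure%C8%99ti/daily/20221201/20221207"
)
headers = {"User-Agent": "pageviews-example/0.1 (replace with your contact info)"}
resp = requests.get(url, headers=headers, timeout=30)
for item in resp.json().get("items", []):
    print(item["timestamp"], item["views"])
```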

Strainu



On Sunday, 18 December 2022, Bináris wrote:
> Click
https://hu.wikipedia.org/w/index.php?title=Erfurti_latrinabaleset=info
> Ctrl F "Megtekintések száma az elmúlt 30 napban"
> Click on the number next to it
> Gives pop-up diagram
>
___
pywikibot mailing list -- pywikibot@lists.wikimedia.org
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org


[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-26 Thread Strainu
On Tuesday, 25 October 2022, Bináris wrote:
>
>
> Strainu wrote (on 25 Oct 2022, Tue, 19:20):
>>
>> If you're ok with editing a list of titles, petscan [1] is all you need.
>
> Thank you!
> Unfortunately, this does not run in native home wiki, so it lacks the
advantage of seeing the article preview when I push my mouse over the
title, but anyway, it is useful!

Not sure about page pop-ups, but adding the Wikidata label and description to
the output seems like an easy improvement. Patches are welcome:
https://github.com/magnusmanske/petscan_rs/pulls?q=is%3Apr+is%3Aclosed

:)
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-25 Thread Strainu
On Tuesday, 25 October 2022, Bináris wrote:
> Thank you! " editing the wikipage and deleting the unwanted titles" --
yes, this is just what I want, The list may be on a temporary page, and
deleting the unwanted is even better then first marking it. Then I can use
this page as a source for the bot.
> Is there a list of these tools? I am not familiar with Toolforge, how to
find a tool by purpose.

If you're ok with editing a list of titles, petscan [1] is all you need.

If you never used it, here is how I would do it:

1. Generate the initial list of articles: on the "Categories" tab, select
your wiki and your category. On the output tab, select Format plain text.
This will get you the list.

2. Prepare the workspace: After you refresh (in order to reset all fields),
go to the "Other sources" tab and input your list of files in the "Manual
list" field, then just below add the wiki and click on "Do it!". A list of
results will appear, along with a number called psid, which uniquely
identifies the list. For example: "PSID is 23121189". Copy the link and
send it to your users.

3. Each user can then edit the article list, click on "Do it!" and give you
their psid.

4. (Optional) If you want to do more complex operations on the results, set
the output to PagePile and ask the users for the PagePile ID (or the url
they are redirected to when clicking Do it). Then use
https://pagepile.toolforge.org/?menu=filter to combine them.

Now, pywikibot does have some support for petscan, but I believe it does
not include psid. However, it's trivial to scrape the output of the text
output.

Good luck!

Strainu

[1] https://petscan.wmflabs.org/


>
> Strainu wrote (on 25 Oct 2022, Tue, 1:10):
>>
>> There are several tools working with PagePiles that can achieve the same
result, but they are all basically equivalent to editing the wikipage and
deleting the unwanted titles, which doesn't seem to be what you want.
>
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-24 Thread Strainu
On Monday, 24 October 2022, Bináris wrote:
> Hi,
[...]
> By that time, do you know about such service on the tolserver? Or can I
do it myself somehow with Lua?

Hi Binaris,

For Commons files, there is https://pagepile-visual-filter.toolforge.org/
It shouldn't be too complicated to extend it to any list, but it's not
there yet. Maybe ask the maintainer for an extended version?

There are several tools working with PagePiles that can achieve the same
result, but they are all basically equivalent to editing the wikipage and
deleting the unwanted titles, which doesn't seem to be what you want.

HTH,
 Strainu


>
> --
> Bináris
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-14 Thread Strainu
Erica,

There are a lot of emails on this list and Wikimedia-l starting with "you
can find translations of this announcement on meta". I think this is a very
effective way to indicate translations where needed, while keeping the
announcement in a single place. Sending us to a link feels like a catchy
press headline: "you won't believe what's happening! Click here to find out!"

Second, as much as we want to be multilingual, participating in the
technical community without some command of English is basically
impossible. In that respect, the technical audience is not the same as a
general Wikimedia audience.

My 2c,
 Strainu

On Monday, 10 October 2022, Erica Litrenta wrote:
> (Sorry to "hijack" the thread, I am not personally involved in TDF but
since I was the original "messenger", I'm interested in learning more about
Daniel's POV.
> The original email linked to
https://www.mediawiki.org/wiki/Technical_decision_making/Community_representation
.
> While I'm well aware that info a click away is not optimal, I'm
definitely more against walls of text that may be hard to understand for
non-native readers.
> That page was marked for translation instead, and among other things, it
offered exactly the process you are describing, and the second email asked
specifically for recommendations.
> We had /also/ asked for recs to a few dozens colleagues, and none of the
people pinged gave their availability.
> Interested to hear what could have been done differently.)
> On Thu, Oct 6, 2022 at 1:06 PM Daniel Kinzler 
wrote:
>>
>> Am 06.10.2022 um 08:52 schrieb Linh Nguyen:
>>
>> Kunal,
>> I hear you but we only have 3 people who actually put the effort into
applying for the position.  We are appointing people who are at least
trying to help.  If you want to help in the process please feel free to put
your name on the list.
>>
>> The original mail doesn't really make it clear what impact one might
have by joining, or what would be expected of a member. Asking people to
click a link for details loses most of the audience already.
>>
>> One thing that has worked pretty well in the past when we were looking
for people to join TechCom was to ask for nominations, rather than
volunteers. We'd then reach out to the people who were nominated, and asked
them if they were interested.  Self-nominations were of course also fine.
>>
>> Another thing that might work is to directly approach active volunteer
contributors to production code. There really aren't so many really active
ones. Ten, maybe.
>>
>> --
>> Daniel Kinzler
>> Principal Software Engineer, Platform Engineering
>> Wikimedia Foundation
>>
>> ___
>> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
>> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
> --
>
> 
>
>
>
> Erica Litrenta (she/her)
>
> Senior Manager, Community Relations Specialists (Product)
>
> Wikimedia Foundation
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[jira] [Commented] (LOG4J2-3614) Update documentation after LOG4J2-3075

2022-10-07 Thread Strainu (Jira)


[ 
https://issues.apache.org/jira/browse/LOG4J2-3614?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17614084#comment-17614084
 ] 

Strainu commented on LOG4J2-3614:
-

  Note that even though this was just a bugfix and not a new 
feature, I still think a nanosecond example in table 2 of 
[https://logging.apache.org/log4j/2.x/manual/json-template-layout.html] would 
be welcomed.

> Update documentation after LOG4J2-3075
> --
>
> Key: LOG4J2-3614
> URL: https://issues.apache.org/jira/browse/LOG4J2-3614
> Project: Log4j 2
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 2.17.2
>Reporter: Strainu
>Assignee: Volkan Yazici
>Priority: Minor
>
> After LOG4J2-3075 was implemented, the JsonTemplateLayout timestamp resolver 
> documentation should be updated to reflect the new feature.
> It's especially important to do so since the second fragment identifier used 
> here ('S') is different from the one described in, e.g. 
> [https://logging.apache.org/log4j/2.x/manual/layouts.html] (which is 'n'). I 
> don't have enough experience with Log4j to determine if this inconsistency is 
> a bug or not, but an entry in the JsonTemplateLayout  documentation would 
> help.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (LOG4J2-3614) Update documentation after LOG4J2-3075

2022-10-07 Thread Strainu (Jira)


[ 
https://issues.apache.org/jira/browse/LOG4J2-3614?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17614083#comment-17614083
 ] 

Strainu commented on LOG4J2-3614:
-

Sure, here are the extracts from the txt and json logging configs in one 
project. Both aim to display the same timestamp

 

TXT log (working):
{code:java}
<PatternLayout pattern="%d{HH:mm:ss.n}"/>
{code}
JSON log:
{code:java}
<JsonTemplateLayout eventTemplateUri="classpath:log-template.json"/>
{code}
 

log-template.json (working):
{code:java}
{
  "time": {
  "$resolver": "timestamp",
  "pattern": {
"format": "HH:mm:ss.S",
"timezone": "UTC"
  }
  }
} {code}
 

log-template.json (NOT working):
{code:java}
{
  "time": {
  "$resolver": "timestamp",
  "pattern": {
"format": "HH:mm:ss.n",
"timezone": "UTC"
  }
  }
} {code}
Notice how in the json config I had to use _S_ to display nanoseconds, 
while in text I used _n_.

> Update documentation after LOG4J2-3075
> --
>
> Key: LOG4J2-3614
> URL: https://issues.apache.org/jira/browse/LOG4J2-3614
> Project: Log4j 2
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 2.17.2
>Reporter: Strainu
>Assignee: Volkan Yazici
>Priority: Minor
>
> After LOG4J2-3075 was implemented, the JsonTemplateLayout timestamp resolver 
> documentation should be updated to reflect the new feature.
> It's especially important to do so since the second fragment identifier used 
> here ('S') is different from the one described in, e.g. 
> [https://logging.apache.org/log4j/2.x/manual/layouts.html] (which is 'n'). I 
> don't have enough experience with Log4j to determine if this inconsistency is 
> a bug or not, but an entry in the JsonTemplateLayout  documentation would 
> help.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (LOG4J2-3614) Update documentation after LOG4J2-3075

2022-10-06 Thread Strainu (Jira)
Strainu created LOG4J2-3614:
---

 Summary: Update documentation after LOG4J2-3075
 Key: LOG4J2-3614
 URL: https://issues.apache.org/jira/browse/LOG4J2-3614
 Project: Log4j 2
  Issue Type: Bug
  Components: Documentation
Affects Versions: 2.17.2
Reporter: Strainu


After LOG4J2-3075 was implemented, the JsonTemplateLayout timestamp resolver 
documentation should be updated to reflect the new feature.

It's especially important to do so since the second fragment identifier used 
here ('S') is different from the one described in, e.g. 
[https://logging.apache.org/log4j/2.x/manual/layouts.html] (which is 'n'). I 
don't have enough experience with Log4j to determine if this inconsistency is a 
bug or not, but an entry in the JsonTemplateLayout  documentation would help.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[Wikitech-l] A random thank you to the Wikimedia tech community

2022-08-30 Thread Strainu
Hi all,

At the risk of being off-topic, I want to express my gratitude to
all the members of the Wikimedia tech community for being such a
supportive and helpful group! Not only here, but on all communication
channels.

It's been years since one of my questions (most of which could be
classified as obscure) has gone unanswered. Also, I recently had to go
through all my emails since June and I noticed that except for a few
announcements and obvious spam, all other threads had at least
one answer. For me, this is the sign of a great community to be in.

Thanks again and keep up the good work!
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


[Wikitech-l] Re: 3D models in Commons

2022-07-03 Thread Strainu
On Sunday, 3 July 2022, Derk-Jan Hartman wrote:
> You mean 3d models besides the type we already support ?
> https://www.mediawiki.org/wiki/Extension:3D

Yes, specifically the formats that support textures. There is a ticket list
in phab: https://phabricator.wikimedia.org/maniphest/query/ZgJlL4Jm.OCj/#R

Strainu
>
> On 3 Jul 2022, at 11:49, Strainu  wrote:
> Hey folks,
>
> I know it's a bit early in the fiscal year and that's probably why I
can't find anything on wiki, but I understood that the new yearly plan puts
a lot of emphasis on the multimedia features. Does that include making 3D
models usable in our wikis? If there are planned projects related to that
in this fiscal year, would it be possible to get a link to the project page?
>
> I'm asking because there are a lot of cool, freely licensed models out
there just waiting to be imported...
>
> Thank you,
>  Strainu ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] 3D models in Commons

2022-07-03 Thread Strainu
Hey folks,

I know it's a bit early in the fiscal year and that's probably why I can't
find anything on wiki, but I understood that the new yearly plan puts a lot
of emphasis on the multimedia features. Does that include making 3D models
usable in our wikis? If there are planned projects related to that in this
fiscal year, would it be possible to get a link to the project page?

I'm asking because there are a lot of cool, freely licensed models out
there just waiting to be imported...

Thank you,
 Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: namespace names vs interlanguage links

2022-06-05 Thread Strainu
Amir,

In Romanian, this kind of possible but highly unlikely problem is called
"drob de sare" (salt stone). I'll let you use your language skills to find
out why :)

Let the community be and find their own ways to deal with the problem, if
it ever becomes a real one. If someone really wants to link to the Sanskrit
Wikipedia, they can do so using [[:w:sa:...]]. Bugs in Pywikibot can be solved if
you log them - but does Tyap have any robots today?

No need to have users in a new language write in English just because some
problems might occur in certain very particular scenarios.

My 2c,
 Andrei

On Saturday, 4 June 2022, Amir E. Aharoni wrote:
> Hi,
> I've recently discovered that namespace names may have an ambiguity with
interlanguage links: If a namespace name is the same as a language code,
using it in wikitext poses all kinds of challenges.
> Actual example: In the Tyap language (code kcg), the Wikipedia in which
was created a few days ago, the Category namespace is called "Sa:", which
is also the language code and, hence, the interlanguage link code for
Sanskrit.
> So, "Sa" is usable in wikitext, but has all kinds of little issues. For
example, old-style non-Wikidata interlanguage links to Sanskrit from the
Tyap Wikipedia are probably impossible. They are not very likely to be
inserted into articles, but still, it's somewhat conceivable. I also
noticed that it confuses Pywikibot in some ways. And I can imagine other
subtle bugs that it will cause.
> I've asked Tyap speakers whether it's possible to change the word for
"Category" to something else. No—they want to use "Sa". It's legitimate not
to want to change the word for a technical reason.
> So what can be done?
> The editors there told me that it's OK for them to use "[[Category:" in
wikitext, but they would like to see "Sa:" in the title of category pages.
I'm not sure that it's possible: as far as I know, the namespace name
definition in MessagesKcg.php will be used for both things, and if Visual
editor is used to add categories, it will add "[[Sa:". Bots or gadgets can
be used to replace it to "Category", but is looks like an ugly hack.
> Does anyone have better ideas for a robust, comprehensive solution?
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[pywikibot] Adding text in predefined locations

2022-05-04 Thread Strainu
Hello fellow bot writers,

I have a problem which must have been solved a million times before,
but I can't find a full solution. I need to add sections to a talk
page in various locations (depending on the page):
# at the top of the page
# before the first section (i.e. at the top, but after some header)
# at the end of the page, before categories etc.

The first case is trivial and I know about pywikibot.textlib.add_text
which covers the third case, but there doesn't seem to be a ready-made
solution for #2. Do you know any?
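
For reference, case #2 could look something like the hand-rolled sketch below
(not an existing Pywikibot helper; the heading regex is an assumption about
what counts as a section):

```
import re


def insert_before_first_section(text, new_section):
    """Insert new_section just before the first == heading ==; append if none found."""
    match = re.search(r"^=+[^=\n].*=+\s*$", text, flags=re.MULTILINE)
    if match:
        return text[: match.start()] + new_section.rstrip() + "\n\n" + text[match.start():]
    return text.rstrip() + "\n\n" + new_section.rstrip() + "\n"
```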

Thank you,
  Strainu
___
pywikibot mailing list -- pywikibot@lists.wikimedia.org
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org


[Wikimedia-l] Re: Collection / Special:Book usage

2022-04-17 Thread Strainu
On Sunday, 17 April 2022, Tito Dutta wrote:
> Hello,
> This was a very useful tool for the readers. I used it a lot when it was
working.
> User namespace books page category shows 50,000 subpages:
https://en.wikipedia.org/wiki/Category:User_namespace_book_pages
> (Please see language sidebar for other languages)

You could probably go through all the pages in all the equivalent categories
and build a histogram of usage based on page creation time.

Strainu

> Regards.
>
>
> On Sunday, 17 April 2022 at 9:18 PM, Galder Gonzalez Larrañaga <galder...@hotmail.com> wrote:
>>
>> No one or very few use it, because you can't save a book. I had some
teachers in our university courses who used it to download what their
students did, but since the WMF decided to break it, evidently they are not
using it anymore. I repeat: it worked and it was broken on purpose. So now
we have an option to create a book but no actual book can be created,
besides printing it with PediaPress.
>>
>> On 17 Apr 2022 at 09:59, "Amir E. Aharoni" wrote:
>>
>> > On Sun, Apr 17, 2022, 09:29 Strainu  wrote:
>> > >
>> > > The correct question is: does it still do anything of value?
>> > On Sunday, 17 April 2022 at 10:42, Jan Ainali <ainali@gmail.com> wrote:
>> >
>> > Even with all output options broken it is still a decent user
interface for creating and organizing collections of articles.
>>
>> This may well be true, but I'm wondering how much is it *actually* used.
I know I never use it, but it's possible that thousand of other people do.
If it's true, then everything is fine. I can't find a log of its usage, or
a statistics page that shows how often do people use this feature.
>> It currently appears in at least two prominent places:
>> 1. "Create a book" link in the desktop sidebar (in some wikis; I don't
see it in the English Wikipedia, but I do see it in Swedish and Basque).
>> 2. "Extensions used by Wikimedia - Main" group in translatewiki.net,
which means that volunteer localizers are asked to translate it with
(relatively) high priority.
>>
>> If only, say, five people use it in the whole Wikimedia universe, then
perhaps someone should consider downgrading its prominence or maybe
removing it entirely.
>>
>> On translatewiki, I can move it from "Extensions used by Wikimedia -
Main" to "Extensions used by Wikimedia - Advanced" or even to "Extensions
used by Wikimedia - Legacy", but again, before I do this, I'd like to make
sure that it's not actually used by a lot of people.
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬
>>
>> ___
>> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines
at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
https://meta.wikimedia.org/wiki/Wikimedia-l
>> Public archives at
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/ZH47KTOZZA24W5OJN4Z7KJPNQ7ET646J/
>> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/T3KH2SJ7YTZDYOYPKZA6UVADASR2PZGB/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: Collection / Special:Book usage

2022-04-17 Thread Strainu
The correct question is: does it still do anything of value?

Here is what I see on rowiki: Due to severe issues with our existing
system, the Book Creator will no longer support saving a book as a PDF.

There is also a link to PediaPress which doesn't seem to work and I can't
choose any output format.

Regards,
 Strainu

On Saturday, 16 April 2022, Amir E. Aharoni wrote:
> Hi,
> As far as I can see, the Collection extension, which provides the
Special:Book page, is deployed on nearly all Wikimedia wikis.
>
> Is there data that shows how often do people actually use it?
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/W5SSX7AKQR3U4ULYM4P6MVOUGJHVQIOY/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikitech-l] Re: Different cache invalidation rules for similar users?

2022-04-06 Thread Strainu
On Wed, 6 Apr 2022 at 22:55, Krinkle wrote:
>
> On Mon, 4 Apr 2022, at 10:12, Strainu wrote:
>
> Thank you for your responses, folks. The script is a gadget [1], loaded
> and unloaded through the preferences.
>
> Regards,
>Strainu
>
> [1] https://ro.wikipedia.org/wiki/MediaWiki:Gadget-wikidata-description.js
>
>
> This page has a history of two revisions, both 25 Mar, about 10 minutes apart.
>
> Is the reported issue that its last edit [1] was seemingly not applied for 
> some editors? E.g. they kept getting the previous version with the 
> getElementByID error?

No, the issue is that the users would disable the gadget and it would
still be enabled after 12h.

Strainu

>
> -- Krinkle
>
> [1] 
> https://ro.wikipedia.org/w/index.php?title=MediaWiki%3AGadget-wikidata-description.js=revision=14854456=14854443
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Different cache invalidation rules for similar users?

2022-04-04 Thread Strainu
Thank you for your responses folks. The script is a gaget [1], loaded
and unloaded through the preferences.

Regards,
   Strainu

[1] https://ro.wikipedia.org/wiki/MediaWiki:Gadget-wikidata-description.js

On Mon, 4 Apr 2022 at 04:20, Krinkle  wrote:
>
> On Sun, 3 Apr 2022, at 17:57, Strainu wrote:
>
> Hi,
>
> I've recently seen some complaints from 2 users located in the same country 
> that it takes about half a day for the Javascript changes to propagate. Users 
> from different countries but similar user rights don't seem to have this 
> problem.
>
> Is it possible to have different cache invalidation rules for different 
> countries? If not, what else could cause this behavior?
>
>
> It depends on what kind of changes and to what piece of JavaScript code.
>
> My guess would be that this is a change not to deployed software or gadgets 
> or site scripts, but a user script. And that the user script is loaded by URL 
> via importScriptURI or mw.loader.load. And that the URL is non-standard (e.g. 
> not exactly /w/index.php?title=..&action=raw&ctype=text/javascript, but with
> other parameters or different order or different encoding). This means that 
> it is not purged on edits.
>
> In that case, it will stay cached. It might then be that someone near one 
> data center is lucky that the URL is not used there before and sees no cache. 
> Or that near another data center the URL is not popular enough to stay in the 
> CDN and thus falls out before the 7 day expiry despite no observed edit or 
> purge.
>
> To know for sure, I would need to see the specific script edit and how the 
> script is loaded.
>
> -- Krinkle
>
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Different cache invalidation rules for similar users?

2022-04-03 Thread Strainu
Hi,

I've recently seen some complaints from 2 users located in the same country
that it takes about half a day for the Javascript changes to propagate.
Users from different countries but similar user rights don't seem to have
this problem.

Is it possible to have different cache invalidation rules for different
countries? If not, what else could cause this behavior?

Thanks,
  Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikimedia-l] Re: Form 990 clarification request (for the attention of WMF accounts staff)

2022-03-05 Thread Strainu
Andreas, I understand this email won't address your more serious questions,
but I believe it's fair to point out that the average salary will tell you
nothing relevant. Without drilling down on job family, your results will be
skewed by outliers.

I can name off the top of my head 10 people working at the Foundation in
2019 who I believe could get half-a-million-dollar offers from software
companies in the Bay Area (that's $500,000 per year before tax). While it's
likely the Foundation doesn't pay this much, they're probably not paying at
a 50% discount either.

It's also worth asking whether the salary costs include other types of
compensation, such as visa support or relocation costs.

Also, maybe a lawyer could answer some of the questions the WMF won't answer,
as they are familiar with Form 990 and the "tricks" of filling it in.

Strainu


On Thursday, 3 March 2022, Andreas Kolbe  wrote:
> Dear all,
>
> To bring some sort of closure to this thread about Wikimedia salary
costs, Wikimedia CEO Maryana Iskander did eventually post a response on
Meta.[1] My question and her reply are copied in full below.
>
> What please was the 2019 salary cost per WMF employee, per the most
recent Wikimedia Foundation Form 990?
>
> According to the linked Form 990, the WMF had salary costs of $55,634,913
(page 1, line 15, "Salaries, other compensation, employee benefits") in
2019, and a total of 291 employees (page 1, line 5). On the face of it,
this makes for an average salary cost of over $191K per employee.
>
> Is this the correct figure, or if not, what is the correct calculation
for the average salary cost per employee in 2019? Are there estimates for
more recent years? Thanks, --Andreas JN466 01:04, 17 February 2022
(UTC)[reply]
>
> Hi Andreas - I am six weeks into the job and have seen your questions
about salaries at the Wikimedia Foundation in various public forums. I
would like to try and give you a response. What interests me most is
understanding the motivations for your questions so that I can attempt to
share appropriate information. You are welcome to contact me directly at
miskander[at]wikimedia.org for a conversation as I won’t respond further
here. What I can share is the following: Calculating an average salary based
on the Form
990 is highly misleading. It produces totals that match our highest-paid
employees, as you see on the 990 form. This is true of many organisations,
not only the Wikimedia Foundation. As we will not release non-public salary
information in public forums, we accept that this number is much higher
than the true average salary. We currently have over 500 staff all over the
world that are in a wide variety of job types and levels, each of which are
paid differently and by location. An average is difficult to calculate and
while it may provide a data point, it lacks meaning for evaluating our
performance as an organisation. An average salary cost, even based on
non-public data, is not useful for most of the issues that concern me most.
We hire in over 50 countries, which is a reflection of our values as a
global movement, but introduces complexity in ensuring we can offer
competitive packages that will attract mission-driven talent, and
especially engineers who we need to support the technology obligations of
the Foundation. People are the biggest investment we make in supporting the
Wikimedia projects and community, so this is a topic of critical importance
to me. Finally, I have also checked that we are in line with other open
knowledge organisations (e.g., Mozilla, Creative Commons, EFF) in the
financial, salary, budget, and staff information that we
publish. MIskander-WMF (talk) 14:54, 17 February 2022 (UTC)[reply]
> I'll just leave some general comments on Maryana's response here.
> 1. An organisation committed to transparency shouldn't give a friendly or
beholden inquirer any different information than a hostile one in response
to questions of fact. In both cases, the information should simply be
accurate. I have no desire to ingratiate myself.
> 2. As for my motivation, it's surely one that any Wikipedian can relate
to: I would like the public to have access to accurate information. I
sometimes write about these topics[2][3] and assist journalists with
related research.
> 3. I don't accept that calculating an average for 291 employees produces
a figure that matches "our highest-paid employees". On the contrary, it
produces a figure for ALL "employees" in the strict sense of the word
(excluding freelancers). Even factoring in freelancers, the 291 employees
listed on the Form 990 were by far the majority of the total n

[Pywikipedia-bugs] [Maniphest] [Commented On] T301908: Drop support for Python 3.5

2022-02-24 Thread Strainu
Strainu added a comment.


  In T301908#7734951 <https://phabricator.wikimedia.org/T301908#7734951>, @Xqt 
wrote:
  
  > In T301908#7734698 <https://phabricator.wikimedia.org/T301908#7734698>, 
@Strainu wrote:
  >
  >> I continue to believe we need to support a version at least a year after 
Cloud Services have moved away from it, as described in 
https://lists.wikimedia.org/hyperkitty/list/pywiki...@lists.wikimedia.org/message/BSTDB6JYJ74DE3BTNWND4BAROLIXJTW5/
  >>
  >> That would mean mid-2023 if I understood correctly the latest news about 
the Debian upgrades.
  >
  > If I follow this your proposal:
  >
  > 1. two years after official support has ended (e.g. 7 years after launch) 
AND
  > 2. one year after Toolforge moved to a newer Python version. AND
  > 3. the percentage of users of a version goes under 5%
  >
  > 1. The official support has ended Sept. 2020; two additional years will be 
Sept 2022. Dropping Python 3.5 is not planed for an earlier date.
  > 2. Toolforge migration has already started.
  > 3. The percentage of users is already below 0,3 % except for Toolforge but 
this will decrease rapidly if the Python base is changed.
  >
  > I don't want to fight for a few weeks or months. Release Pywikibot 7 was 
also postponed for few months. But I want to promote a change as early as even 
possible but this is also a long term process so that it can be planned.
  
  OK, I checked the announcement 
<https://lists.wikimedia.org/hyperkitty/list/wikitec...@lists.wikimedia.org/message/EPJFISC52T7OOEFH5YYMZNL57O4VGSPR/>
 and it seems that the final shutdown of Stretch is planned to happen around 
June (although I suspect it will get delayed a little, but probably not until 
October). Let's get some data after that happens and see if there is anyone 
still using 3.5 from the Cloud Services (which would mean they explicitly need 
it) and if not then we can keep your proposed date.
  
  If there are still 3.5 users after June, we can still keep the date, but we 
should arrange some kind of notification (not sure who could do that, or in
what circumstances).
  
  How does that sound?
  
  > Please take into account that other packages that are mandatory for 
Pywikibot need Python 3.6 already; for example the central http interface 
`requests` has dropped Python 3.5 support 7 months ago and Python 3.6 support 
will be dropped this year.
  
  It's not immediately obvious to me what's the concern here. What's forcing us 
to increase the required requests version?

TASK DETAIL
  https://phabricator.wikimedia.org/T301908

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Majavah, Strainu, Mpaa, Framawiki, Dvorapa, Dalba, Meno25, valhallasw, 
Multichill, Rubin16, Basilicofresco, kscanne, Larske, Lee, Huji, Salween, 
JJMC89, Legoktm, matej_suchanek, Aklapper, Xqt, pywikibot-bugs-list, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T301908: Drop support for Python 3.5

2022-02-24 Thread Strainu
Strainu added a comment.


  I continue to believe we need to support a version at least a year after 
Cloud Services have moved away from it, as described in 
https://lists.wikimedia.org/hyperkitty/list/pywiki...@lists.wikimedia.org/message/BSTDB6JYJ74DE3BTNWND4BAROLIXJTW5/
  
  That would mean mid-2023 if I understood correctly the latest news about the 
Debian upgrades.

TASK DETAIL
  https://phabricator.wikimedia.org/T301908

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Majavah, Strainu, Mpaa, Framawiki, Dvorapa, Dalba, Meno25, valhallasw, 
Multichill, Rubin16, Basilicofresco, kscanne, Larske, Lee, Huji, Salween, 
JJMC89, Legoktm, matej_suchanek, Aklapper, Xqt, pywikibot-bugs-list, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


Re: [Talk-ro] Proposed automated edit: change opearator=R.A.D.E.T. Constanta to operator=Termoficare Constanța for the heating substations in Constanța

2022-02-15 Thread Strainu
Agreed.

Strainu

On Tuesday, 15 February 2022, hazelnot via Talk-ro wrote:
> At the moment, the heating substations in Constanța are tagged
operator=R.A.D.E.T. Constanta or have no operator tag at all. RADET
Constanța went bankrupt in 2021 and was restructured as Termoficare
Constanța S.R.L.
>
> I propose changing the operator tag to operator=Termoficare Constanța
for all heating substations in the city mapped on OSM, fixing the outdated
tags and adding the tag to the substations that lack it.
>
> To do this, I will use JOSM and the Overpass API: I will select all the
nodes in the city tagged industrial=heating_station and add or change the
operator tag as appropriate, then select all the ways with the same tag
(industrial=heating_station), except CET Palas, which depends on
Electrocentrale Constanța, and add or change the operator tag as
appropriate.
>
> 137 ways and 25 nodes will be changed. As far as I understand there are
over 200 heating substations in the city, but not all of them exist on OSM.
>
> I will also post the proposal on the Talk-ro mailing list, and the
documentation can be found at
https://wiki.openstreetmap.org/wiki/Mechanical_Edits/hazelnot/Changing_opearator%3DR.A.D.E.T._Constanta_to_operator%3DTermoficare_Constan%C8%9Ba_for_heating_substations_in_Constan%C8%9Ba
>
> If there are no objections, the change will take place on Friday, 18
February 2022, and I take responsibility in case something does not go well.
>
>
>
> ___
> Talk-ro mailing list
> Talk-ro@openstreetmap.org
> https://lists.openstreetmap.org/listinfo/talk-ro
>
___
Talk-ro mailing list
Talk-ro@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk-ro


[Wikitech-l] Re: Is there still a maximum page size in effect?

2022-02-05 Thread Strainu
On Sat, 5 Feb 2022 at 19:53, Andre Klapper  wrote:
>
> On Sat, 2022-02-05 at 18:43 +0200, Strainu wrote:
> > I am aware of the various limits in the NewPP report. I'm trying to
> > determine if we currently we have some max page size (before or after
> > processing).
> >
> > The documentation on mw.org and en.wp is a bit confusing on the
> > subject and personal experimentation shows that substituting
> > templates allows me to go past the 2MiB page size.
>
> What does "personal experimentation" mean exactly? There might be
> exceptions like https://phabricator.wikimedia.org/T188852 but generally
> speaking, as neither
> https://noc.wikimedia.org/conf/InitialiseSettings.php.txt nor
> https://noc.wikimedia.org/conf/CommonSettings.php.txt seem to change
> the MediaWiki software default setting defined in
> https://phabricator.wikimedia.org/source/mediawiki/browse/master/includes/DefaultSettings.php$2673
> I'd assume that we're at 2MiB.

Hey Andre,

Thanks for taking the time to respond to my curiosity during the weekend.

Here is what I experimented with:
* for post-parser size (which would have been my first guess given
that for templates we count the *Post‐expand include size*) I just
measured the size of the .mw-parser-output div. For this, I took the
output of [1] (which is a mix of included and substituted templates
and is displayed just fine) and it was well over 8MiB.
* for pre-parser size, I took the same page [1], added some more
templates and started saving while substituting them (e.g. I have 1.5
MiB of text and a few thousand templates which I substitute in one go).
One version which goes over the 2MiB limit is [2].

Now, because of the way those templates are written (pretty verbose
and with a lot of whitespace), the difference between the template and
the result of the substitution is small, so the page size is just over
the limit. However, one can imagine a template expanding to near the
2MiB limit a couple of times, which could take a page to ~4MiB without
much effort.
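
For reference, the first measurement can be reproduced from a script; this
is only a rough sketch using the action=parse API, with the revision id
taken from [1]:

    import requests

    API = "https://ro.wikipedia.org/w/api.php"

    r = requests.get(API, params={
        "action": "parse",
        "oldid": 14712585,   # the revision linked as [1] below
        "prop": "text",
        "format": "json",
        "formatversion": 2,
    })
    # parse.text is the parser output HTML, roughly the .mw-parser-output div
    html = r.json()["parse"]["text"]
    print("Rendered HTML: {:.2f} MiB".format(len(html.encode("utf-8")) / 1024 / 1024))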

I understand from your message that these are bugs and the limit is
still enforced. However, I still don't understand the logic in having
a limit for wikitext in pages, but a limit for post-expand (if I
understand correctly, that is after they go through the parser) in
templates. Why not have a single limit set to something like 8-12-16
MiB and counted after all the processing is done?

Thanks again,
   Strainu

[1] 
https://ro.wikipedia.org/w/index.php?title=Bunuri_mobile_din_domeniul_%C8%99tiin%C8%9Bele_naturii_clasate_%C3%AEn_patrimoniul_cultural_na%C8%9Bional_al_Rom%C3%A2niei_aflate_%C3%AEn_municipiul_Bucure%C8%99ti_(tezaur)&oldid=14712585
[2] 
https://ro.wikipedia.org/w/index.php?title=Utilizator:Strainu/2&oldid=14784282


>
> See also https://phabricator.wikimedia.org/T189108 about requesting an
> increase, and https://phabricator.wikimedia.org/T181907#3835654
> for some more background.
>
> Cheers,
> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Is there still a maximum page size in effect?

2022-02-05 Thread Strainu
Hi,

I am aware of the various limits in the NewPP report. I'm trying to
determine if we currently have some max page size (before or after
processing).

The documentation on mw.org and en.wp is a bit confusing on the subject and
personal experimentation shows that substituting templates allows me to go
past the 2MiB page size.

If we don't have such a limit, I'm not exactly sure why we need the
Post‐expand include size limit? Why is output generated by transclusion
harder on the parser than output directly in the page?

Thanks,
 Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikidata-bugs] [Maniphest] T300209: Wrong label sort order in Wikidata Query Service

2022-01-26 Thread Strainu
Strainu created this task.
Strainu added a project: Wikidata-Query-Service.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  When sorting on a label, the sort order is the default (English?) one, 
instead of the sort order specific to the language specified by `SERVICE 
wikibase:label`. This affects the usability of results in wiki lists.
  
  Example query: https://w.wiki/4kNf
  
  Expected result: entries Șibot, Șona, Șpring and Șugag should have been 
placed between Săsciori and Teiuș instead of at the end, because the label 
language was specified as "ro".
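
  A client-side workaround sketch (assuming the ro_RO.UTF-8 locale is
installed on the machine running the query; PyICU would be a more portable
alternative) is to re-sort the returned labels with a Romanian collation:

    import locale

    # Raises locale.Error if the ro_RO.UTF-8 locale is not installed.
    locale.setlocale(locale.LC_COLLATE, "ro_RO.UTF-8")

    labels = ["Teiuș", "Șibot", "Săsciori", "Șona", "Șpring", "Șugag"]
    print(sorted(labels))                      # codepoint order: Ș entries sort last
    print(sorted(labels, key=locale.strxfrm))  # Romanian order: Ș between S and T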

TASK DETAIL
  https://phabricator.wikimedia.org/T300209

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, Strainu, MPhamWMF, CBogen, Namenlos314, Gq86, 
Lucas_Werkmeister_WMDE, EBjune, merbst, Jonas, Xmlizer, jkroll, Wikidata-bugs, 
Jdouglas, aude, Tobias1984, Manybubbles
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikimedia-l] Re: "content was" when deleting pages - is it useful?

2022-01-17 Thread Strainu
On ro.wp, we empty the summary when the reason is "Obscene content"
and leave it otherwise. For me, it used to be useful as a quick check
on admins. However, now that many deletions are made through Twinkle
and infoboxes are ubiquitous (taking up most of the displayed text),
this is less useful.

Strainu

On Mon, 17 Jan 2022 at 16:19, Amir E. Aharoni
 wrote:
>
> Hallo!
>
> There's an old MediaWiki feature: When an administrator deletes a page, a bit 
> of its content is automatically added to an edit summary. This is later 
> viewable in deletion logs.
>
> If you edit in the English, German, or Italian Wikipedia, then you haven't 
> actually seen this feature in years, because administrators in these wikis 
> essentially removed it by locally blanking the system messages that make it 
> work.
>
> In many other wikis, however, this feature is still working.
>
> Is it actually useful? Or should it perhaps be removed?
>
> Here's a Phabricator task about it:
> https://phabricator.wikimedia.org/T299351
>
> If you have an opinion, weigh in there or here.
>
> Thanks!
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
> ___
> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> Public archives at 
> https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/4ZONY3L5LEPO45POJ2SWTPHKFFIJ63UR/
> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/4DAJNQ2KFY2GQGBZ76LVSZ7XLP6PVMWH/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: Luis Bitencourt-Emilio Joins Wikimedia Foundation Board of Trustees

2022-01-14 Thread Strainu
Hello,

On Fri, 14 Jan 2022 at 03:40, Luis Bitencourt-Emilio
 wrote:
>
> apologies to all for my late response.

Calling a next-day response "late" should gain you a goodwill point or two
with some of us. :) Welcome and good luck in helping our CTOs!

>
> PS: I see a separate conversation has emerged relating to blockchains and my 
> interests in that field. I want to clarify that I don’t work professionally 
> in this field, and while I’m historically an early adopter of technology - in 
> the same way I adopted the internet in the 90s - I share many of the same 
> thoughts and questions about this new technology’s future that have been 
> raised in this thread. As a new Trustee, first and foremost, I am here to 
> learn and to hear more from all of you.

As always in our communities, negative feedback is far more visible.
But as Dariusz said, the community is divided on this. I personally
think, like Yair and Chris, that blockchains are not in opposition
with the movement values.

Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/ZSVSQQJ3E5FVANJ73YD3WC74ZGCVABDQ/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] Re: [Wikitech-l] Re: Re: Re: Uplifting the multimedia stack (was: Community Wishlist Survery)

2022-01-12 Thread Strainu
On Tue, 11 Jan 2022 at 08:01, Kunal Mehta  wrote:
>
> So I think the status quo can be changed by just about anyone who is
> motivated to do so, not by trying to convince the WMF to change its
> prioritization, but just by doing the work. We should be empowering
> those people rather than continuing to further entrench a WMF technical
> monopoly.
>

Counterexample:
https://lists.wikimedia.org/hyperkitty/list/wikitec...@lists.wikimedia.org/message/G2QTRJFAUKLE45SFTFUHOOTOBR6G3DP3/
(this was the situation that I quoted in my first email on this thread
as the WMF refusing to even do reviews).

Maybe it's just the multimedia part that is in this desperate
situation, but I can totally see volunteer developers getting
discouraged quickly if their patches are outright ignored.

Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/DU6BEXDGDN3UZLCKPR6LN7KIV45MPRRH/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikitech-l] Re: [Wikimedia-l] Re: Re: Uplifting the multimedia stack (was: Community Wishlist Survery)

2022-01-12 Thread Strainu
On Tue, 11 Jan 2022 at 08:01, Kunal Mehta  wrote:
>
> So I think the status quo can be changed by just about anyone who is
> motivated to do so, not by trying to convince the WMF to change its
> prioritization, but just by doing the work. We should be empowering
> those people rather than continuing to further entrench a WMF technical
> monopoly.
>

Counterexample:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/message/G2QTRJFAUKLE45SFTFUHOOTOBR6G3DP3/
(this was the situation that I quoted in my first email on this thread
as the WMF refusing to even do reviews).

Maybe it's just the multimedia part that is in this desperate
situation, but I can totally see volunteer developers getting
discouraged quickly if their patches are outright ignored.

Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: [Wikimedia-l] Uplifting the multimedia stack (was: Community Wishlist Survery)

2021-12-30 Thread Strainu
> So where is the best current place to discuss scaling Commons, and all
that entails?

My impression is that we don't have one. All we hear is "it needs to be
planned", but there is no transparency on what that planning involves or
when it actually happens.

> I'd be surprised if the bottleneck were people or budget

The main problem I see is that we end up in this kind of situation. Scaling
and bug fixing critical features should be part of the annual budget. Each
line of code deployed to production wikis should have an owner and
associated maintenance budget each year. Without this, the team will not
even commit to reviews - see the thread on wikitech a few months back where a
volunteer programmer willing to work on Upload Wizard was basically told
"We will not review your code. Go fork."

> Some examples from recent discussions

Also improvements to the Upload Wizard. There are quite a few open items in
Phab on this.

I really hope you will have better luck than others with bringing this
issue up in the priority list for next year - multimedia support is growing
more outdated by the minute.

Strainu

On Thursday, 30 December 2021, Samuel Klein  wrote:
> Separate thread.  I'm not sure which list is appropriate.
> ... but not all the way to sentience.
>
> The annual community wishlist survey (implemented by a small team,
possibly in isolation?) may not be the mechanism for prioritizing large
changes, but the latter also deserves a community-curated priority queue.
To complement the staff-maintained priorities in phab ~
> For core challenges (like Commons stability and capacity), I'd be
surprised if the bottleneck were people or budget.  We do need a shared
understanding of what issues are most important and most urgent, and how to
solve them. For instance, a way to turn Amir's recent email about the
problem (and related phab tickets) into a family of persistent,
implementable specs and proposals and their articulated obstacles.
> An issue tracker like phab is good for tracking the progress and
dependencies of agreed-upon tasks, but weak for discussing what is
important, what we know about it, how to address it. And weak for
discussing ecosystem-design issues that are important and need persistent
updating but don't have a simple checklist of steps.
> So where is the best current place to discuss scaling Commons, and all
that entails?  Some examples from recent discussions (most from the wm-l
thread below):
> - Uploads: Support for large file uploads / Keeping bulk upload tools
online
> - Video: Debugging + rolling out the videojs player
> - Formats: Adding support for CML and dozens of other common high-demand
file formats
> - Thumbs: Updating thumbor and librsvg
> - Search: WCQS still down, noauth option wanted for tools
> - General: Finish implementing redesign of the image table
>
> SJ
> On Wed, Dec 29, 2021 at 6:26 AM Amir Sarabadani 
wrote:
>>
>> I'm not debating your note. It is very valid that we lack proper support
for multimedia stack. I myself wrote a detailed rant on how broken it is
[1] but three notes:
>>  - Fixing something like this takes time, you need to assign the budget
for it (which means it has to be done during the annual planning) and if
gets approved, you need to start it with the fiscal year (meaning July
2022) and then hire (meaning, write JD, do recruitment, interview lots of
people, get them hired) which can take from several months to years. Once
they are hired, you need to onboard them and let them learn about our
technical infrastructure which takes at least two good months. Software
engineering is not magic, it takes time, blood and sweat. [2]
>>  - Making another team focus on multimedia requires changes in planning,
budget, OKR, etc. etc. Are we sure moving the focus of teams is a good
idea? Most teams are already focusing on vital parts of wikimedia and
changing the focus will turn this into a whack-a-mole game where we fix
multimedia but now we have critical issues in security or performance.
>>  - Voting Wishlist survey is a good band-aid in the meantime. To at
least address the worst parts for now.
>>
>> I don't understand your point tbh, either you think it's a good idea to
make requests for improvements in multimedia in the wishlist survey or you
think it's not. If you think it's not, then it's offtopic to this thread.
>> [1]
https://lists.wikimedia.org/hyperkitty/list/wikimedi...@lists.wikimedia.org/message/WMPZHMXSLQJ6GONAVTFLDFFMPNJDVORS/
>> [2] There is a classic book in this topic called "The Mythical Man-month"
>>
>> On Wed, Dec 29, 2021 at 11:41 AM Gnangarra  wrote:
>>>
>>> we have to vote for regular maintenance and support for
essential functions like uploading files which is the core mission of
Wikimedia Commons
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikimedia-l] Re: Uplifting the multimedia stack (was: Community Wishlist Survery)

2021-12-30 Thread Strainu
> So where is the best current place to discuss scaling Commons, and all
that entails?

My impression is that we don't have one. All we hear is "it needs to be
planned", but there is no transparency on what that planning involves or
when it actually happens.

> I'd be surprised if the bottleneck were people or budget

The main problem I see is that we end up in this kind of situation. Scaling
and bug fixing critical features should be part of the annual budget. Each
line of code deployed to production wikis should have an owner and
associated maintenance budget each year. Without this, the team will not
even commit to reviews - see the thread on wikitech a few months back where a
volunteer programmer willing to work on Upload Wizard was basically told
"We will not review your code. Go fork."

> Some examples from recent discussions

Also improvements to the Upload Wizard. There are quite a few open items in
Phab on this.

I really hope you will have better luck than others with bringing this
issue up in the priority list for next year - multimedia support is growing
more outdated by the minute.

Strainu

On Thursday, 30 December 2021, Samuel Klein  wrote:
> Separate thread.  I'm not sure which list is appropriate.
> ... but not all the way to sentience.
>
> The annual community wishlist survey (implemented by a small team,
possibly in isolation?) may not be the mechanism for prioritizing large
changes, but the latter also deserves a community-curated priority queue.
To complement the staff-maintained priorities in phab ~
> For core challenges (like Commons stability and capacity), I'd be
surprised if the bottleneck were people or budget.  We do need a shared
understanding of what issues are most important and most urgent, and how to
solve them. For instance, a way to turn Amir's recent email about the
problem (and related phab tickets) into a family of persistent,
implementable specs and proposals and their articulated obstacles.
> An issue tracker like phab is good for tracking the progress and
dependencies of agreed-upon tasks, but weak for discussing what is
important, what we know about it, how to address it. And weak for
discussing ecosystem-design issues that are important and need persistent
updating but don't have a simple checklist of steps.
> So where is the best current place to discuss scaling Commons, and all
that entails?  Some examples from recent discussions (most from the wm-l
thread below):
> - Uploads: Support for large file uploads / Keeping bulk upload tools
online
> - Video: Debugging + rolling out the videojs player
> - Formats: Adding support for CML and dozens of other common high-demand
file formats
> - Thumbs: Updating thumbor and librsvg
> - Search: WCQS still down, noauth option wanted for tools
> - General: Finish implementing redesign of the image table
>
> SJ
> On Wed, Dec 29, 2021 at 6:26 AM Amir Sarabadani 
wrote:
>>
>> I'm not debating your note. It is very valid that we lack proper support
for multimedia stack. I myself wrote a detailed rant on how broken it is
[1] but three notes:
>>  - Fixing something like this takes time, you need to assign the budget
for it (which means it has to be done during the annual planning) and if
gets approved, you need to start it with the fiscal year (meaning July
2022) and then hire (meaning, write JD, do recruitment, interview lots of
people, get them hired) which can take from several months to years. Once
they are hired, you need to onboard them and let them learn about our
technical infrastructure which takes at least two good months. Software
engineering is not magic, it takes time, blood and sweat. [2]
>>  - Making another team focus on multimedia requires changes in planning,
budget, OKR, etc. etc. Are we sure moving the focus of teams is a good
idea? Most teams are already focusing on vital parts of wikimedia and
changing the focus will turn this into a whack-a-mole game where we fix
multimedia but now we have critical issues in security or performance.
>>  - Voting Wishlist survey is a good band-aid in the meantime. To at
least address the worst parts for now.
>>
>> I don't understand your point tbh, either you think it's a good idea to
make requests for improvements in multimedia in the wishlist survey or you
think it's not. If you think it's not, then it's offtopic to this thread.
>> [1]
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/WMPZHMXSLQJ6GONAVTFLDFFMPNJDVORS/
>> [2] There is a classic book in this topic called "The Mythical Man-month"
>>
>> On Wed, Dec 29, 2021 at 11:41 AM Gnangarra  wrote:
>>>
>>> we have to vote for regular maintenance and support for
essential functions like uploading files which is the core mission of
Wikimedia Commons
___
Wikimedia-l mailing list --

[Wikitech-l] Limit to the number of images in a page?

2021-11-29 Thread Strainu
Hi,

I have some wikipages with a large number of images (1000+). Those
pages never load completely, as upload.wikimedia.org starts returning
429 Too many requests after a while.

This limit does not seem to be documented on mediawiki.org, so I would
like to know what its exact value is and if there is a way to work
around it (except for splitting the pages).
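
One rough way to estimate the threshold from a script, assuming the throttle
applies per client over a short time window (the thumbnail URL below is only
a placeholder; substitute URLs from an affected page), would be:

    import time
    import requests

    # Placeholder thumbnail URL; substitute real thumbnail URLs from the page.
    THUMB_URL = ("https://upload.wikimedia.org/wikipedia/commons/thumb/"
                 "a/a9/Example.jpg/120px-Example.jpg")

    ok = 0
    start = time.monotonic()
    with requests.Session() as session:
        for _ in range(2000):
            response = session.get(THUMB_URL)
            if response.status_code == 429:
                print("429 after {} requests in {:.1f}s, Retry-After: {}".format(
                    ok, time.monotonic() - start,
                    response.headers.get("Retry-After")))
                break
            ok += 1
        else:
            print("No 429 after {} requests".format(ok))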

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


[pywikibot] Re: Deprecation policy?

2021-11-13 Thread Strainu
Hi Huji,

Thanks for your message, it's very thoughtful and raises many valid points.
The three year limit for deprecation should be trivial to enforce using git
hooks and I can certainly help put it in place.

Now for the Python support. I totally support setting up checks and tests,
although I don't consider them critical. If incompatible code does pass
through review (it has happened before during GSOC) I feel confident the
community will catch that in a few days or weeks. If we can have
automation, all the better.

Also, I don't have any bias against a single numerical limit myself. The
algorithm I proposed was based on the issues raised in the latest
deprecation discussion ( https://phabricator.wikimedia.org/T286867 ). If
all these can be handled in a single limit, then that is clearly preferable
and I support it. Note that the limit might well be larger than 7 years in
that case.

Regards,
Strainu

On Saturday, 13 November 2021, Huji Lee  wrote:
> Hi Strainu,
> Thanks for bringing this up. And sorry to hear that many have had
negative experiences with recent removals of deprecated code (though I had
nothing to do with it).
> The Pywikibot project is barely maintained; a small community of
interested folks are pushing it forward, and most code gets merged without
review. I don't think we can claim, in good conscience, that there are
folks who still keep a full architectural view of all elements of this
project (as is the case with MediaWiki and many other tools we use daily).
> The point is: if the framework is too difficult to execute, it won't
happen. So, although I like your proposed framework, I find it unrealistic
to happen. It would assume someone is pulling data on Python version usage,
and they would be involved in code review in such a way to ensure that code
incompatible with legacy Python versions is not introduced too soon.
> I would encourage us to come up with a framework that is easier to
maintain, and mostly automatable. Maybe we should just use the "7 year"
rule, make sure that we have a CI pipeline that tries to build Pywikibot
for all Python version that are currently within that 7 years, and at the
top of the CI pipelines' code we have a comment that educates the users who
may decide to edit it when the 7 year period ends for each existing version
of Python. This way, incompatible changes would just fail on CI, and the
enforcement of rules is quite simple, and breaking them by accident (such
as by someone unknowingly modifying the CI pipeline) is less likely. This
was just an idea, and may not be the right way, but I hope it shows what I
mean by simplifying matters.
> Similarly, maybe we should find a way to create a CI pipeline that would
check for "@deprecated" methods, and cross-checks it with a reference JSON
file in which each deprecated function is listed along with its deprecation
date. If someone tries to remove the function before that date + 3 months,
the CI would fail. If someone tries to add "@deprecated" without updating
the JSON file, the CI would fail. This way, we have one single source of
truth about when things were marked as deprecated (the JSON file) and an
easy mechanism to track when deprecated methods are getting removed. Again,
just a raw idea.
>
>
> On Sat, Nov 13, 2021 at 7:13 AM Strainu  wrote:
>>
>> Hi folks,
>>
>> I'd like to bring this thread back to life, since in the past months
>> *a lot* of deprecated code has been removed, some of which was quite
>> "recent". I totally understand that projects move forward, and so does
>> Python; I totally understand the developers' wish to use new language
>> features; and I don't believe we should keep deprecated code for
>> decades, as often happens in wikis. However, given that the vast
>> majority of Pywikibot users are busy volunteers and that many projects
>> depend on (mostly unsupervised) pywikibot code for critical
>> maintenance work, I believe we need some kind of predictability around
>> deprecation.
>>
>> As a user, I would like to understand:
>> 1. when and how can a function become deprecated
>> 2. when and how can a parameter become deprecated
>> 3. how long will I still be able to use a deprecated function or
parameter
>> 4. what Python versions will Pywikibot support.
>>
>> For 3, I propose to maintain compatibility for *at least* 3 years.
>> This roughly matches the Debian lifecycle, as the longest-maintained
>> non-LTS release of major Linux distributions.
>>
>> For 4, I propose to support a Python version (e.g. 3.5) until all of
>> the following are true (that is, the longest period between them):
>>  - two years after official support has ended (e.g. 7 years after
launch) AND
>>  - one year after Toolforge moved to a newer Python version. AND
>>  -

[pywikibot] Re: Deprecation policy?

2021-11-13 Thread Strainu
Hi folks,

I'd like to bring this thread back to life, since in the past months
*a lot* of deprecated code has been removed, some of which was quite
"recent". I totally understand that projects move forward, and so does
Python; I totally understand the developers' wish to use new language
features; and I don't believe we should keep deprecated code for
decades, as often happens in wikis. However, given that the vast
majority of Pywikibot users are busy volunteers and that many projects
depend on (mostly unsupervised) pywikibot code for critical
maintenance work, I believe we need some kind of predictability around
deprecation.

As a user, I would like to understand:
1. when and how can a function become deprecated
2. when and how can a parameter become deprecated
3. how long will I still be able to use a deprecated function or parameter
4. what Python versions will Pywikibot support.

For 3, I propose to maintain compatibility for *at least* 3 years.
This roughly matches the Debian lifecycle, as the longest-maintained
non-LTS release of major Linux distributions.

For 4, I propose to support a Python version (e.g. 3.5) until all of
the following are true (that is, the longest period between them; see
the sketch below):
 - two years after official support has ended (e.g. 7 years after launch) AND
 - one year after Toolforge moved to a newer Python version. AND
 - the percentage of users of a version goes under 5%

This should allow both Toolforge and independent users ample time to
update their code without surprises, even when using LTS releases.
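
A tiny sketch of how the three conditions combine; the dates below are
illustrative placeholders, with the Python 3.5 end of life in September 2020
taken from this thread:

    from datetime import date

    def earliest_drop(eol, toolforge_moved, usage_below_5pct):
        # A version can be dropped only once ALL three conditions hold,
        # i.e. at the latest of the three dates.
        two_years_after_eol = date(eol.year + 2, eol.month, eol.day)
        one_year_after_toolforge = date(toolforge_moved.year + 1,
                                        toolforge_moved.month,
                                        toolforge_moved.day)
        return max(two_years_after_eol, one_year_after_toolforge,
                   usage_below_5pct)

    # e.g. Python 3.5: EOL September 2020, with made-up dates for the
    # Toolforge migration and for usage dropping under 5%.
    print(earliest_drop(date(2020, 9, 13), date(2022, 6, 1), date(2022, 9, 1)))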

What do you think? I would love to see feedback from both developers
and users on these questions and possible answers. Even if you don't
agree with these proposals, please make your own, so that we can
hopefully agree on some rules - any deadlines would be better than
none.

Regards,
   Strainu

On Fri, 30 Apr 2021 at 00:33, Kunal Mehta  wrote:
>
> Hi Damian,
>
> On 4/21/21 2:32 PM, Damian Johnson wrote:
> > What is pywikibot's policy regarding code deprecation? Can we remove
> > it after a set duration and, if so, what is it?
>
> I'm not aware of Pywikibot having such a policy, but I think it would be
> a good idea to have one. MediaWiki has a stable interface policy[1]
> which defines what parts are stable to build on top of and which are
> considered internal and then a process on how to deprecate and make
> changes to what's supposed to be stable.
>
> One of the things I worked on for MediaWiki's deprecation process is
> developing codesearch[2] which makes it pretty straightforward for
> developers to see what methods/functions are practically being used and
> see what use cases are. I think something like that would be valuable
> for Pywikibot as well, but code for most bots/scripts is really all over
> the place. Something like Toolhub[3] would help with this too.
>
> [1] https://www.mediawiki.org/wiki/Stable_interface_policy
> [2] https://codesearch.wmcloud.org/search/
> [3] https://meta.wikimedia.org/wiki/Toolhub
>
> HTH,
> -- Legoktm
>
> ___
> pywikibot mailing list
> pywikibot@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikibot
___
pywikibot mailing list -- pywikibot@lists.wikimedia.org
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-06 Thread Strainu
Strainu added a comment.


  Indeed, removing `config.step` from user-config.py solved the problem. I 
don't remember why I added that; maybe it's inherited from old versions (I 
have had that config file all the way back since the compat branch :D )

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-06 Thread Strainu
Strainu added a comment.


>>> page.isRedirectPage()
>>> _handle_query_limit
None None True
1000 None
{'_count': 0,
 '_previous_dicts': {},
 '_props': frozenset({'info'}),
 'api_limit': 1000,
 'continue_name': 'continue',
 'continue_update': >,
 'continuekey': ['info'],
 'limit': None,
 'limited_module': None,
 'modules': ['info'],
 'query_limit': 1000,
 'request': 
pywikibot.data.api.Request'/w/api.php?continue==query=info=Cod:SIRUTA:136=protection='>,
 'request_class': ,
 'resultkey': 'pages',
 'site': APISite("ro", "wikipedia")}
<<<

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-04 Thread Strainu
Strainu added a comment.


$ git status
On branch master
Your branch is up-to-date with 'origin/master'.
Untracked files:
  (use "git add ..." to include in what will be committed)

[...]

nothing added to commit but untracked files present (use "git add" to track)

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-04 Thread Strainu
Strainu added a comment.


$ python3 pwb.py version
Pywikibot: [https] r-pywikibot-core (758de7a, g15553, 2021/11/03, 12:54:56, 
OUTDATED)
Release version: 7.0.0.dev0
setuptools version: 33.1.1
mwparserfromhell version: 0.6.3
wikitextparser version: n/a
requests version: 2.25.1
  cacerts: 
/home/andrei/.local/lib/python3.5/site-packages/certifi/cacert.pem
certificate test: ok
Python: 3.5.3 (default, Apr  5 2021, 09:00:41)
[GCC 6.3.0 20170516]
PYWIKIBOT_DIR: Not set
PYWIKIBOT_DIR_PWB:
PYWIKIBOT_NO_USER_CONFIG: Not set
Config base dir: /home/andrei/pywikibot-core
Usernames for family "commons":
commons: Strainubot
Usernames for family "wikidata":
wikidata: Strainubot
Usernames for family "wikipedia":
ro: Strainubot

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Updated] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-03 Thread Strainu
Strainu added a comment.


  Thank you for your suggestions @Xqt and @matej_suchanek. Unfortunately I am 
still reproducing the issue after deleting all __pycache__ and apicache-py3 
folders, using the simplified example (see below). Have you tried with the same 
commit (28e31f98dcc2b152fde16385172a49f721394ed3 
<https://phabricator.wikimedia.org/rPWBC28e31f98dcc2b152fde16385172a49f721394ed3>)?
  
>>> import pywikibot
>>> site = pywikibot.Site('ro')
>>> page = pywikibot.Page(site, 'Cod:SIRUTA:136')
>>> page.exists()
Traceback (most recent call last):
  File "", line 1, in 
  File "/home/andrei/pywikibot-core/pywikibot/page/__init__.py", line 717, 
in exists
raise InvalidPageError(self)
pywikibot.exceptions.InvalidPageError: Page [[ro:Cod:SIRUTA:136]] is 
invalid.
>>> page.isRedirectPage()
Traceback (most recent call last):
  File "", line 1, in 
  File "/home/andrei/pywikibot-core/pywikibot/page/__init__.py", line 730, 
in isRedirectPage
return self.site.page_isredirect(self)
  File "/home/andrei/pywikibot-core/pywikibot/site/_apisite.py", line 1194, 
in page_isredirect
self.loadpageinfo(page)
  File "/home/andrei/pywikibot-core/pywikibot/site/_apisite.py", line 1107, 
in loadpageinfo
self._update_page(page, query)
  File "/home/andrei/pywikibot-core/pywikibot/site/_apisite.py", line 1081, 
in _update_page
for pageitem in query:
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2740, in 
__iter__
yield from super().__iter__()
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2582, in 
__iter__
prev_limit, new_limit, previous_result_had_data)
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2507, in 
_handle_query_limit
self.request[self.prefix + 'limit'] = str(new_limit)
AttributeError: 'PropertyGenerator' object has no attribute 'prefix'

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-02 Thread Strainu
Strainu added a comment.


  Pretty much any page, I've encountered it several times. Just checked for 
[[ro:Cod:SIRUTA:136]], it reproduces

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: matej_suchanek, Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, 
JohnsonLee01, SHEKH, Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, 
Mdupont, JJMC89, Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, 
Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Edited] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-02 Thread Strainu
Strainu updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, JohnsonLee01, SHEKH, 
Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, Mdupont, JJMC89, 
Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Commented On] T293820: UploadError is not thrown to the client code when some warnings are ignored

2021-11-02 Thread Strainu
Strainu added a comment.


  I agree, it would have been preferable if the change in semantics came 
through a rename of parameters instead.

TASK DETAIL
  https://phabricator.wikimedia.org/T293820

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Xqt, Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, JohnsonLee01, SHEKH, 
Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, Mdupont, JJMC89, 
Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Pywikipedia-bugs] [Maniphest] [Created] T294836: Calls to pywikibot.Page.exists() throw an InvalidPageError

2021-11-02 Thread Strainu
Strainu created this task.
Strainu added a project: Pywikibot.
Restricted Application added subscribers: pywikibot-bugs-list, Aklapper.

TASK DESCRIPTION
  **Sample code** (simplified):
  
user.mylang = 'ro'
user.family = 'wikipedia'
site = pywikibot.Site()
page = pywikibot.Page(site, result['page_title'])
pywikibot.output(page.title())
if page.exists() and page.isRedirectPage():
page = page.getRedirectTarget()
elif not page.exists():
pass # DO stuff
  
  **Expected outcome:** get the redirect target or handle non-existent pages
  
  **Actual outcome:**
  
Traceback (most recent call last):
  File "pwb.py", line 420, in 
if not main():
  File "pwb.py", line 415, in main
module)
  File "pwb.py", line 113, in run_python_file
main_mod.__dict__)
  File "./wikiro/robots/python/pywikipedia/localitati/create_shortcuts.py", 
line 80, in <module>
main()
  File "./wikiro/robots/python/pywikipedia/localitati/create_shortcuts.py", 
line 64, in main
if page.exists() and page.isRedirectPage():
  File "/home/andrei/pywikibot-core/pywikibot/page/__init__.py", line 717, 
in exists
raise InvalidPageError(self)
  
  Trying to debug, we add a print:
  
$ git diff pywikibot/page/__init__.py
diff --git a/pywikibot/page/__init__.py b/pywikibot/page/__init__.py
index ee06a0203..fa52faec5 100644
--- a/pywikibot/page/__init__.py
+++ b/pywikibot/page/__init__.py
@@ -714,6 +714,7 @@ class BasePage(ComparableMixin):
 """
 with suppress(AttributeError):
 return self.pageid > 0
+print(self.pageid)
 raise InvalidPageError(self)

 @property
  
  Sure enough, we had an AttributeError:
  
Traceback (most recent call last):
  File "pwb.py", line 420, in 
if not main():
  File "pwb.py", line 415, in main
module)
  File "pwb.py", line 113, in run_python_file
main_mod.__dict__)
  File "./wikiro/robots/python/pywikipedia/localitati/create_shortcuts.py", 
line 80, in <module>
main()
  File "./wikiro/robots/python/pywikipedia/localitati/create_shortcuts.py", 
line 64, in main
if source_page.exists():
  File "/home/andrei/pywikibot-core/pywikibot/page/__init__.py", line 717, 
in exists
print(self.pageid)
  File "/home/andrei/pywikibot-core/pywikibot/page/__init__.py", line 261, 
in pageid
self.site.loadpageinfo(self)
  File "/home/andrei/pywikibot-core/pywikibot/site/_apisite.py", line 1107, 
in loadpageinfo
self._update_page(page, query)
  File "/home/andrei/pywikibot-core/pywikibot/site/_apisite.py", line 1081, 
in _update_page
for pageitem in query:
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2740, in 
__iter__
yield from super().__iter__()
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2582, in 
__iter__
prev_limit, new_limit, previous_result_had_data)
  File "/home/andrei/pywikibot-core/pywikibot/data/api.py", line 2507, in 
_handle_query_limit
self.request[self.prefix + 'limit'] = str(new_limit)
AttributeError: 'PropertyGenerator' object has no attribute 'prefix'
CRITICAL: Exiting due to uncaught exception 
  
  **Pywikibot version**:
  
=== Pywikibot framework v7.0.0.dev0 -- Logging header ===
COMMAND: ['pwb.py', '-v']
DATE: 2021-11-02 13:35:29.761891 UTC
VERSION: [https] r-pywikibot-core (28e31f9, g15398, 2021/09/21, 11:52:45, n/a)
SYSTEM: 
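
  Until a fix lands, a minimal client-side guard (my sketch, not part of the 
report above) is to treat pages whose info cannot be loaded as unknown instead 
of letting the whole run crash. I am assuming InvalidPageError can be imported 
from pywikibot.exceptions, as the traceback suggests:

import pywikibot
# Assumption: the exception class lives in pywikibot.exceptions.
from pywikibot.exceptions import InvalidPageError

site = pywikibot.Site('ro', 'wikipedia')
page = pywikibot.Page(site, 'Some page title')  # hypothetical title

try:
    page_exists = page.exists()
except InvalidPageError:
    # Page info could not be loaded at all; skip instead of crashing.
    page_exists = False

if page_exists and page.isRedirectPage():
    page = page.getRedirectTarget()
elif not page_exists:
    pass  # handle the missing page here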

TASK DETAIL
  https://phabricator.wikimedia.org/T294836

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, JohnsonLee01, SHEKH, 
Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, Mdupont, JJMC89, 
Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Wikitech-l] Re: How do I make VE *not* save/recover my changes?

2021-10-27 Thread Strainu
Thanks David. I just tried this and it only seems to work if I go back to
the article via the tabs; no other tab or link works. I've logged
https://phabricator.wikimedia.org/T294463 so the team can check if this is
intended or not.

Have a good day,
Strainu

Pe miercuri, 27 octombrie 2021, David Lynch  a scris:
> Leave the editing mode "cleanly" -- saving, navigating back to the
article via the tabs, following a sidebar link to another page, hitting
escape, whatever. If you stop editing in a way that we can tell is
intentional, you'll be asked if you want to discard your changes, and if
you say that you do then it'll all get cleaned up. (We can't tell the
difference between "I deliberately closed this tab because I want to get
rid of this" and "I accidentally closed the wrong tab and I'll be very
upset if my changes are lost", unfortunately...)
> If you want it to never autosave your changes, you could disable local
session storage for the wikis you use at the browser level. But that might
have side-effects outside of VE.
> We don't have any preferences that'd control the autosave, and I don't
think that we have any current plans to implement something like that.
> ~David
> On Wed, Oct 27, 2021 at 11:14 AM Strainu  wrote:
>>
>> Hi,
>>
>> I've been searching quite a bit on MediaWiki.org but I can't find how
>> to tell the VisualEditor to stop saving and (especially) recovering
>> changes that I haven't explicitly saved. Is there a method?
>>
>> Thanks,
>> Strainu
>> ___
>> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
>> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] How do I make VE *not* save/recover my changes?

2021-10-27 Thread Strainu
Hi,

I've been searching quite a bit on MediaWiki.org but I can't find how
to tell the VisualEditor to stop saving and (especially) recovering
changes that I haven't explicitly saved. Is there a method?

Thanks,
Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


[Wikimedia-l] Re: Dynamic content on Wikipedia (was: An Uzbek praktical joke and Wikimedia Enterprise)

2021-10-26 Thread Strainu
Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
> I think that the WMF has a whole departament devoted to product.

They still need to balance between the requests of millions of readers, 70K
users and tens of affiliates.


> This is not about someone trying to get some money, is about University
professors that have asked directly how they can upload their dynamic
content to wikipedia and they didn't have a way.

What format/software/technologies are they using? How are they publishing
on the web? Is there a free alternative?

How often do they update their visualizations? How many articles would
benefit from such a feature? Can a mathematician or a biologist reuse the
same software? Can a template editor edit such a visualization?

These are all details of a simple question: are you sure this can be done
within the constraints of the movement?

Also keep in mind that "Dynamic content" is a huge umbrella. Other parts of
the puzzle are far more advanced, clearly more impactful and still not
getting enough love.

Strainu


How can a regular Physicist in Uzbekistan upload it without knowing someone
who knows someone who could fill a grant proposal? Is like asking a grant
proposal to be able to upload a video!
>
>
>
> 2021(e)ko urr. 26(a) 20:13 erabiltzaileak hau idatzi du (Strainu <
strain...@gmail.com>):
>
> Changing subject, this is no joke.
>
> Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
>> "Don't assume that just because you can't do something it's impossible
or even particularly hard. "
>> I don't assume it, just we can't do it:
https://phabricator.wikimedia.org/T169027 or
https://phabricator.wikimedia.org/T238259
>
> As a matter of fact, these 2 tickets say it's totally possible to have
dynamic content on Wikipedia, just not by an average user. They even have
examples of dynamic content.
>
> What I don't see there (I just skimmed the content though) is a list of
requirements for what we want to achieve. Are there big classes of similar
visualizations that could be done with simple customizations that a
semi-technical person (think:excel user) could do?
>
> You could start from there and have someone write a project grant
proposal for such a project.
>
> Strainu
>>
>>
>> If you know a way to do this kind of interactive content in any given
wiki, we could go forward fast.
>> 
>> From: Strainu 
>> Sent: Tuesday, October 26, 2021 7:45 PM
>> To: Wikimedia Mailing List 
>> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>>
>>
>> Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
>>> Anders: we can't add a physics simulator.
>>
>> We totally can. It takes programming knowledge and a technical
administrator, but it's possible.
>>
>> Don't assume that just because you can't do something it's impossible or
even particularly hard. What's nearly impossible is to scale such
initiatives in a meaningful manner (I.e. over 250+ languages).
>>
>> Strainu
>>
>> (https://www.physicsclassroom.com/Physics-Interactives/Newtons-Laws).
This is not "info wars", this is being useful. And we can't do it
because... well, because we are... obsolete.
>>>
>>> 
>>> From: Anders Wennersten 
>>> Sent: Tuesday, October 26, 2021 6:02 PM
>>> To: wikimedia-l@lists.wikimedia.org 
>>> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>>>
>>>
>>> We have an army of volunteers to guarantee correctness and that issues
of controversies are dealt with in a way that hopefully all parties can
accept
>>>
>>> we have no cookies or technical things that make us follow up on our
editors, truly believing in the full integrity of our users
>>>
>>> our financial and governing set up is fully independent of an third
party
>>>
>>> Our reading interface works well for our users and on most platforms
(which is made easier with no technical smarties)
>>>
>>> Our interface can be made better for editors, but this does not make it
as a phenomenon obsolete
>>>
>>> In the info war we are in, it is beer to be on the "boring" side with
few or none smart gadgets then being too smart and open for foul play by
parties that want to undermine our system by clever hackers
>>>
>>> Anders
>>>
>>> Den 2021-10-26 kl. 17:37, skrev Galder Gonzalez Larrañaga:
>>>
>>> Thanks Anders,
>>> "We are the opposite to obsolete" is a good sentence, because

[Wikimedia-l] Dynamic content on Wikipedia (was: An Uzbek praktical joke and Wikimedia Enterprise)

2021-10-26 Thread Strainu
Changing subject, this is no joke.

Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
> "Don't assume that just because you can't do something it's impossible or
even particularly hard. "
> I don't assume it, just we can't do it:
https://phabricator.wikimedia.org/T169027 or
https://phabricator.wikimedia.org/T238259

As a matter of fact, these 2 tickets say it's totally possible to have
dynamic content on Wikipedia, just not by an average user. They even have
examples of dynamic content.

What I don't see there (I just skimmed the content though) is a list of
requirements for what we want to achieve. Are there big classes of similar
visualizations that could be done with simple customizations that a
semi-technical person (think:excel user) could do?

You could start from there and have someone write a project grant proposal
for such a project.

Strainu
>
>
> If you know a way to do this kind of interactive content in any given
wiki, we could go forward fast.
> ________
> From: Strainu 
> Sent: Tuesday, October 26, 2021 7:45 PM
> To: Wikimedia Mailing List 
> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>
>
> Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
>> Anders: we can't add a physics simulator.
>
> We totally can. It takes programming knowledge and a technical
administrator, but it's possible.
>
> Don't assume that just because you can't do something it's impossible or
even particularly hard. What's nearly impossible is to scale such
initiatives in a meaningful manner (I.e. over 250+ languages).
>
> Strainu
>
> (https://www.physicsclassroom.com/Physics-Interactives/Newtons-Laws).
This is not "info wars", this is being useful. And we can't do it
because... well, because we are... obsolete.
>>
>> 
>> From: Anders Wennersten 
>> Sent: Tuesday, October 26, 2021 6:02 PM
>> To: wikimedia-l@lists.wikimedia.org 
>> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>>
>>
>> We have an army of volunteers to guarantee correctness and that issues
of controversies are dealt with in a way that hopefully all parties can
accept
>>
>> we have no cookies or technical things that make us follow up on our
editors, truly believing in the full integrity of our users
>>
>> our financial and governing set up is fully independent of an third party
>>
>> Our reading interface works well for our users and on most platforms
(which is made easier with no technical smarties)
>>
>> Our interface can be made better for editors, but this does not make it
as a phenomenon obsolete
>>
>> In the info war we are in, it is beer to be on the "boring" side with
few or none smart gadgets then being too smart and open for foul play by
parties that want to undermine our system by clever hackers
>>
>> Anders
>>
>> Den 2021-10-26 kl. 17:37, skrev Galder Gonzalez Larrañaga:
>>
>> Thanks Anders,
>> "We are the opposite to obsolete" is a good sentence, because this would
imply that our platform is the bow of an icebreaker. But we still, in 2021,
can't do this things (you can help by expanding this list):
>>
>> Simultaneous edition
>> Auto-save in sandbox
>> Publishing from sandbox
>> Upload MP4 files
>> Render correctly vectorial files
>> Embed our own Wikidata query results in our own projects
>> Have a modern look
>> Have cross-project templates and modules
>> Visual edit from mobile
>> Create visually interesting cartography
>> Hear the articles
>> Export multiple articles as a pdf/doc (whatever)
>> ...
>> 
>>
>> Someone will answer to this message talking about the "Wishlist survey"
every year we have. This scarcity generating system also gives funny
outcomes. Let's take the 2019 survey. 10 projects were voted. Only 4 done:
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Results. Or
the 2017 one:
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2017/Results.
Some projects where done, some not and there are some that are external
tools that you have to use as a gadget.
>>
>> Students are relying on YouTube to learn things. We are obsolete. Very
obsolete.
>> Galder
>>
>>
>> 
>> From: Anders Wennersten 
>> Sent: Tuesday, October 26, 2021 5:23 PM
>> To: wikimedia-l@lists.wikimedia.org 
>> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>>
>> "We will have more and more and more millions, but we will still...
&

[Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia Enterprise

2021-10-26 Thread Strainu
Pe marți, 26 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
> Anders: we can't add a physics simulator.

We totally can. It takes programming knowledge and a technical
administrator, but it's possible.

Don't assume that just because you can't do something it's impossible or
even particularly hard. What's nearly impossible is to scale such
initiatives in a meaningful manner (I.e. over 250+ languages).

Strainu

(https://www.physicsclassroom.com/Physics-Interactives/Newtons-Laws). This
is not "info wars", this is being useful. And we can't do it because...
well, because we are... obsolete.
>
> 
> From: Anders Wennersten 
> Sent: Tuesday, October 26, 2021 6:02 PM
> To: wikimedia-l@lists.wikimedia.org 
> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>
>
> We have an army of volunteers to guarantee correctness and that issues of
controversies are dealt with in a way that hopefully all parties can accept
>
> we have no cookies or technical things that make us follow up on our
editors, truly believing in the full integrity of our users
>
> our financial and governing set up is fully independent of an third party
>
> Our reading interface works well for our users and on most platforms
(which is made easier with no technical smarties)
>
> Our interface can be made better for editors, but this does not make it
as a phenomenon obsolete
>
> In the info war we are in, it is beer to be on the "boring" side with few
or none smart gadgets then being too smart and open for foul play by
parties that want to undermine our system by clever hackers
>
> Anders
>
> Den 2021-10-26 kl. 17:37, skrev Galder Gonzalez Larrañaga:
>
> Thanks Anders,
> "We are the opposite to obsolete" is a good sentence, because this would
imply that our platform is the bow of an icebreaker. But we still, in 2021,
can't do this things (you can help by expanding this list):
>
> Simultaneous edition
> Auto-save in sandbox
> Publishing from sandbox
> Upload MP4 files
> Render correctly vectorial files
> Embed our own Wikidata query results in our own projects
> Have a modern look
> Have cross-project templates and modules
> Visual edit from mobile
> Create visually interesting cartography
> Hear the articles
> Export multiple articles as a pdf/doc (whatever)
> ...
> 
>
> Someone will answer to this message talking about the "Wishlist survey"
every year we have. This scarcity generating system also gives funny
outcomes. Let's take the 2019 survey. 10 projects were voted. Only 4 done:
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Results. Or
the 2017 one:
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2017/Results.
Some projects where done, some not and there are some that are external
tools that you have to use as a gadget.
>
> Students are relying on YouTube to learn things. We are obsolete. Very
obsolete.
> Galder
>
>
> 
> From: Anders Wennersten 
> Sent: Tuesday, October 26, 2021 5:23 PM
> To: wikimedia-l@lists.wikimedia.org 
> Subject: [Wikimedia-l] Re: An Uzbek praktical joke and Wikimedia
Enterprise
>
> "We will have more and more and more millions, but we will still...
> yes... obsolete." Galder
>
>
> What phenomenon do you see challenge Wikipedias role as a source for
> common knowledge, an encyklopedia for everyone?
>
> I see that for the last 20 years no successful commercial encyclopedia
> has been launched.
>
> I see how the social media have a hard time to be a platform for common
> knowledge and hard pressed to employ armies of moderators. And Google
> very happy to lean and steal from Wikipedia rather the do something
> similar themself (which would go down badly in the public)
>
> But the war of information is a reality and heating up. We can be very
> glad that so far we have not been a target of all angriness of what is
> to be seen as the "correct" information. But that could change, what if
> a new administration in US want to control what is written in Wikipedia.
> Or China want to set up a parallel in English as the have now in
> Chinese. If these thing happen we need to have resources to fight off
> these type a of challenges, not only for our own sake but for he people
> in the world who is used to turn to Wikipedia for basic facts.
>
> We are the opposite to obsolete, we are in the front seat and driving
> for correct facts in the emerging information war we now see
>
> Anders
> ___
> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines
at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines a

[Wikimedia-l] Re: What happened to the cc-by-sa-4.0 initiative?

2021-10-20 Thread Strainu
Thank you Andreas, that was exactly what I was looking for. Everybody
seems to agree there is more and more CC4 content out there, but
apparently not enough to justify the investment. Hopefully the
foundation will be able to provide more details and maybe a roadmap
for the following fiscal year.

Strainu

P.S. I just hope that we're not pushing technical debt along with this project.

În mie., 20 oct. 2021 la 17:26, Andreas Kolbe  a scris:
>
> The question about CC 4.0 was just answered in the "Conversation with the 
> Wikimedia Foundation Board of Trustees". This link will take you to the right 
> place:
>
> https://youtu.be/Zpof5J6jjZ4?t=3738
>
> Database rights were mentioned in the answers, along with challenging, 
> unforeseen technical requirements that would require more money, more effort 
> and more technical development work to address than was originally 
> anticipated. As a result, moving to 4.0 is not something the Foundation can 
> commit to doing right now.
>
> The question came up because the UN is apparently close to adopting CC 4.0 
> for its content.
>
> Andreas
>
> On Thu, Sep 30, 2021 at 9:01 AM Andreas Kolbe  wrote:
>>
>> See also 
>> https://meta.wikimedia.org/wiki/Terms_of_use/Creative_Commons_4.0/Diff
>>
>> The proposed waiver of database rights that was to accompany the move from 
>> 3.0 to 4.0 was one of the sticking points, I believe. To quote:
>>
>> Where you own Sui Generis Database Rights covered by CC BY-SA 4.0, you waive 
>> these rights. As an example, this means facts you contribute to the projects 
>> may be reused freely without attribution.
>>
>> For further background see
>>
>> https://meta.wikimedia.org/wiki/Terms_of_use/Creative_Commons_4.0/Legal_note#Waiving_database_rights
>>
>> Andreas
>>
>> On Thursday, September 30, 2021, Strainu  wrote:
>>>
>>> Hi Isaac,
>>>
>>> See https://meta.m.wikimedia.org/wiki/Terms_of_use/Creative_Commons_4.0
>>>
>>> Strainu
>>>
>>> Pe joi, 30 septembrie 2021, Isaac Olatunde  a 
>>> scris:
>>> > Hi Strainu,
>>> > I can't find the previous discussions.Could you please provide a link to 
>>> > the public consultations (or proposal) you mentioned to allow people on 
>>> > this list have a clear understanding of what was discussed?
>>> > Best regards
>>> > Isaac
>>> > On Wed, 29 Sep 2021, 23:22 Strainu,  wrote:
>>> >>
>>> >> Hi,
>>> >>
>>> >> A few years ago there was a public consultation on moving the Wikimedia 
>>> >> license to cc-by-sa-4.0 instead of 3.0. Now, obviously that never 
>>> >> happened but I couldn't find the decision documented anywhere. Why was 
>>> >> the proposal scrapped? Are there any plans to revisit this?
>>> >>
>>> >> Steainu ___
>>> >> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines 
>>> >> at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
>>> >> https://meta.wikimedia.org/wiki/Wikimedia-l
>>> >> Public archives at 
>>> >> https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/TF6DDMFO23Y3I4LJFITYYMOPHIIMWJAG/
>>> >> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
>
> ___
> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> Public archives at 
> https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/YAMJGXHYK67Y2GKF6WOFAYXFU7BEZ4KP/
> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/SCOM2YLTOCG7NY5QPQVLF2T3UA4HMGTR/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Maps-l] Any way to get contour data from wiki pages?

2021-10-20 Thread Strainu
Hi,

When opening some maps full screen, in addition to the pushpin, the
map also shows a contour for the item, even though the Wikidata item
does not have any associated geo data. I'm guessing that the data
comes from OSM through their Wikidata/Wikipedia tags. Example:
https://www.wikidata.org/wiki/Q4684917 (check the Romanian article for
a mapframe).

Is there a way to obtain that relation data into a Lua module? I'm
trying to determine the optimum zoom for an item and having its bbox
would help greatly.

Thank you,
   Strainu
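
One way to experiment with this outside Lua (a rough sketch added here; the
geoshape endpoint and its parameters are an assumption about what Kartographer's
mapframe uses and should be double-checked) is to fetch the GeoJSON for the item
and compute its bounding box:

# Requires the third-party "requests" package.
import requests

GEOSHAPE_URL = "https://maps.wikimedia.org/geoshape"  # assumed endpoint


def bbox_for_item(qid):
    """Return (min_lon, min_lat, max_lon, max_lat) for a Wikidata item, or None."""
    resp = requests.get(GEOSHAPE_URL,
                        params={"getgeojson": 1, "ids": qid},  # assumed params
                        timeout=30)
    resp.raise_for_status()
    lons, lats = [], []

    def walk(coords):
        # GeoJSON nests coordinate arrays; recurse down to [lon, lat] pairs.
        if coords and isinstance(coords[0], (int, float)):
            lons.append(coords[0])
            lats.append(coords[1])
        else:
            for part in coords:
                walk(part)

    for feature in resp.json().get("features", []):
        walk(feature.get("geometry", {}).get("coordinates", []))
    return (min(lons), min(lats), max(lons), max(lats)) if lons else None


print(bbox_for_item("Q4684917"))

The resulting bbox could then be compared against a few mapframe zoom levels by
hand to see which one fits best.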
___
Maps-l mailing list -- maps-l@lists.wikimedia.org
To unsubscribe send an email to maps-l-le...@lists.wikimedia.org


[Wikimedia-l] Re: Wikimedia Foundation Board of Trustees new resolution on branding

2021-10-20 Thread Strainu
So sorry to see yet another good idea go to waste. Hopefully the new
personalized approach will help some of us to fill the gap in notoriety
between Wikipedia and Wikimedia.

Strainu

Pe joi, 14 octombrie 2021, Shani Evenstein  a scris:
> Dear all,
>
> I am happy to share with you that the Wikimedia Foundation Board of
Trustees has passed a new resolution on the topic of branding [1].
>
> Some context
>
> As you may remember, last year the Board paused all work under the 2030
Movement Brand Project, in order to rethink and improve the Foundation’s
approach to community participation and decision making around renaming.
After year-long work, attentive listening and thoughtful conversation
between the Board, Wikimedia Foundation staff, and community advisors, the
committee has come up with a recommendation for next steps. The
recommendation was unanimously approved by the Board and captured in the
above mentioned resolution. The Wikimedia Foundation will therefore be
resuming its role to steward and protect Wikimedia brands, in partnership
with our broader movement, and the ad hoc Brand Committee concludes its
work .
>
> What are the main aspects of the resolution?
>
> Importantly, this resolution extends the Board’s decision that the
Wikimedia Foundation should not pursue renaming work for this fiscal year
(until at least July 2022). Instead, it directs the Foundation to support
the Wikimedia movement through three main areas of brand work that protect
and support Wikimedia’s reputation throughout the world. Please read more
about this decision on the Diff Blog [2].
>
> Next steps?
>
> Wikimedia Foundation teams intend to share more information on new
projects, including their plans for engaging our community, in the coming
weeks. In the meantime, Foundation staff and I are available to answer
clarifying questions on the Wikimedia brand / 2030 movement brand project
talk page on Meta [3]. You are also welcome to join the Board’s Open
Meeting on October 20th, where you will be able to ask questions and hear
from the team directly [4].
>
> Special thanks
>
> On behalf of the Board, I would like to thank the community advisors to
the Brand Committee. This group has worked with us since February 2021,
lending their time and expertise. Their input to the process has been
invaluable and we appreciate their commitment to help us find a productive
way forward. Thank you -- Lucy Crompton-Reid, Joao Alexandre Peschanski,
Megan Wacha, Justice Okai-Allotey, Rachmat Wahidi, Erlan Vega Rios, Richard
Knipel, Phoebe Ayers and Jeffrey Keefer!
>
> I would also like to thank our Brand Studio team at the Wikimedia
Foundation for their hard work, dedication, professionalism, flexibility,
openness, and vision they brought to our joint work on the future of
branding.
>
> Together, we made sure that the next steps for brand work are closely
connected to our 2030 strategic goals and we have no doubt they will be an
important service to the Wikimedia movement. I look forward to watching
these plans come to life and invite the community to actively participate
in these discussions and decisions as they unfold.
>
> Sincerely,
>
> Shani Evenstein Sigalov
>
> Chair, Brand Committee
>
> Board of Trustees, Wikimedia Foundation
>
> [1]:
>
>
https://foundation.wikimedia.org/wiki/Resolution:Next_Steps_for_Brand_Work,_2021
>
> [2]:
>
>
https://diff.wikimedia.org/2021/10/14/wikimedia-foundation-board-of-trustees-new-resolution-on-branding/
>
> [3]:
>
>
https://meta.wikimedia.org/wiki/Talk:Communications/Wikimedia_brands/2030_movement_brand_project
>
> [4]:
>
>
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Community_Affairs_Committee/2021-10-20_Conversation_with_Trustees
>
>
>
>
>
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/72M7H3RL4V2MPQG7YD7NH4NKGG7KMB5K/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Pywikipedia-bugs] [Maniphest] [Created] T293820: UploadError is not thrown to the client code when some warnings are ignored

2021-10-19 Thread Strainu
Strainu created this task.
Strainu added a project: Pywikibot.
Restricted Application added subscribers: pywikibot-bugs-list, Aklapper.

TASK DESCRIPTION
  Try to upload a duplicate of 
[[:ro:File:Fotoliu_(Artă_decorativă)_2784_11.10.2017_Fond_8764B28DD6F748CC984A8365A6CA66CF.jpg]],
 using the "Foto" prefix for the duplicate file, so it triggers the bad-prefix 
warning. In order for the upload to happen, I'm ignoring the bad-prefix 
warning, as shown by the (simplified) code below:
  
try:
    success = imagepage.upload("/path/to/file",
                               ignore_warnings=['bad-prefix'],
                               chunk_size=0,
                               _file_key=None, _offset=0,
                               comment="Imagine Cimec nouă")
except pywikibot.exceptions.APIError as error:
    if error.code == 'bad-prefix':
        pywikibot.error('Upload error: Bad-prefix')
    elif error.code == 'duplicate':
        pywikibot.error('Upload error: Duplicate')
    return
except Exception:
    pywikibot.error('Upload error: ', exc_info=True)
    return

if success:
    pywikibot.output('Success')
else:
    pywikibot.output('Upload aborted.')
  
  The actual output is:
  
[UploadError("duplicate", "Uploaded file is a duplicate of 
['Fotoliu_(Artă_decorativă)_2784_11.10.2017_Fond_8764B28DD6F748CC984A8365A6CA66CF.jpg'].",
 {}), UploadError("bad-prefix", "Target filename has a bad prefix 
Fotoliu_(Artă_decorativă)_2784_11.10.2017_Fond_A8C631B7BB4A4D8CB3DD2DE4B52A21B0.jpg.",
 {})]
Upload aborted.
  
  Expected output would be to have the unignored error thrown to the user code:
  
[UploadError("duplicate", "Uploaded file is a duplicate of 
['Fotoliu_(Artă_decorativă)_2784_11.10.2017_Fond_8764B28DD6F748CC984A8365A6CA66CF.jpg'].",
 {}), UploadError("bad-prefix", "Target filename has a bad prefix 
Fotoliu_(Artă_decorativă)_2784_11.10.2017_Fond_A8C631B7BB4A4D8CB3DD2DE4B52A21B0.jpg.",
 {})]
Upload error: Duplicate

TASK DETAIL
  https://phabricator.wikimedia.org/T293820

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, pywikibot-bugs-list, Strainu, Jyoo1011, JohnsonLee01, SHEKH, 
Dijkstra, Khutuck, Zkhalido, Viztor, Wenyi, Tbscho, MayS, Mdupont, JJMC89, 
Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
___
pywikibot-bugs mailing list -- pywikibot-bugs@lists.wikimedia.org
To unsubscribe send an email to pywikibot-bugs-le...@lists.wikimedia.org


[Wikimedia-l] Re: 100$ million dollars and still obsolete

2021-10-16 Thread Strainu
Galder,

I want to start by saying that I totally understand your frustration. I
have encountered the Great circle of excuses many times, and especially no. 6
was used to break many good discussions when they became difficult for the
staff. There was a time when I honestly felt the engineers at the WMF were
grossly overpaid for what they delivered. But things have changed.
Communication about new projects has never been this good and there are a
lot of projects going around from both wmf and wmde.

Yes, things are not 100% smooth and I believe no one agrees with all the
different changes happening, but we have to keep a cool head and understand
that we (community+staff) need to work at planet scale and try to keep a
balance between all users, regardless of background or age.

Christophe has nailed the high-level problems IMHO, and to answer your
question, we are all responsible for the strategy. I'm not just throwing
buzz words around: each community knows its readership and has at least
some intuition into what makes it happy. These intuitions can be refined by
asking for support from the research team and then trying to implement them
or bring them to the WMF roadmap. This last step is the most difficult, but things
like the community wishlist or project grants are some of the tools at our
disposal to make our wishes happen. Of course, engaging and/or challenging
the WMF also might work. I would personally like to see an office hour with
the C-level person in charge of product (I don't even know who that is
anymore) where such high-level issues could be brought to the table.

Specifically, for the looks issue, not all the wikis need to look the same.
Some stuck with Monobook for a number of years, others are using the
Timeless skin. If the current team is not implementing your wishes, why not
look for other ways to improve at least your home wiki?

Have a great weekend,
Strainu

Pe sâmbătă, 16 octombrie 2021, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a scris:
> Thanks Christophe,
> And whose responsibility is to answer to "And if you read the whole
thread it is not really about money but more about product
vision/strategy/roadmap :)"? Who should have this strategy, vision and
roadmap?
>
> That's the x in this equation.
> Galder
> 
> From: Christophe Henner 
> Sent: Saturday, October 16, 2021 9:33 AM
> To: Wikimedia Mailing List 
> Subject: [Wikimedia-l] Re: 100$ million dollars and still obsolete
>
> Hi,
> I will the whole first part of the discussion :)
> As for the product discussion. We should very mindful of what we consider
our ProductS.
> We tend to talk a lot about the wikis. They are products that can be
improved, and have been and still should evolve yes. And I agree it would
be great if they improved more, be updated for both readers and editors.
But the context, with so many communities to satisfy makes it very hard.
> Be damned if you do, be damned if you don't sort of things.
> But, they are not obsolete.
> What however is, to me, obsolete is our shared very occidental web vision
of our products.
> What can makes us obsolete, is our inability to adapt our products or
create new products adapted to new mean of content consumption.
> From a content consumption perspective, video and audio have a lot of
tractions.
> Short and fast burst of information is taking more and more place on how
we consume content.
> The disintermediation of content is more than here and even if we have
Wikidata, we are not, yet!, exploiting it's full potential to spread
content.
> VR and AR are 5 to 10 years away as mass market products. But it will
requires years to do something good for us around it.
> Yes editing can be improved, but to me it is not where we will see
obsolescence first. Content consumption is clearly to me the topic.
> I know it can be easy to say "hey look at simultaneous editing on gdoc or
365". Yes that's a nice thing, but would it be a game changer for us? But
having all around the world PoP to decrease loading time also is a great
product improvement. Etc.
> All that to say, yes there is a lot of work from a product perspective,
but it can be easy to have our own biases give us a twisted view of what
needs to be improved.
> And if you read the whole thread it is not really about money but more
about product vision/strategy/roadmap :)
> Which we might be missing or isn't known enough.
> Le sam. 16 oct. 2021 à 8:41 AM, Galder Gonzalez Larrañaga <
galder...@hotmail.com> a écrit :
>
> True Samuel. We can actually edit [Wikipedia] from our mobile phones. We
can't use the visual editor. I tried to say it later with the sentence
"Desktop computers are disappearing. We still can't edit in a good way with
our mobile phones." but it's true the first time I mentionen this it was
not factual.
>
> About the other projects, it doesn't matter

[Wikimedia-l] Re: What happened to the cc-by-sa-4.0 initiative?

2021-09-29 Thread Strainu
Hi Isaac,

See https://meta.m.wikimedia.org/wiki/Terms_of_use/Creative_Commons_4.0

Strainu

Pe joi, 30 septembrie 2021, Isaac Olatunde  a
scris:
> Hi Strainu,
> I can't find the previous discussions.Could you please provide a link to
the public consultations (or proposal) you mentioned to allow people on
this list have a clear understanding of what was discussed?
> Best regards
> Isaac
> On Wed, 29 Sep 2021, 23:22 Strainu,  wrote:
>>
>> Hi,
>>
>> A few years ago there was a public consultation on moving the Wikimedia
license to cc-by-sa-4.0 instead of 3.0. Now, obviously that never happened
but I couldn't find the decision documented anywhere. Why was the proposal
scrapped? Are there any plans to revisit this?
>>
>> Steainu ___
>> Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines
at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
https://meta.wikimedia.org/wiki/Wikimedia-l
>> Public archives at
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/TF6DDMFO23Y3I4LJFITYYMOPHIIMWJAG/
>> To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/M4RCULYAUC5B57AA767QBINUEXD7ZWS2/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Wikimedia-l] What happened to the cc-by-sa-4.0 initiative?

2021-09-29 Thread Strainu
Hi,

A few years ago there was a public consultation on moving the Wikimedia
license to cc-by-sa-4.0 instead of 3.0. Now, obviously that never happened
but I couldn't find the decision documented anywhere. Why was the proposal
scrapped? Are there any plans to revisit this?

Strainu
___
Wikimedia-l mailing list -- wikimedia-l@lists.wikimedia.org, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
Public archives at 
https://lists.wikimedia.org/hyperkitty/list/wikimedia-l@lists.wikimedia.org/message/TF6DDMFO23Y3I4LJFITYYMOPHIIMWJAG/
To unsubscribe send an email to wikimedia-l-le...@lists.wikimedia.org

[Talk-ro] Address errors in MapRoulette

2021-09-25 Thread Strainu
Hi,

If there are any MapRoulette users among you, I have uploaded over
5000 errors (most of them are actually missing data) in the addresses
in Romania. I am going to upload another (probably larger) batch to
the same challenge.

You can see the tasks at https://maproulette.org/browse/challenges/23100

Strainu

___
Talk-ro mailing list
Talk-ro@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk-ro


[pywikibot] Bots and tools need to upgrade to Pywikibot 6.6.1

2021-09-23 Thread Strainu
Is this important enough to backport to older versions maybe?

I also don't upgrade often and it does take a while to fix automatic bots,
but I'm not a fan of carrying legacy code (especially for something with
potential security implications) for decades. 7 years sounds enough :)

Strainu
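
For operators who rarely touch their bots, a small start-up guard can at least
make the required upgrade explicit instead of failing later in an obscure way.
A minimal sketch (my addition; it assumes Pywikibot was installed with pip so
the package metadata is available, and that the version string is a plain
X.Y.Z release):

from importlib.metadata import version  # Python 3.8+

MINIMUM = (6, 6, 1)

# Assumption: "pywikibot" is installed as a package, not run from a bare checkout.
installed = tuple(int(part) for part in version('pywikibot').split('.')[:3])
if installed < MINIMUM:
    raise SystemExit(
        f"Pywikibot {'.'.join(map(str, installed))} is too old; "
        f"please upgrade to {'.'.join(map(str, MINIMUM))} or later."
    )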


Pe joi, 23 septembrie 2021, masti  a scris:
> one thing that falls to my mind:
> there are a lot of changes and breaking changes. But please keep in mind
tha ta lot of bots are run by people that do it in their free time. They
are not focused on doing that as their primary job. Please try to keep
pywikibot updates in a way that allows for them to keep up.
>
> I have bots that run on very outdated code and they do run fine. That
means that forcing updates is not necessary.
>
> masti
>
> On 9/22/21 8:53 PM, Kunal Mehta wrote:
>>
>> Hi everyone,
>>
>> Bots and tools using Pywikibot must upgrade to version 6.6.1[1]
otherwise they will break when deprecated API parameters are removed[2]. If
you have any questions or need help in upgrading, please reach out using
one of the Pywikibot communication channels[3].
>>
>> [1] https://doc.wikimedia.org/pywikibot/stable/changelog.html
>> [2]
https://www.mediawiki.org/wiki/MediaWiki_1.37/Deprecation_of_legacy_API_token_parameters
>> [3]
https://www.mediawiki.org/wiki/Special:MyLanguage/Manual:Pywikibot/Communication
>>
>> Thanks,
>> -- Legoktm
>> ___
>> pywikibot mailing list -- pywikibot@lists.wikimedia.org
>> To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org
>
> ___
> pywikibot mailing list -- pywikibot@lists.wikimedia.org
> To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org
>
___
pywikibot mailing list -- pywikibot@lists.wikimedia.org
To unsubscribe send an email to pywikibot-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T279069: Confusing Lua error "Too many Wikidata entities accessed"

2021-04-01 Thread Strainu
Strainu updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T279069

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, Strainu, Invadibot, maantietaja, Akuckartz, Nandana, lucamauri, 
Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, 
Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] T279069: Confusing Lua error "Too many Wikidata entities accessed"

2021-04-01 Thread Strainu
Strainu created this task.
Strainu added a project: MediaWiki-extensions-WikibaseClient.
Restricted Application added a subscriber: Aklapper.
Restricted Application added a project: Wikidata.

TASK DESCRIPTION
  This page 
<https://ro.wikipedia.org/w/index.php?title=Wolffs_Revier=10492681> 
throws a "Too many Wikidata entities accessed" error because the P161 
(cast member) property has hundreds of values and the Infobox tries to access 
them all. While the error is correct, when checking the parser data in 
preview, I see "Number of Wikibase entities loaded: 9/400", which is 
confusing.
  
  I believe there are 2 improvements than can be made:
  
  1. in the error message, explicitly mention the limit and the current value
  2. correctly report the number in the parser performance data
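
  One way to double-check the numbers reported here (a sketch I am adding, not 
part of the original task) is to pull the parser limit report over the API and 
list the raw entries, without guessing the exact key of the Wikibase counter:

# Requires the third-party "requests" package.
import requests

API = 'https://ro.wikipedia.org/w/api.php'
params = {
    'action': 'parse',
    'page': 'Wolffs Revier',      # the page from the report above
    'prop': 'limitreportdata',    # raw parser limit report entries
    'format': 'json',
}
data = requests.get(API, params=params, timeout=30).json()
for entry in data['parse']['limitreportdata']:
    # The Wikibase entity counter should appear among these entries.
    print(entry)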

TASK DETAIL
  https://phabricator.wikimedia.org/T279069

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, Strainu, Invadibot, maantietaja, Akuckartz, Nandana, lucamauri, 
Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, 
Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


Re: [Wikitech-l] Maps Modernization plan - FYI

2021-03-22 Thread Strainu
Hi Erica,

Thanks for the announcement, I'm glad to see some love given to Maps. Could
you explain how this initiative interacts with the planned improvements
from WMDE [3]?

Thank you,
   Strainu

[3] https://meta.wikimedia.org/wiki/WMDE_Technical_Wishes/Geoinformation


În lun., 22 mar. 2021 la 19:11, Erica Litrenta  a
scris:

> Greetings,
>
> This is a follow up from our last email some months ago (and a crosspost).
> You may already have seen today's announcement from Legal about the
> upcoming changes to the Maps Terms of Use. Here is an extra heads-up that 
> Wikimedia
> Maps are transitioning towards a more modern architecture. The first phase
> of this transition will be replacing Tilerator [0] with Tegola [1] as our
> vector tile server. This is a change in the Maps infrastructure, so there
> should be little to no impact to the end users’ experience.
>
> It is important that we are able to provide software that is sustainable
> to support, before we can guarantee a reliable user experience. Wikimedia
> Maps aim to provide Wikimedia users a consistent experience contributing to
> and learning about geoinformation. To achieve this goal, we will empower
> those engineers maintaining the Wikimedia Maps infrastructure to do so with
> ease and low effort.
>
> If you want to learn more, please head to mediawiki.org [2], where you
> will also find a Questions & Answers section.
>
> Thanks, and take care,
>
> Erica Litrenta (on behalf of the Product Infrastructure team)
>
> [0] https://wikitech.wikimedia.org/wiki/Maps/Tilerator
>
> [1] https://tegola.io/
> [2] https://www.mediawiki.org/wiki/Wikimedia_Maps/2021_modernization_plan
>
> --
>
> --
>
>
> Erica Litrenta (she/her)
>
> Manager, Community Relations Specialists
>
> Wikimedia Foundation <https://meta.wikimedia.org/wiki/User:Elitre_(WMF)>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikimedia-l] Grants video tutorials no longer available

2021-03-11 Thread Strainu
Hey folks,

I'm not sure where I should direct my question, but the video grant
tutorials linked from
https://meta.wikimedia.org/wiki/Grants:Project/Tutorial seem to have been
offline since at least early 2020.
Is there any way to retrieve them?


Strainu

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Talk-ro] Road network improvements in Romania

2021-02-27 Thread Strainu
Hi Murad,

In addition to the suggestions you already received, I would like to add the 
request to make sure your team is properly trained in the general map editing 
rules. 

Last year we had a list of edits marked with the hashtag #bolt that were going 
against the "no duplication" rule (ex: 
https://www.openstreetmap.org/changeset/79368670). We would very much 
appreciate it if we did not need to keep making corrections to your edits. 
However, we are aware that mistakes happen and we hope that your team will be 
responsive to requests for corrections we send to Bolt 001 (per the wiki you 
provided)

Strainu

În 25 februarie 2021 13:31:53 EET, Murad Vardzelyan via Talk-ro 
 a scris:
>Hi there,
>
>I’m Murad from Bolt.
>
>We are going to start improving the Romanian road network. Project
>details
>can be found here:
>https://wiki.openstreetmap.org/wiki/Organised_Editing/Activites/Bolt.
>We
>would like to get familiar with local mapping guidelines if there are
>any.
>
>Please, also, let me know if you have preferred contact channels other
>than
>this talk-list.
>
>Waiting for your reply.
>
>Regards, Murad from Bolt Map Production

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.___
Talk-ro mailing list
Talk-ro@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk-ro


[Wikidata-bugs] [Maniphest] T274030: Mismatch between field value and value seen by code when adding images

2021-02-06 Thread Strainu
Strainu updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T274030

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, Strainu, Akuckartz, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Wikidata-bugs, aude, Lydia_Pintscher, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] T274030: Mismatch between field value and value seen by code when adding images

2021-02-06 Thread Strainu
Strainu created this task.
Strainu added projects: Wikidata, Wikibase.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  Try to add an image to a Wikidata item without a mouse:
  
  - go to "add a statement" and click it
  - in the property field select P18 (image)
  - in the filename field, add a commons file name with namespace ("File:Troita 
autentica.JPG")
  - without selecting any of the suggestions, press Enter
  
  At this point, 2 (expected) things happen:
  
  - the user gets an error message about the filename containing invalid 
characters
  - the namespace is removed automatically
  
  The behavior so far is acceptable, although ideally I would have expected to 
be able to save the value once the namespace has been removed. However, if I 
press Enter again, the "publish" link behaves as if it has been pressed again, 
but the error does not disappear (I suspect that another error is actually 
generated, but I'm not quite sure).
  
  What should happen:
  
  - preferred: the namespace is removed on paste, or as the first step when 
pressing Enter, and the value is saved on the first try
  - acceptable: the value is saved when pressing Enter the second time

TASK DETAIL
  https://phabricator.wikimedia.org/T274030

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Strainu
Cc: Aklapper, Strainu, Akuckartz, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Wikidata-bugs, aude, Lydia_Pintscher, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


Re: [Wikimedia-l] Thanks for all the fish! / Stepping down April 15

2021-02-05 Thread Strainu
Thank you for all your work Katherine! You will be missed.

Strainu

În joi, 4 feb. 2021 la 19:48, Katherine Maher  a
scris:

> Hi everyone,
>
> Earlier today, I announced to my colleagues at the Wikimedia Foundation my
> intention to step down as CEO later this spring. April 15th will be my last
> day, marking my seven-year anniversary with the Foundation and the
> movement. This was not an easy decision, but it is the right one. For now,
> I want to share with you why I’m moving on, and what comes next. I’ll save
> the customary email with deeper reflections, memories, and thanks for later
> this spring!
>
> In some ways, this was the easiest hard decision I’ve ever made. It’s
> never exactly a good time to step away -- transitions always have some
> rough edges -- but it’s always best to do so when the organization is
> strong, and before you’ve overstayed your welcome. The movement is in a
> good, strong place. Our communities are growing, our readership is too. Our
> 20th birthday, the launch of our Universal Code of Conduct, and the
> movement strategy recommendations are all milestone moments of solidity and
> strength. I have great hopes and confidence in the upcoming plans for
> strategy implementation, particularly the work on the movement charter and
> interim global council. We are healthy and thriving.
>
> While we will always have more work to do to become the Wikimedia that we
> want to be, our movement and our organization is in a phase of renewal and
> regeneration. We have deepened our practices of consultation,
> collaboration, and inclusion that will be the foundation of the next decade
> of our work. We have a deep and stable financial position that will help us
> grow and protect us from any storm, and the trust in our projects has never
> been higher. Our communities are poised to take on deeper responsibilities
> of governance, accountability, and leadership, populating a rich,
> representative, and leaderful movement for free knowledge.
>
> The Foundation is also strong, and filled with passionate, values-aligned
> leaders at every level of the organization, deeply committed to the work of
> our movement and mission. Although we don’t always all perfectly agree on
> absolutely everything, we are working more openly and cooperatively with
> our movement than ever before. Collaborative strategic planning,
> sustainable programs to support technical communities and tooling,
> co-development and consultation on transformative new experiences welcoming
> newcomers, cooperative partnerships on public health data, bibliographic
> data, and human rights data -- all of these are signals of much great work
> to come. Even difficult topics, such as brand and movement governance,
> continue to bring people together in nothing less than feisty commitment.
>
> Together, we have rich resources of brilliant people, deep passion, and
> compassion. We are making progress on some of our greatest challenges, from
> editor and readership growth, technical debt, representation and
> participation, safety and knowledge equity. I am proud of what we’ve done
> together and grateful for all the ways in which this movement has made my
> life immeasurably richer: friendships that will last a lifetime,
> intellectual curiosity and kinship, and so many memories of *so much
> dancing*, from Accra to Berlin to Chandigarh.
>
> As for me, I’m going to take a break, and a research fellowship, as a
> place to think about what’s next. It’s hard to think about your future when
> you’re fully in your present, and for the past seven years, I’ve been fully
> present for this movement. But as I look around, I see global challenges
> such as polarization, inequality, and climate change, as well as
> opportunities for generational renewal and optimism. As a Wikimedian, I
> lean toward optimism, and plan to apply myself in that direction!
>
> *What’s next*
>
>- We announced this planned transition publicly on our communications
>channels during a Foundation all-staff meeting today.
>- A Board Transition Committee composed of Dariusz Jemielniak, who is
>chair of HR Committee, Tanya Capuano, who is chair of the Audit Committee,
>Raju Narisetti, and María Sefidari as Board Chair, will launch the search
>for a new CEO. They’ll work closely with the executive Transition Team on
>organizational operations, and with the broader board on an open candidate
>call. The Board is working with the goal of onboarding a new CEO by Q2 of
>the 2021-2022 fiscal year.
>- We’ve been working on succession planning for the CEO role since
>2019 as a matter of best practice, and the organization is well-prepared
>for a thoughtful search for the next phase of our mission. The B

Re: [Wikitech-l] The future of UploadWizard

2021-02-04 Thread Strainu
În joi, 4 feb. 2021 la 21:31, Ostrzyciel Nożyczek
 a scris:
> The things that I have on mind are:
>
> Rework config handling to make it more consistent (now only campaign configs 
> are parsed, the main config is not) and robust (unit testing included!).
> Simplify the task of including more licenses in UW (message loading based on 
> config), add more built-in icons to make that even simpler for site admins.
> Change the tutorial from an image to wikitext, which should be much easier to 
> edit.
> Restructure documentation to be third-party centric, maybe make a brief 
> configuration guide (configuring UW now requires one to carefully study a 
> not-so-friendly PHP file).
> Add a few quick hacks to make the UI responsive, at least to some degree 
> (that is very much possible with just CSS). The solution can be polished 
> later.
> Remove Wikibase-related code and other Wikimedia-specific stuff that will 
> just make testing harder.
> Improve configurability for all fields in the wizard, ensure sensible default 
> settings.
> Add an option to use single-language fields. Multi-language fields are 
> unnecessary on most wikis.
> Look into how different stages of UW could be streamlined / improved to make 
> the upload process faster, especially on wikis requiring less detailed 
> information.
> Make all kinds of file description syntax configurable.
> (Maybe) prepare and package a few ready-to-use configuration sets, together 
> with the templates necessary to make it work. That would really simplify the 
> process of bringing UW to a wiki.

Just a quick note to say that out of the 11 items you list above, 8
would also improve the Wikimedia experience :)

Strainu

>
> ...and more! This may be a bit ambitious, but I think it's doable with just a 
> few people interested in the project and some spare time. I am certainly on 
> board. :P
>
>
> --
> Ostrzyciel (he/him)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The future of UploadWizard

2021-02-04 Thread Strainu
În joi, 4 feb. 2021 la 16:54, Bartosz Dziewoński  a scris:
>
> On 2021-02-03 23:33, Strainu wrote:
> > One thing that puzzles me in that ticket is this phrase from Mark
> > Traceur: "It might be better to look at something (slightly) more
> > modern, like the upload dialog in core". Does anyone know what that
> > dialog is? AFAIK the uploader in core (Special:Upload) hasn't changed in
> > decades, except maybe for the look of the buttons. Its usability is
> > rubbish compared to UW. Wikis used to (no, actually they still do)
> > customize it using the uselang param,which messes with the user's
> > settings. I can't really understand how that would be better...
>
> The upload dialog is this: https://www.mediawiki.org/wiki/Upload_dialog
>
> It's accessible from both the visual and wikitext editors (unless you
> disabled the toolbar), though their dialogs to insert image thumbnails.

Thanks to all who enlightened me. :)

We're basically talking cross-wiki uploader here in the Wikimedia
world (although I'm sure it can be used for other things). I agree
with Ostrzyciel's assessment that it lets anyone upload anything -
that's what prompted the request to disable cross-wiki uploads in the
first place. The UW, in collaboration with campaigns, remains the most
powerful web uploader the Wikimedia community currently has.

Strainu
>
> --
> Bartosz Dziewoński
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The future of UploadWizard

2021-02-03 Thread Strainu
As the deafening silence of this thread probably shows, a discussion is not
really possible. The WMF has had 0 interest in making uploads easier in the
last few years.

To be fair, faced with furious opposition from the Commons community for
even basic improvements such as allowing imports from other sites except
Flickr and requests to stop cross-wiki uploads, this decision does not seem
out of place.

As one of the few people that has enabled UW in another Wikimedia wiki, I
would like to encourage you to follow through on your plan to improve the wizard as
much as possible. Plans at the WMF change often and not necessarily for the
better. A responsive design would be awsome news for wikis that need to
guide their users through the mess that is freedom of panorama.

One thing that puzzles me in that ticket is this phrase from Mark Traceur: "It
might be better to look at something (slightly) more modern, like the
upload dialog in core". Does anyone know what that dialog is? AFAIK the
uploader in core (Special:Upload) hasn't changed in decades, except maybe
for the look of the buttons. Its usability is rubbish compared to UW. Wikis
used to (no, actually they still do) customize it using the uselang
param, which messes with the user's settings. I can't really understand how
that would be better...

Andrei

On Sunday, 31 January 2021, Ostrzyciel Nożyczek <ostrzycielnozyc...@gmail.com> wrote:

> Hi,
>
> I would like to uhhh... start the discussion? ask for opinions? about the
> future of UploadWizard.
>
> It is a rather special extension, that was from the start made mostly for
> Commons' very specific needs and getting it to work anywhere else presents
> some challenges (some of which I attempt to tackle here
> ). Interestingly, it still is
> used by many third-party wikis
>  and although some
> of them don't need its full set of capabilities related to describing
> licenses, authors and sources, there are wikis that do need that. The wiki
> I maintain, Nonsensopedia, has a Commons-like file description system based
> on Semantic MediaWiki (see example here
> ) and
> UploadWizard has been a *blessing* for us, greatly simplifying the task
> of file moderation.
>
> Opinion time: Wikis should be *encouraged* to properly describe the
> authorship of files that they use, to meet the licensing requirements. IMO
> Wikimedia Foundation as the maintainer of MediaWiki and a foundation
> dedicated to dissemination of free culture should provide a usable tool
> for properly describing free multimedia. UploadWizard could be just that.
>
> At the same time, the extension has been basically unmaintained
>  since the Multimedia
> team was dissolved and I've been rather surprised to discover that patches
> improving third-party support were met with uhm... very limited
> enthusiasm?  There are
> a few obvious features lacking like mobile support (seriously, try opening
> https://commons.wikimedia.org/wiki/Special:UploadWizard on a narrow
> screen device, it's been like this since.. always) and configurability (you
> have to jump through some serious hoops
>  to just add a
> license; customizing the tutorial is similarly hard).
>
> I've been thinking of what to do with the above and I really wouldn't want
> to embark on something that will be rendered redundant or obsolete in a
> year, so my question is: are there any plans for UploadWizard? What makes
> me suspect that things may change is primarily Structured Data on Wikimedia
> Commons, which in the future will maybe (?) supersede the description
> system around the {{Information}} template. Are there any rough roadmaps or
> outlines of anything resembling a plan for that? If Commons was to
> implement full, structured file descriptions in the upload tool, that code
> would be probably hardly usable outside Commons, given that Wikibase is not
> something easy to install or maintain, it is also awfully overkill for the
> vast majority of applications. In such a situation, would it make sense to
> consider completely separating the "Wikimedia Commons Shiny Upload Tool"
> from a more general extension that would be usable for third
> parties, stripped of any Commons-specific code? A lot of things could be
> much simplified if the extension was to target just the needs of third
> parties and not Commons.
>
> I ask about this because I really don't see any sort of interest of the
> extension's *de facto* owner (and that is WMF) in developing it, there
> are also no public plans for it, as far as I know. Yes, I can make a fork
> anytime, but first I'd prefer to know if I'm not missing something. Well,
> actually, I already did make a fork of UW
> 

[Wikimedia-l] Wikimedians of Romania and Moldova report for 2020

2021-01-07 Thread Strainu
Hi folks,

WMROMD is happy to share its activity report with the community:
https://meta.wikimedia.org/wiki/Wikimedians_of_Romania_and_Moldova_User_Group/2020

We're looking forward to your comments here or on the talk page.

Strainu

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikitech-l] The watchlist queue does not work recently

2020-12-29 Thread Strainu
On Tue, 29 Dec 2020 at 23:46, Andre Klapper wrote:
>
> On Tue, 2020-12-29 at 18:37 +, Tom Doles via Wikitech-l wrote:
> Yeah, I also had to deal with such unhelpful responses before.
>
> I'm sorry to hear that. The process of gathering sufficient info in
> tickets can unfortunately sometimes be confusing, surprising, or
> frustrating, given the many technical Wikimedia areas and complexity.
>
> > Some advice to avoid this:
>
> Customer service rule no. 1: listen to the customer. It's not helpful
> to argue and I'd assume that's not what Andre's employer expects from
> him.
>
>
> There might be a misunderstanding:

Not a misunderstanding, more like a difference in the chosen meaning
of the term. I think what Tom was suggesting is that the bugwrangler,
just like anyone doing any commercial activity in this world, has
customers to whom they provide a service - in this case, the services
described at [[:mw:Bugwrangler]] [0], and specifically the following
line: "Work with members of the community who report bugs to clarify
any ambiguity in the bug descriptions and get all the information
required to reproduce the bugs". For the purposes of this thread, your
customers are the members of the community who take time out of their
day to report bugs.

The "rules" described in Tom's email are good practices that can be
encountered, under different forms, in many companies' core values.
[1]  While WMF does not put them this high, I would not dismiss them
as "not my job". My suggestion would be to ask for more constructive
feedback instead:
* Why is the "How to report a bug" page not helpful?
* Where do you need more info?
* How can the bugwrangler help more while keeping in mind he needs to
scale his methods to hundreds or thousands of bug reporters every
month?

[0] As a side note, I personally find that page to be comprehensive and
to provide an appropriate level of detail.
[1] https://builtin.com/company-culture/company-core-values-examples

>
> Phabricator is an issue tracker where people interact in their many
> different roles (readers, editors, developers, managers, translators,
> document writers, designers, etc etc etc). Anyone can report Wikimedia
> related technical issues there, by following
> https://www.mediawiki.org/wiki/How_to_report_a_bug
>
> If you are looking for a 'customer service' support venue, then you may
> want to check https://www.mediawiki.org/wiki/Project:Support_desk for
> MediaWiki, or https://meta.wikimedia.org/wiki/Tech for tech issues on
> Wikimedia wikis, or contact the OTRS mail queues.
>
>
> Regarding [part of] my work, https://www.mediawiki.org/wiki/Bugwrangler
> tries to outline some duties. (As I was explicitly mentioned.)
>
> Hope that helps a bit. :)

Same here :)

Strainu

>
> Cheers,
> andre
>
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The watchlist queue does not work recently

2020-12-29 Thread Strainu
On Tuesday, 29 December 2020, Andre Klapper wrote:

> On Tue, 2020-12-29 at 20:14 +0200, יגאל חיטרון wrote:
> > Sure. As you said, this very link requires full reproduction steps.
>
> No. It says "Full details of the issue, giving as much detail as possible."


That's just... Confusing. I can totally understand why someone would feel
discouraged from logging an issue. How about "Full details of the issue,
giving all the information you currently have. If that is insufficient you
will be asked for additional information along with guidance on how to
obtain it."?

An example of what "minimized steps" means might also be a good idea.

Strainu


> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What is JSON (in JavaScript code)?

2020-10-30 Thread Strainu
Thanks Dan and Roy - apparently our local copy of Twinkle was overriding
JSON for some reason. Fixed it.
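
For anyone hitting the same JQMIGRATE warning, the mechanical part of the
change is a one-line swap (a rough sketch, assuming the value being parsed
is a JSON string; the variable names are made up):

  // deprecated since jQuery 3.0, flagged by jQuery Migrate
  var data = $.parseJSON( responseText );

  // built into every modern browser; throws a SyntaxError on invalid input
  var data = JSON.parse( responseText );

The catch on ro.wp was only that our local script had shadowed the global
JSON object, so the second form failed with "JSON is not defined" until
that override was removed.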

Strainu

On Fri, 30 Oct 2020 at 17:20, Roy Smith wrote:
>
> JSON is JavaScript Object Notation.  It's a way of encoding structured data 
> as text strings which originated (as the name implies) in JavaScript, but is 
> now widely used as a data exchange format, with support in nearly every 
> programming language.  https://www.w3schools.com/js/js_json.asp
>
> But, in the context you're using it, it's the library of JSON parsing and 
> encoding functions built into the JavaScript implementation of most browsers. 
>  https://www.w3schools.com/Js/js_json_parse.asp
>
> If you've opened your browser's console, you should be able to type JSON at 
> it and get back something like:
>
> JSON
> JSON {Symbol(Symbol.toStringTag): "JSON", parse: ƒ, stringify: ƒ}
>   parse: ƒ parse()
>   stringify: ƒ stringify()
>   Symbol(Symbol.toStringTag): "JSON"
>   __proto__: Object
>
>
> If you get something like "JSON is not defined", you're probably running an 
> ancient browser.
>
>
>
>
> On Oct 30, 2020, at 11:05 AM, Strainu  wrote:
>
> Hi,
>
> I'm looking at solving the following console warning on ro.wp:
> "JQMIGRATE: jQuery.parseJSON is deprecated; use JSON.parse" which
> appears due to outdated Twinkle code. Just making the replacement does
> not work, since JSON is not defined. As a matter of fact, I cannot
> find it anywhere else in the code loading on a normal Romanian
> Wikipedia page.
>
> Alas, the generic name of that object makes searching on mw.org or
> Google rather useless. I can see some similar changes in Phabricator,
> but they seem to work.
>
> So, what is JSON and how can I use it in my code?
>
> Thanks,
>   Strainu
>
> P.S. Please don't suggest updating Twinkle...
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] What is JSON (in JavaScript code)?

2020-10-30 Thread Strainu
Hi,

I'm looking at solving the following console warning on ro.wp:
"JQMIGRATE: jQuery.parseJSON is deprecated; use JSON.parse" which
appears due to outdated Twinkle code. Just making the replacement does
not work, since JSON is not defined. As a matter of fact, I cannot
find it anywhere else in the code loading on a normal Romanian
Wikipedia page.

Alas, the generic name of that object makes searching on mw.org or
Google rather useless. I can see some similar changes in Phabricator,
but they seem to work.

So, what is JSON and how can I use it in my code?

Thanks,
   Strainu

P.S. Please don't suggest updating Twinkle...

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Talk-ro] Invalid references in relations

2020-09-09 Thread Strainu
Hi,

There are still some inconsistencies in the data, but at first glance that
way id looks correct. How does your code identify an invalid reference?
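
For comparison, here is roughly the check I would expect, as a minimal
sketch in JavaScript/Node.js (untested against your export; the file name
is a placeholder). Keep in mind that a bounding-box extract can
legitimately contain relations whose member ways lie outside the box, so a
missing id is not necessarily an error in the database:

  const fs = require('fs');

  // Placeholder path - point this at the osm.xml export you generated.
  const xml = fs.readFileSync('export.osm.xml', 'utf8');

  // Collect the ids of all ways actually present in the extract.
  const wayIds = new Set();
  for (const m of xml.matchAll(/<way id="(\d+)"/g)) {
    wayIds.add(m[1]);
  }

  // Flag every relation member of type "way" whose ref is not in the extract.
  for (const rel of xml.matchAll(/<relation id="(\d+)"[\s\S]*?<\/relation>/g)) {
    for (const member of rel[0].matchAll(/<member type="way" ref="(\d+)"/g)) {
      if (!wayIds.has(member[1])) {
        console.log('Invalid Way id: ' + member[1] + ' in relation: ' + rel[1]);
      }
    }
  }

If your 1613 log entries come from a check like this, they could well be
members that simply fall outside the bounding box rather than broken data.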

Strainu

On Wed, 9 Sep 2020 at 21:48, Gyula Szabo wrote:

> Hi,
> I started a private project with the OSM data, and while reading the data
> I noticed some invalid references in the generated export (osm.xml).
>
> Is this normal in exports, or in the database?
>
> The export:
> N46.0E23.0 - N47.0E24.0
> 
> The data included in this document is from www.openstreetmap.org.
> The data is made available under ODbL.
> 
>
> An example of an invalid reference (there are 1613 such log entries):
> Invalid Way id: 760818722 in relation: 7958145
>
> If it helps, I can send the Java code or the complete log.
> Thanks,
> Gyula
>
> ___
> Talk-ro mailing list
> Talk-ro@openstreetmap.org
> https://lists.openstreetmap.org/listinfo/talk-ro
>
___
Talk-ro mailing list
Talk-ro@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk-ro


Re: [Wikitech-l] [Wikimedia-l] Wikimedia Chat

2020-08-30 Thread Strainu
On Sun, 30 Aug 2020 at 03:00, Amir Sarabadani wrote:
>
> Hello,
> Due to the current situation, there are more and more collaborations
> happening online instead. and now you can see Wikimedia-related discussion
> groups in Slack, Discord, Telegram, Facebook, and many more. Besides being
> scattered and inaccessible to people who don't have accounts in those
> platforms (for privacy reasons for example), these platforms use
> proprietary and closed-source software, are outside Wikimedia
> infrastructure and some harvest our personal data for profit.

Hey Amir,

Please take this email as positive feedback, even if it might not
sound like it :)

As much as I value software freedom and my personal data, I've learned
over the years that the best conversations happen where people
converge naturally, not where one wants them to be. What you describe
below is an awesome list of features ... that already exist elsewhere.
I could give you an equally long list of things that are missing, but
individually, none of them matter. What matters is which platform most
people choose, based on which of the features are important for them.
And that platform might be different for different projects.

What that means is that Wikimedia Chat will be just another name in
that long list of apps that people choose to use or not. It's fine if
you want to maintain it, it's great if it gains traction, but don't be
too upset if it ends up with the same usage as Wikimedia Spaces.

Strainu
>
> IRC on freenode is a good alternative but it lacks basic functionalities of
> a modern chat platform. So we created Wikimedia Chat, a mattermost instance
> in Wikimedia Cloud. Compared to IRC, you have:
> * Ability to scrollback and read messages when you were offline
> * Push notification and email notification
> * You don't need to get a cloak to hide your IP from others
> * Proper support for sharing media
> * Two factor authentication
> * A proper mobile app support
> * Ability to add custom emojis (yes, it's extremely important)
> * Profile pictures
> * Ability to ping everyone with @here
> * much much more.
>
> You can use Wikimedia Chat by going to https://chat.wmcloud.org, anyone can
> make an account. This is part of Wikimedia Social suite [1], the other
> similar project is "Wikimedia Meet". [2]
>
> Some notes:
> * This is done in my volunteer capacity and has been maintained by a group
> of volunteers. If you're willing to join the team (either technical or
> enforcing CoC, kicking out spammers, other daily work), drop me a message.
> * Privacy policy of Wikimedia Cloud applies: https://w.wiki/aQW
> * As a result, all messages older than 90 days get automatically deleted.
> * As a Wikimedia Cloud project, all of discussions, private and public are
> covered by Code of conduct in technical spaces:  https://w.wiki/AK$
>
> Hope that would be useful for you, if you encounter any technical issues,
> file a bug in the phabricator.
>
> [1] https://meta.wikimedia.org/wiki/Wikimedia_Social_Suite
> [2] https://meta.wikimedia.org/wiki/Wikimedia_Meet
>
> Best
> --
> Amir (he/him)
> ___
> Wikimedia-l mailing list, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> New messages to: wikimedi...@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikimedia-l] Wikimedia Chat

2020-08-30 Thread Strainu
On Sun, 30 Aug 2020 at 03:00, Amir Sarabadani wrote:
>
> Hello,
> Due to the current situation, there are more and more collaborations
> happening online instead. and now you can see Wikimedia-related discussion
> groups in Slack, Discord, Telegram, Facebook, and many more. Besides being
> scattered and inaccessible to people who don't have accounts in those
> platforms (for privacy reasons for example), these platforms use
> proprietary and closed-source software, are outside Wikimedia
> infrastructure and some harvest our personal data for profit.

Hey Amir,

Please take this email as positive feedback, even if it might not
sound like it :)

As much as I value software freedom and my personal data, I've learned
over the years that the best conversations happen where people
converge naturally, not where one wants them to be. What you describe
below is an awesome list of features ... that already exist elsewhere.
I could give you an equally long list of things that are missing, but
individually, none of them matter. What matters is which platform most
people choose, based on which of the features are important for them.
And that platform might be different for different projects.

What that means is that Wikimedia Chat will be just another name in
that long list of apps that people choose to use or not. It's fine if
you want to maintain it, it's great if it gains traction, but don't be
too upset if it ends up with the same usage as Wikimedia Spaces.

Strainu
>
> IRC on freenode is a good alternative but it lacks basic functionalities of
> a modern chat platform. So we created Wikimedia Chat, a mattermost instance
> in Wikimedia Cloud. Compared to IRC, you have:
> * Ability to scrollback and read messages when you were offline
> * Push notification and email notification
> * You don't need to get a cloak to hide your IP from others
> * Proper support for sharing media
> * Two factor authentication
> * A proper mobile app support
> * Ability to add custom emojis (yes, it's extremely important)
> * Profile pictures
> * Ability to ping everyone with @here
> * much much more.
>
> You can use Wikimedia Chat by going to https://chat.wmcloud.org, anyone can
> make an account. This is part of Wikimedia Social suite [1], the other
> similar project is "Wikimedia Meet". [2]
>
> Some notes:
> * This is done in my volunteer capacity and has been maintained by a group
> of volunteers. If you're willing to join the team (either technical or
> enforcing CoC, kicking out spammers, other daily work), drop me a message.
> * Privacy policy of Wikimedia Cloud applies: https://w.wiki/aQW
> * As a result, all messages older than 90 days get automatically deleted.
> * As a Wikimedia Cloud project, all of discussions, private and public are
> covered by Code of conduct in technical spaces:  https://w.wiki/AK$
>
> Hope that would be useful for you, if you encounter any technical issues,
> file a bug in the phabricator.
>
> [1] https://meta.wikimedia.org/wiki/Wikimedia_Social_Suite
> [2] https://meta.wikimedia.org/wiki/Wikimedia_Meet
>
> Best
> --
> Amir (he/him)
> ___
> Wikimedia-l mailing list, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> New messages to: Wikimedia-l@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] Institutional memory @ WMF

2020-08-29 Thread Strainu
A few responses in random order:

> > OK, but how is this done precisely? Are there written docs? Mentors?
> > Is cross-team help common? Or is this kept at the anecdotal level ("oh
> > yeah, you should also keep in mind..." )?
> >
>
> In my experience, all of the above

That doesn't sound so good. For me, it means 2 things:
1. There is no uniform approach to onboarding re community collaboration.
2. Some teams choose to keep it anecdotal


> Perhaps we shouldn't expect this of an organization not ultimately
> accountable to the editors?  No amount of onboarding can change the
> Foundation's corporate Bylaws or the fact that it owns the trademarks
> whose value is based on editor labor.  Perhaps if we had a membership
> organization instead, which would have to report to the editors and
> justify its progress on initiatives directly voted on by its members...

I'm afraid that changing the "ownership model" wouldn't help much.
It's highly unlikely that the WMF, regardless of whom it answers to,
will find enough employees with adequate experience and a willingness
to work for it within the community alone. That means it will still
need to address onboarding and, implicitly, the documentation task.

> Our movement is complex, and there are no amount of explanations that will
> portray its richness. I will be working to make sure that new hires at the
> Foundation know to ask the right questions at the right time and to the
> right people to minimize errors. Of course, I want to set realistic
> expectations, this will not happen in a day, nor will it happen in a year
> only. My goal is to start a process that will change and evolve with time,
> as does our movement.

Delphine, it's great to hear that someone with a lot of community
experience is taking on this task. Obviously mistakes will never go
away completely, but I'm looking forward to seeing the results of your
work. I just hope you have some measure of success in mind, it would
be a pity to evaluate the program based on wikimedia-l feedback. :)

>
> If any of you have any questions about how we are working on this, or want
> to contribute ideas, please talk to me offlist!

I think documenting the process should be part of the process :) That
way it can be replicated or adapted by other organizations with
similar growth pains.

Strainu

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wiki Loves Monuments] WLM Webinar in Romania

2020-08-29 Thread Strainu
Hi,

This week there was a WLM-oriented webinar organized by the National
Heritage Institute in Romania. Here is the recording for those
interested (in English):
https://www.youtube.com/watch?app=desktop=tDWB5hPy8po

Strainu on behalf of CEllen :)

___
Wiki Loves Monuments mailing list
WikiLovesMonuments@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikilovesmonuments
http://www.wikilovesmonuments.org


Re: [Wikimedia-l] Institutional memory @ WMF

2020-08-26 Thread Strainu
On Wed, 26 Aug 2020 at 13:07, Dan Garry (Deskana) wrote:
>
> On Tue, 25 Aug 2020 at 22:26, Strainu  wrote:
>
> > The pattern I'm seeing is: team gets a big project (in this case UCoC)
> > -> team hires -> newbie makes good faith edits that are known to cause
> > offense to some members of the community.
>
>
> This is basically always going to happen when new people are onboarded, or,
> indeed, as people make mistakes. By my observations, this happens a lot
> less nowadays than it used to. This is anecdotal on my part, but in the
> absence of any rigorous study of the frequency with which this occurs, this
> thread as a whole is anecdotal. That's not to say it's not valuable to
> discuss it, but it's worth bearing that in mind.

Thanks for the response Dan!

A rigorous study is IMHO impossible, since we lack a rigorous
definition of the boundary between the WMF and the community.
>
>
> > This pattern can be broken
> > only if the organization has a process to teach newcomers things that
> > seem obvious to old timers ("don't go over community decisions if you
> > can avoid it", "don't change content", "try to talk to people before
> > doing a major change", "not everyone speaks English", "affiliates are
> > not the community" etc.)
> >
> > My question is: does the WMF has such a process?
> >
>
> When people are onboarded a lot of this is explained to them, and people
> are encouraged to reach out to those more experienced with the communities.
> That people get it wrong occasionally is expected.

OK, but how is this done precisely? Are there written docs? Mentors?
Is cross-team help common? Or is this kept at the anecdotal level ("oh
yeah, you should also keep in mind..." )?

Strainu

>
> Dan
> ___
> Wikimedia-l mailing list, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> New messages to: Wikimedia-l@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] Institutional memory @ WMF

2020-08-25 Thread Strainu
On Wed, 26 Aug 2020 at 00:03, Amir Sarabadani wrote:
>
> Hey,
> Can you elaborate on what happened? If it's public, of course. It's hard to
> understand the problem without proper context.

The edits are public, but I don't really want to be specific, as that
would likely derail the discussion.

The pattern I'm seeing is: team gets a big project (in this case UCoC)
-> team hires -> newbie makes good faith edits that are known to cause
offense to some members of the community. This pattern can be broken
only if the organization has a process to teach newcomers things that
seem obvious to old timers ("don't go over community decisions if you
can avoid it", "don't change content", "try to talk to people before
doing a major change", "not everyone speaks English", "affiliates are
not the community" etc.)

My question is: does the WMF has such a process?

>
> Is it https://phabricator.wikimedia.org/T261133 ?
>
> On Tue, Aug 25, 2020 at 10:52 PM Strainu  wrote:
>
> > Hi,
> >
> > It seems the WMF is going through another crisis of institutional
> > memory, with the T team taking center stage. It's not really
> > important what they did wrong, it's minor compared with other faux-pas
> > they did in the past.
> >
> > I was wondering though if the organization as a whole has learned
> > anything from major crises in the past and if there is a formal way of
> > passing information to newcomers, such as when and how to contact
> > communities, what's the difference between a wiki, a community and an
> > affiliate etc.?
> >
> > Strainu
> >
> > ___
> > Wikimedia-l mailing list, guidelines at:
> > https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
> > https://meta.wikimedia.org/wiki/Wikimedia-l
> > New messages to: Wikimedia-l@lists.wikimedia.org
> > Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> > <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>
> >
>
>
> --
> Amir (he/him)
> ___
> Wikimedia-l mailing list, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> New messages to: Wikimedia-l@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


[Wikimedia-l] Institutional memory @ WMF

2020-08-25 Thread Strainu
Hi,

It seems the WMF is going through another crisis of institutional
memory, with the T team taking center stage. It's not really
important what they did wrong; it's minor compared with other faux pas
they have made in the past.

I was wondering though if the organization as a whole has learned
anything from major crises in the past and if there is a formal way of
passing information to newcomers, such as when and how to contact
communities, what's the difference between a wiki, a community and an
affiliate etc.?

Strainu

___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
<mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>


Re: [Wikimedia-l] Let's discuss first features of Desktop Improvements coming to Vector

2020-08-25 Thread Strainu
Hey all,

I've asked about some technicalities on the feedback page (and will continue
to do so as I test the new look more), but I was wondering about two
things that deserve to be discussed in a larger forum:
1. Why bundle the changes? The collapsible toolbar could be a useful
feature in itself, even for those who are firmly against the
narrowed-down content area. It could also be the basis for more radical
changes, such as a "no distraction mode".
2. Why change Vector rather than creating a new skin or starting from
a 3-column skin such as Timeless? I assume it has more to do with
community dynamics than technical reasons...

Thanks,
   Strainu

On Tue, 25 Aug 2020 at 20:06, effe iets anders wrote:
>
> I don't think the approach "we are going to see resistance anyway, so let's
> make it a bigger change" has proven to be terribly helpful in the past year
> or so.
>
> These layout changes are hard for sure, but there are definitely ways to
> bring people on board. The thing is, not every exciting change is
> necessarily going to help everyone to the same extent, and it's hard to
> convince a really diverse community. But there are a few tricks that we
> should definitely keep using, that are nothing new to the developing
> community: don't surprise (iterate and be public), try it out in a willing
> community (check) and try to remain backward compatible (how long have we
> supported the monobook skin now?).
>
> I actually feel that a constant change is more helpful, because it gives
> less of a 'now we have to fight to keep our ways' - it allows people to see
> that they will like some changes, and dislike some others, but on a
> balance, it'll improve for everyone. It's probably more time consuming
> because it requires more consultation too, but I think it's worth it.
>
> Lodewijk
>
> On Mon, Aug 24, 2020 at 5:00 AM Galder Gonzalez Larrañaga <
> galder...@hotmail.com> wrote:
>
> > Indeed! The FINAL stage of the changes is deeply conservative and not a
> > change at all. It's a small facelift, but not a real change. We are now 10
> > years old, and with the new changes we will be 8 years old in a year,
> > instead of being 11 years old.
> > 
> > From: Olga Vasileva 
> > Sent: Monday, August 24, 2020 1:53 PM
> > To: Galder Gonzalez Larrañaga 
> > Cc: Wikimedia Mailing List 
> > Subject: Re: [Wikimedia-l] Let's discuss first features of Desktop
> > Improvements coming to Vector
> >
> > Hi Vira, Ala'a, and Galder,
> >
> > Thanks for your feedback - we’re really glad you’re enjoying the changes
> > we’ve made so far.  I wanted to point out that this is not all! The
> > deployed changes are a part of a larger series of improvements that we will
> > be rolling out progressively over the next 1+ years. To see a list of the
> > other features we are planning on working on, please check out our project
> > page[1]. In addition, we believe that even after the project is complete,
> > there will still be work to do. We’d like to view this project as a new
> > baseline on which we can build new functionality that can improve both
> > reading and editing in the future.
> >
> > Thanks again!
> >
> > - Olga
> >
> > [1]
> > https://www.mediawiki.org/wiki/Reading/Web/Desktop_Improvements/Features
> >
> > On Sun, Aug 23, 2020 at 8:06 PM Galder Gonzalez Larrañaga <
> > galder...@hotmail.com> wrote:
> > Thanks for bringing this topic!
> > At euwiki it has been some weeks we have experienced the new vector style,
> > and it has some great things: you can be sure about how width images will
> > take for any reader, you can create better galleries or even decide where
> > to insert an image to avoid sandwiching.
> >
> > BUT...
> >
> > I think that the changes (even when finishing) will be too short on what
> > we need (a real face change!) but it will annoy in the same amount to those
> > who don't want any change at all. So, we are losing an opportunity to go on
> > with big changes.
> >
> > Best
> >
> > Galder
> > 
> > From: Wikimedia-l <wikimedia-l-boun...@lists.wikimedia.org> on behalf of
> > Ala'a Najjar <ala201...@hotmail.com>
> > Sent: Saturday, August 22, 2020 10:06 PM
> > To: Wikimedia Mailing List <wikimedia-l@lists.wikimedia.org>
> > Cc: ovasil...@wikimedia.org
> > Subject: Re: [Wikimedia-l] Let's dis

Re: [pywikibot] Using listpages with redirects

2020-08-19 Thread Strainu
'#' can be special ('#' starts a comment in the shell and in Python verbose
regexes), so maybe the pattern gets post-processed into becoming a comment
somewhere along the way? Try escaping it.
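
Something like this (untested) should tell you whether escaping is enough:

pwb.py listpages -start:UK -grep:'\#REDIRECT' -format:"{page.title}" -get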

Strainu

On Tuesday, 18 August 2020, John Bray wrote:

> I'd like to get all the redirect pages out of a wiki, but
>
> pwb.py listpages -start:UK -grep:'#REDIRECT' -format:"{page.title}" -get
>
> produces nothing, but
>
> pwb.py listpages -start:UK -grep:'United Kingdom' -format:"{page.title}"
> -get
>
> produces, as expected
>
> UK
> #REDIRECT [[United Kingdom]]
>
> pwb.py listpages -start:UK -grep:'#' -format:"{page.title}" -get
>
> picks up pages with a # in them, but not any of the #REDIRECTs
>
> why isn't -grep just parsing the page fully?
>
> John
>
> ___
> pywikibot mailing list
> pywikibot@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/pywikibot
>
___
pywikibot mailing list
pywikibot@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/pywikibot


[Wikitech-l] Change of translation for "attribution" in CC licenses

2020-07-20 Thread Strainu
Hi folks,

Sorry for cross-posting, not sure which list is the best venue for my problem.

I have an issue with the translation of the "Share-Alike" part of
"Creative Commons Attribution-Share-Alike". For reasons (explained in
[1]) which are not relevant to Wikimedia, the CC-sanctioned Romanian
translation has changed from "distribuire" to "partajare" in the
translation for version 4.0 *only*.

This becomes a problem for multilingual wikis (mw, m, c), which use
meta-templates and MediaWiki messages to translate the {{cc-by-sa-*}}
templates. What would be the easiest way to solve the problem without
affecting other languages?

Thanks,
   Strainu


[1] (in Romanian)
https://www.cyberculture.ro/2020/07/20/licente-creative-commons-versiunea-4-romana/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
