I filed https://phabricator.wikimedia.org/T131930 to solve this.
Nemo
___
Labs-l mailing list
Labs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/labs-l
So the user_properties replica will no longer contain anything that
isn't already public on the wikis?
Nemo
Ah, I forgot to ask: was /data/scratch/tmp deleted as part of this
spring cleaning? Some of the files it contained were more actively used
than others; I could have made a selective purge if requested. But then,
the directory name and permissions were what they were for a reason, so
nothing
Thanks for the notice; I've deleted a few (mostly) unused instances in
the pagemigration project.
I don't see what I can do on dumps-stats, which seems to be the biggest
offender. It did indeed have some big files a few weeks/months ago, but
they were deleted a while ago. If a reboot is
FYI https://phabricator.wikimedia.org/T124169
Nemo
See https://phabricator.wikimedia.org/T85984
The user_daily_contribs table (and associated API) is sometimes used for
* JavaScript (e.g. CentralNotice) targeting users based on activity in a
certain timeframe,
* simplification of SQL queries (e.g. [1]),
* other?
If you use this data/feature
Seemingly fixed in the meantime, by the way.
https://phabricator.wikimedia.org/T105585
Nemo
Thanks for the massive update job! One of my instances, ttmserver-salt01
(i-0773.eqiad.wmflabs), was reportedly in a reboot-pending state: I
rebooted it manually. If that's related to the security updates, perhaps
it's worth checking whether reboots actually completed in all instances.
Nemo
Great! I've been eager to see this for many months. :) It's particularly
nice that you managed to move instances without a reboot.
Nemo
As Nuria, Billinghurst and others said, the tools are expected to be
discoverable. It's easy enough not to throw away the baby with the
bathwater*.
* Dynamic pages generally have some URL parameters, usually indicated by
?. In the general robots.txt, disallow Googlebot and friends** to crawl
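A minimal sketch of such a rule follows; note that the `*?` wildcard is an extension honored by Googlebot and most major crawlers, not part of the original robots.txt standard, and the file shown here is only an illustration of the idea, not the actual Tool Labs configuration.

```shell
# Write a robots.txt that lets crawlers index static tool pages but
# keeps them off parameterized (dynamic) URLs; "/*?" matches any path
# containing a query string (a Googlebot wildcard extension).
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /*?
EOF
cat robots.txt
```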
Normal and expected, yes, but you *have* to check them
https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints.
Can we please stop implicitly telling users that it's fine to ignore ssh
fingerprints? Thanks.
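The mechanics of that check can be sketched with a throwaway key (no network needed for the demo); against a real host you would instead run something like `ssh-keyscan -t ed25519 <bastion-host> | ssh-keygen -l -f /dev/stdin` and compare the printed SHA256 fingerprint by eye with the list on the wikitech page:

```shell
# Generate a throwaway key purely to demonstrate fingerprint printing;
# the real check is done on the host key a server presents, compared
# against https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -q -t ed25519 -N '' -f /tmp/demo_key
ssh-keygen -l -f /tmp/demo_key.pub
```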
Nemo
client's instructions about replacing the server key
And check the fingerprints:
https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints
Nemo
So that's how many tasks are running at a given point in time, right?
Not how many tasks have run (to completion). The link was actually meant
to be http://tools.wmflabs.org/?status and shows a lot of httpd
instances which tend to accumulate (many are months old), the others are
about 2/3
Felipe Ortega, 14/02/2014 12:05:
Thanks a lot. I look forward to the confirmation and implementation of
this feature. If it's better to open a new issue on Bugzilla, or if any
other action is needed on my side (e.g. lending a hand with
reviewing/testing), just let me know.
You could help assess the
Felipe Ortega, 13/02/2014 14:57:
My question is: are there any reasons for redacting this (apparently
public) info? I can't figure out why this could be sensitive data.
It's not redacted, it simply never existed. There aren't even log
entries for old registrations; on some wiki(s) the field
Original message
Subject: [Analytics] Metadata on deleted pages on labs
Date: Mon, 10 Feb 2014 14:56:10 -0800
From: Dario Taraborelli
Reply-to: A mailing list for the Analytics Team at WMF and everybody
who has an interest in Wikipedia and analytics.
Quick heads
By the way, did anyone ever check the package requests in Toolserver's
JIRA to find out what packages were/are needed by the tools, or at least
compare the differences in available packages?
A raw list of packages on nightshade but not tools-login (right) and
vice versa (left):
$ comm -3
Uh, sorry, some wrong lines.
accountsservice
acct
acl
acpi
acpi-support-base
adminbot
alpine
alpine-doc
anthy-common
antlr
ant-optional
apache2
apache2.2-bin
apache2.2-common
apparmor
apport
apport-symptoms
aptdaemon
apt-file
apticron
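For reference, the full comparison was presumably produced with something like the following sketch; the file names and package lists are made up here, and in practice each list would come from its host, e.g. via `dpkg-query -W -f '${Package}\n'`:

```shell
# Made-up sample lists standing in for the per-host package dumps.
printf 'apache2\nphp5\nvim\n'   | sort > tools-login-packages.txt
printf 'apache2\npython\nvim\n' | sort > nightshade-packages.txt
# comm needs sorted input; -3 suppresses lines common to both files,
# leaving column 1 (flush left) = only on tools-login and
# column 2 (tab-indented) = only on nightshade.
comm -3 tools-login-packages.txt nightshade-packages.txt
```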
This is tracked at https://bugzilla.wikimedia.org/show_bug.cgi?id=54934;
I suggest continuing there.
Nemo
https://bugzilla.wikimedia.org/show_bug.cgi?id=54143
Is this happening only on dumps?
Nemo
Thanks Luis for mentioning Bad Behaviour: all the information about it
on mediawiki.org was wrong (it was described as an AbuseFilter-like
thing) so I had never looked into it before. You're the first person
mentioning success with it, but it's possible we only hear from unhappy
MediaWiki
Please raise the dumps project storage to 500 GB, ideally 1000: the
datasets handled here are often huge (think of the several TB of
pageviews data) and it's very inconvenient to split every task into
several passes. It's fine if /home partitions are reduced as compensation;
See
https://wikitech.wikimedia.org/w/index.php?title=Talk%3AMain_Page&diff=69007&oldid=10547
It would also be nice to have a general requests page, or a village pump
of sorts, on the wiki.
Nemo
Ryan Lane, 21/05/2013 22:27:
It's not that I'm opposed to it, but it's a massive waste of resources
to download from something in the network to a network fileserver, then
to upload it to archive.org.
Why is it necessary to write hundreds of GB to the fileserver before
they