Dan Andreescu dandree...@wikimedia.org wrote:
[...]
The point I'm trying to make is that the problems with Gerrit are not
problems with Gerrit, but actually problems with Git itself. If you can't
handle the basics of Gerrit, it's because you don't know how to use Git.
And at that point
On Sat, Mar 9, 2013 at 8:39 PM, Ryan Lane rlan...@gmail.com wrote:
On Sat, Mar 9, 2013 at 12:15 PM, Yuri Astrakhan yuriastrak...@gmail.com
wrote:
Should we re-start the "let's migrate to GitHub" discussion?
No.
To be fair, though I understand the arguments against using GitHub (though
On Sun, Mar 10, 2013 at 8:55 AM, Waldir Pimenta wal...@email.com wrote:
Are there strong reasons to dismiss it that weren't stated in that page?
Sorry, forgot the link:
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation#GitLab
On 10 March 2013 03:11, Rob Lanphier ro...@wikimedia.org wrote:
Hi folks,
Short version: This mail is fishing for feedback on proposed work on
Gerrit-Bugzilla integration to replace code review tags.
Interesting idea. I can imagine Bugzilla integration working fine for
filing fixmes - not so
The RSS extension is live on Foundationwiki and Mediawikiwiki. Are there
any security/performance reasons not to enable it on a Wikipedia?
Currently I am sitting in a dewiki GLAM workshop [1] and the idea came
up to embed GLAM related blog postings [2] into our GLAM page [3].
Raimond.
[1]
On 03/10/2013 12:19 AM, Victor Vasiliev wrote:
After recent discussion on this list I realized that this has been
in discussion for as long as four years, so I went WTF and decided to
Just Go Ahead and Fix It. As a result, I made a patch to MediaWiki
which allows it to output a recent changes feed
I created a bugzilla account but the process of going through the code and
fixing bugs seems cryptic. Any resources you can provide which can aid me
in understanding the process? A video which shows the process or some
documentation perhaps?
On Fri, Mar 1, 2013 at 5:06 AM, Quim Gil
While I understand that wikibase would be 'ideal' (allowing reads *and*
writes), I do not know if the WikiData people have the personnel bandwidth
to do that. The separate extension outlined in this RFC would be a bit
hackish, but still faaar more efficient than the current solution of having
to
On 2013-03-10 1:20 AM, Victor Vasiliev vasi...@gmail.com wrote:
Hi everybody,
For a long time it has been acknowledged that our current way of serving the
recent changes feed to users (IRC with formatting using funny control
codes) is one of the
Hi all,
I'd like to announce a recently created tool that might help the Wikimedia
technical community find stuff more easily. Sometimes relevant information
is buried in IRC chat logs, messages in any of several mailing lists, pages
in mediawiki.org, commit messages, etc. This tool (essentially
On Sun, Mar 10, 2013 at 12:06 AM, Dan Andreescu dandree...@wikimedia.org wrote:
I disagree and I have a very simple counter-point. Gerrit makes it
basically impossible to work in a git-flow style (
http://nvie.com/posts/a-successful-git-branching-model/
). From what I
understand, rebasing
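For readers who haven't seen the linked post: git-flow keeps long-lived branches and records feature work as explicit merge commits, which a rebase-only review flow flattens away. A toy sketch of the pattern (repo and branch names are illustrative):

```shell
#!/bin/sh
set -e
# Toy illustration of the git-flow style referenced above: a feature
# branch merged into a long-lived develop branch with --no-ff,
# preserving branch topology (the opposite of rebase-and-squash).
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email dev@example.org
git config user.name dev
git commit -q --allow-empty -m "initial"
git checkout -qb develop
git checkout -qb feature/login
echo login > login.txt
git add login.txt
git commit -qm "add login form"
git checkout -q develop
# --no-ff forces a merge commit even though a fast-forward is possible.
git merge -q --no-ff -m "Merge feature/login into develop" feature/login
git log --oneline --merges    # the merge commit is preserved in history
```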
On Sun, Mar 10, 2013 at 5:27 PM, Tejas Nikumbh tejasniku...@gmail.com wrote:
I created a bugzilla account but the process of going through the code and
fixing bugs seems cryptic. Any resources you can provide which can aid me
in understanding the process? A video which shows the process or some
On 2013-03-10 2:55 PM, Nischay Nahata nischay...@gmail.com wrote:
On Sun, Mar 10, 2013 at 5:27 PM, Tejas Nikumbh tejasniku...@gmail.com
wrote:
I created a bugzilla account but the process of going through the code and
fixing bugs seems cryptic. Any resources you can provide which can aid me
On Sunday, March 10, 2013 at 1:16 AM, Raimond Spekking wrote:
The RSS extension is live on Foundationwiki and Mediawikiwiki. Are there
any security/performance reasons not to enable it on a Wikipedia?
Hi Raimond!
Extension:RSS recently underwent a major revision which (among other things)
On 10/03/13 15:50, Waldir Pimenta wrote:
The motivation for the tool came from a post by Niklas [1], specifically
the section Coping with the proliferation of tools within your community.
In the comments section, Nemo announced his initiative to create a custom
google search to fit at least
On 03/10/2013 10:50 AM, Waldir Pimenta wrote:
Hi all,
I'd like to announce a recently created tool that might help the Wikimedia
technical community find stuff more easily. Sometimes relevant information
is buried in IRC chat logs, messages in any of several mailing lists, pages
in
Hello,
I'm in the process of re-working mediawiki-vagrant, which is a set of scripts
for provisioning a virtual machine for MediaWiki development. I'm struggling to
identify the best way of fetching mediawiki/core.
An ideal solution would have the following attributes:
- Fast.
- Includes .git
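One option that fits at least the two goals listed is a shallow clone: it is fast, still produces a working .git, and can be deepened later. A minimal sketch, with a throwaway local repository standing in for mediawiki/core so it is self-contained:

```shell
#!/bin/sh
set -e
# Sketch: a shallow clone is fast, ships a usable .git, and can be
# deepened into a full clone later. A throwaway local repository
# stands in for mediawiki/core here.
tmp=$(mktemp -d)
git init -q "$tmp/origin"
cd "$tmp/origin"
git config user.email dev@example.org
git config user.name dev
for i in 1 2 3; do
  echo "change $i" > file.txt
  git add file.txt
  git commit -qm "commit $i"
done
cd "$tmp"
# The file:// URL forces the normal transport, so --depth is honoured
# even for a local path.
git clone -q --depth 1 "file://$tmp/origin" core
cd core
git rev-list --count HEAD    # 1 -- only the tip commit was fetched
git fetch -q --unshallow     # deepen to full history when needed
git rev-list --count HEAD    # 3 -- full history now present
```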
Am 10.03.2013 21:48, schrieb Ori Livneh:
Raimond wrote:
The RSS extension is live on Foundationwiki and Mediawikiwiki. Are there
any security/performance reasons not to enable it on a Wikipedia?
Extension:RSS recently underwent a major revision which (among other things)
changed .. to
Hi Ori, I use your vagrant VM all the time. Thanks!!!
If we have a good vagrant setup, getting new devs on-board might be that
much easier.
I keep all the notes of the things I needed to do to your VM at
http://www.mediawiki.org/wiki/User:Yurik/Installing_Linux_virtual_box_under_Windows_7
Most
On Sun, Mar 10, 2013 at 8:53 PM, Platonides platoni...@gmail.com wrote:
I'm not convinced about [[en:MediaWiki_talk:*]] and
[[en:Template_talk:*]], they can bring quite a bit of noise (similarly
for [[en:Wikipedia:Village_pump_(technical)]]). I see how interesting
discussions could be
Hi,
I've been thinking about this for the last week or so because it's becoming
incredibly clear to me that core isn't scaling. It's already taking up over
4GB on the Gerrit box, and this is the primary reason core operations are
slow.
A couple of things I can think of to help the situation.
-
The git-archive command is supposed to fulfill this functionality, but it
seems it does not work anonymously, i.e., over HTTPS. I think it only works
with ssh:// and with git://.
Otherwise, GitHub's snapshots seem to be the only solution.
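For reference, this is what the git-archive route looks like where the transport does support it; a local path stands in for a git:// or ssh:// remote so the sketch is self-contained:

```shell
#!/bin/sh
set -e
# Sketch: stream a snapshot tarball with git-archive, no clone needed.
# A local path stands in for the git:// or ssh:// remote, since (as
# noted above) the anonymous HTTPS transport does not support this.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email dev@example.org
git config user.name dev
echo '<?php' > index.php
git add index.php
git commit -qm "initial commit"
mkdir "$tmp/snapshot"
# --remote runs git-upload-archive on the other end and streams a tar.
git archive --remote="$tmp/repo" --format=tar HEAD |
  tar -xf - -C "$tmp/snapshot"
ls "$tmp/snapshot"    # index.php, with no .git directory alongside it
```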
--
Tyler Romeo
Stevens Institute of Technology, Class
A shallow clone certainly shouldn't be as large as a normal one. Something's
borked.
On Sun, 10 Mar 2013 22:39:44 +0100, Chad innocentkil...@gmail.com wrote:
- We can repack core on manganese. This should provide a bit of relief,
but won't help long term. Core would have to be read-only for
On 10 March 2013 21:35, Waldir Pimenta wal...@email.com wrote:
Why log them publicly if we don't make them searchable? Either we're
committed to being open or we're not... having a public but hard-to-use
archive seems somewhat contradictory to me.
This is in fact the policy with mailing list
On 11/03/13 08:15, Ori Livneh wrote:
- Getting a snapshot from GitHub would probably work, but I am loathe to
depend on it.
We now depend on GitHub for ExtensionDistributor. I don't think it's
such a bad thing. External services can be a nasty trap when they have
the only copy of your data and
On 03/10/2013 06:30 AM, Kevin Israel wrote:
On 03/10/2013 12:19 AM, Victor Vasiliev wrote:
One thing you should consider is whether to escape non-ASCII
characters (characters above U+007F) or to encode them using UTF-8.
Whatever the JSON encoder we use does.
Python's json.dumps() escapes non-ASCII characters by default (ensure_ascii=True).
On 03/10/2013 06:03 PM, Bartosz Dziewoński wrote:
A shallow clone certainly shouldn't be as large as a normal one.
Something's borked.
--depth 0 is what's broken. --depth 1 works fine.
$ git clone --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git core-shallow
Cloning into 'core-shallow'...
On 10/03/13 22:39, Chad wrote:
Hi,
I've been thinking about this for the last week or so because it's becoming
incredibly clear to me that core isn't scaling. It's already taking up over
4GB on the Gerrit box, and this is the primary reason core operations are
slow.
4GB??
My not specially
On 03/10/2013 05:33 PM, Yuri Astrakhan wrote:
* install phpmyadmin sqlite
PHPMyAdmin also has major security issues. It isn't allowed on
Wikimedia Labs and probably shouldn't be used here. Why does SQLite
need to be installed exactly?
* DEBUG == that's a big one, setting it up for easy
On 03/10/2013 06:43 PM, Platonides wrote:
On 10/03/13 22:39, Chad wrote:
Hi,
I've been thinking about this for the last week or so because it's becoming
incredibly clear to me that core isn't scaling. It's already taking up over
4GB on the Gerrit box, and this is the primary reason core
On 03/10/2013 06:27 PM, Victor Vasiliev wrote:
On 03/10/2013 06:30 AM, Kevin Israel wrote:
On 03/10/2013 12:19 AM, Victor Vasiliev wrote:
One thing you should consider is whether to escape non-ASCII
characters (characters above U+007F) or to encode them using UTF-8.
Whatever the JSON
On 03/10/2013 06:03 PM, Bartosz Dziewoński wrote:
- We can rewrite history (git-filter-branch) to remove some mistakes that
exploded the repo size. Binaries later removed, things accidentally checked
into ./extensions, etc. This could potentially greatly reduce object sizes
and allow for
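Mechanically, such a rewrite might look like the sketch below, run here on a throwaway repository (the file name big.bin is illustrative, not an actual offender):

```shell
#!/bin/sh
set -e
# Sketch: drop an accidentally committed binary from all of history
# with git-filter-branch. File and commit names are illustrative.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email dev@example.org
git config user.name dev
dd if=/dev/zero of=big.bin bs=1024 count=64 2>/dev/null
git add big.bin
git commit -qm "accidentally commit a binary"
echo '<?php' > core.php
git add core.php
git commit -qm "a real change"
# Rewrite every commit, deleting big.bin from the index; --prune-empty
# drops commits that become empty once the file is gone.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch \
  --index-filter 'git rm --cached -q --ignore-unmatch big.bin' \
  --prune-empty -- --all
# filter-branch keeps backup refs under refs/original/; remove them,
# expire reflogs, and gc, or the old objects stay on disk.
git for-each-ref --format='%(refname)' refs/original |
  while read ref; do git update-ref -d "$ref"; done
git reflog expire --expire=now --all
git gc -q --prune=now --aggressive
git log --all --name-only --pretty=format:   # core.php only
```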
Answered Inline. Also, I apologize as I think my email was slightly
off-topic to Ori's question.
On Sun, Mar 10, 2013 at 6:57 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
PHPMyAdmin also has major security issues. It isn't allowed on
Wikimedia Labs and probably shouldn't be used here.
On Mon, 11 Mar 2013 00:11:59 +0100, Kevin Israel pleasest...@live.com wrote:
If "Whatever the JSON encoder we use does" means that one day, the
daemon starts sending UTF-8 encoded characters, it is quite possible
that existing clients will break because of previously unnoticed
encoding bugs. So I
I appreciate that someone is doing something, but this should have been
discussed more. I would like to highlight that our goal should NOT be to do
this in a way that is simplest for developers of mediawiki to
implement and simplest for devops to maintain and set up. Our goal
should be to make this feed
On Sat, 2013-03-09 at 17:11 -0800, Rob Lanphier wrote:
The solution that Chad and I discussed is an addition to the
Bugzilla-Gerrit plugin that Christian is already working on. The idea
would be that, for any given revision, there would be a "file bug about
this revision" link. Following that
On Sun, Mar 10, 2013 at 7:34 PM, Petr Bena benap...@gmail.com wrote:
I appreciate that someone is doing something, but this should have been
discussed more. I would like to highlight that our goal should NOT be to do
this in a way that is simplest for developers of mediawiki to
implement and most
If "Whatever the JSON encoder we use does" means that one day, the
daemon starts sending UTF-8 encoded characters, it is quite possible
that existing clients will break because of previously unnoticed
encoding bugs. So I would like to see some formal documentation of the
protocol.
Json
On 03/10/2013 07:17 PM, Yuri Astrakhan wrote:
Matthew, we are talking about a developer's virtual machine that has no
network connection to anything except the developer's machine itself, and
used purely for development.
As you said yourself, it's currently bridged. Doesn't that mean it is
in
(anonymous) wrote:
[...]
A couple of things I can think of to help the situation.
- We can repack core on manganese. This should provide a bit of relief,
but won't help long term. Core would have to be read-only for about an hour
or two.
[...]
Is this read-only to prevent the hiccup
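For reference, the repack itself boils down to one command. Sketched here on a throwaway repository; the window/depth values are illustrative, not necessarily what would be used on manganese:

```shell
#!/bin/sh
set -e
# Sketch: collapse all objects into one aggressively-deltified pack.
# Run on a throwaway repo here; on the server it would run in place.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email dev@example.org
git config user.name dev
for i in 1 2 3 4 5; do
  echo "revision $i" > page.txt
  git add page.txt
  git commit -qm "edit $i"
done
# -a: repack all objects, -d: delete redundant packs and loose objects,
# -f: recompute deltas from scratch. Window/depth values illustrative.
git repack -q -a -d -f --window=250 --depth=50
ls .git/objects/pack/*.pack | wc -l    # one consolidated pack
```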
On Sun, Mar 10, 2013 at 7:56 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
As you said yourself, it's currently bridged. Doesn't that mean it is
in fact accessible to the whole LAN, until that's changed?
Of course - that's why I put host-only as the first requirement, and was
the
On 03/10/2013 07:34 PM, Petr Bena wrote:
I appreciate that someone is doing something, but this should have been
discussed more.
Well, we can discuss this now. I don't like the discussions which end
up with "here's a design of our superpony which should have those 9000+
features", and then we have no
My main concern with the program in its current state is the lack of
sufficient design. I mean, both the Configuration and MessageRouter objects
are glorified dictionaries (or defaultdicts), and global variables are used
for the router and config.
Also, the config protocol is almost definitely a
On Sun, Mar 10, 2013 at 12:55 AM, Waldir Pimenta wal...@email.com wrote:
On Sat, Mar 9, 2013 at 8:39 PM, Ryan Lane rlan...@gmail.com wrote:
On Sat, Mar 9, 2013 at 12:15 PM, Yuri Astrakhan yuriastrak...@gmail.com
wrote:
Should we re-start the "let's migrate to GitHub" discussion?
On 03/10/2013 08:59 PM, Tyler Romeo wrote:
My main concern with the program in its current state is the lack of
sufficient design. I mean, both the Configuration and MessageRouter objects
are glorified dictionaries (or defaultdicts), and global variables are used
for the router and config.
I believe there is a recent regression in the math caching system. It
appears that newly added equations aren't being cached properly. The
images get generated on save, but it appears that MediaWiki is
forgetting that it has already made them and is attempting to
regenerate them on every save /
On 03/10/2013 07:53 PM, Brian Wolff wrote:
The JSON standard is pretty clear that any character can be escaped using a
\u UTF-16 code point, or you can just have things be UTF-8. If clients break
because they can't handle that, that is the client's fault. It's not a hard
requirement.
Just a note, the
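Both encodings mentioned are indeed valid JSON for the same string, which is easy to check (python3 -m json.tool is used here purely as a convenient validator):

```shell
#!/bin/sh
set -e
# Both forms below are valid JSON encodings of the same one-character
# string "é": an escaped UTF-16 code unit, and raw UTF-8 (octal
# escapes \303\251 are the UTF-8 bytes for U+00E9).
printf '"\\u00e9"\n'  > escaped.json
printf '"\303\251"\n' > utf8.json
# A conforming parser decodes both to the same value; json.tool
# re-serializes each with its default escaping, so the outputs match.
python3 -m json.tool escaped.json
python3 -m json.tool utf8.json
```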
On Sun, Mar 10, 2013 at 10:38 PM, Victor Vasiliev vasi...@gmail.com wrote:
Finally, other than WebSocket and the socket interface, the one
other subscription method we should have is some sort of HTTP hook call,
i.e., it sends an HTTP request to the subscriber. This allows
event-driven
On 2013-03-11 12:26 AM, Tyler Romeo tylerro...@gmail.com wrote:
On Sun, Mar 10, 2013 at 10:38 PM, Victor Vasiliev vasi...@gmail.com
wrote:
Finally, other than WebSocket and the socket interface, the one
other subscription method we should have is some sort of HTTP hook
call,
i.e., it
On Sun, Mar 10, 2013 at 11:53 PM, Brian Wolff bawo...@gmail.com wrote:
* If you forget to unsubscribe, we send you POST requests until the end of
eternity.
Have it cut off if it receives an invalid HTTP response.
* DoS vector: register the URL of someone you don't like. Register 100
variants
This is an improved version of the Weekly Report.
Thanks to Daniel Zahn and Nemo_bis for reviewing and feedback.
I've added some comments and explanations inline:
On Mon, 2013-03-11 at 03:00 +, reporter wrote:
Status changes this week
Reports changed/set to UNCONFIRMED: 49