It's a shame I lost my momentum developing Gareth before this and dropped
back into the mundane stuff I have to get done for work.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
On Wed, 11 Jul 2012 08:25:00 -0700, Rob Lanphier wrote:
Hi everyone,
As you know,
On 10/07/12 21:44, Jeroen De Dauw wrote:
> Hey,
>
> I got a unit test (added here: https://gerrit.wikimedia.org/r/#/c/14870/)
> causing some error which I can't figure out the cause of.
>
> The error is "Error: 1137 Can't reopen table: 'unittest_smw_ids'", full
> message here: http://dpaste.or
On 11/07/12 15:20, Siebrand Mazeland wrote:
> Could you please look into how that can be reduced?
It would need a force push. That requires a special permission in Gerrit,
though, since force pushing can be used to rewrite the whole history.
--
Antoine "hashar" Musso
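For readers of the archive, a minimal local sketch (throwaway repositories, hypothetical names — not the real Wikimedia setup) of the history rewrite a force push performs, which is why Gerrit gates it behind a special permission:

```shell
#!/bin/sh
# Sketch: a stand-in "central" repo, a clone, and a forced history rewrite.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q --bare central.git
git clone -q central.git work
cd work
git config user.name demo
git config user.email demo@example.org

git commit -q --allow-empty -m "v1"
git push -q origin HEAD

# After rewriting history locally, a plain push is rejected as a
# non-fast-forward update; --force overwrites the remote branch anyway:
git commit -q --amend --allow-empty -m "v1, rewritten"
git push -q --force origin HEAD
```

The old "v1" commit is gone from the central repository afterwards, which is exactly the behaviour a review system has to restrict.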
On 11/07/12 17:25, Rob Lanphier wrote:
> So, if you'd like to see us move off of Gerrit, now is your chance.
The same goes for those willing to stick to Gerrit :-D
--
Antoine "hashar" Musso
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Yes, that would be awesome. Let us talk, ping me or Lydia.
On Jul 11, 2012 3:22 PM, "Bináris" wrote:
> 2012/7/10 Denny Vrandečić
>
> >
> > * Step 2: deploy the Wikidata client extension on one language edition
> > of Wikipedia for testing (Q: How to select it?)
> >
> I suggest the Hungarian Wiki
2012/7/10 Denny Vrandečić
>
> * Step 2: deploy the Wikidata client extension on one language edition
> of Wikipedia for testing (Q: How to select it?)
>
I suggest the Hungarian Wikipedia as a medium-sized wiki with enthusiastic
contributors. You may find me and Tgr at Wikimania to speak about det
Hi,
Sorry for the late answer but I was not much on the internet during
the weekend (because I went to New York prior to going to DC for
Wikimania).
So I'm now at the Wikimania hackathon where I can talk with Niklas, my mentor.
I did all that I scheduled to do before the mid-term evaluation: the
>Don't underestimate how much readers love infoboxes.
>... Highly scannable info can be of great value to readers.
Agreed. I love Infoboxes and they are generally the first thing I read. I
will admit that the George Washington one is far too long though; at that
point it is no longer "Highly sc
On Wed, Jul 11, 2012 at 2:10 PM, Ryan Kaldari wrote:
> The problem with infoboxes is that they are inherently unencyclopedic.
> Infoboxes are for viewing data, not for giving a nuanced and comprehensive
> overview of a subject.
Don't underestimate how much readers love infoboxes. We did a mobile
Personally, I think the focus of this discussion on infoboxes is
short-sighted. My personal hope is that Wikidata will actually allow the
Wikipedias to use fewer infoboxes (and when they are used, for them to
be much smaller). This may sound counter-intuitive, but let me explain...
Right now,
Awesome Peter, good timing! See you soon :)
> On 11 July 2012 14:16, Andre Klapper wrote:
> > KDE Bugzilla uses "RESOLVED UPSTREAM" for such cases,
> > Mer Project uses "RESOLVED TRIAGEDUPSTREAM" for such cases.
I need to correct myself:
Mer Bugzilla uses TRIAGEDUPSTREAM as a *state*, not as a resolution.
Also, MeeGo Bugzilla uses "WAITIN
Hi everyone,
As you know, when we moved to Git, we decided we would retire our
home-grown "Code Review" extension for MediaWiki. Having collectively not
had a lot of experience with Git-based code review tools, we decided
to try Gerrit.
Moving to Git was a very deliberate decision that was discu
On Wed, Jul 11, 2012 at 10:51 AM, Brion Vibber wrote:
> This is pretty much the same as doing 'git clone' followed by 'git branch'
>
s/branch/checkout/
-- brion
On Wed, Jul 11, 2012 at 10:40 AM, Aran wrote:
> I'm just wondering how to clone an extension for a particular branch...
> e.g. using Subversion I could do this:
>
> svn co
>
> http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_18/extensions/CheckUser
>
> What's the equivalent git command to
Hi,
I'm just wondering how to clone an extension for a particular branch...
e.g. using Subversion I could do this:
svn co
http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_18/extensions/CheckUser
What's the equivalent git command to get that same version of the extension?
Thanks,
Aran
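As Brion notes in this thread, the git answer is a clone followed by a checkout. A minimal local sketch of the pattern, using a throwaway repository standing in for the real extension repo (the actual Gerrit clone URL is assumed, not shown here):

```shell
#!/bin/sh
# Sketch: a stand-in "remote" with a REL1_18 branch, then the git
# equivalent of `svn co .../branches/REL1_18/extensions/CheckUser`.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q remote-repo
git -C remote-repo -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "initial commit"
git -C remote-repo branch REL1_18

# Clone, then check out the release branch ("clone followed by checkout"):
git clone -q remote-repo CheckUser
git -C CheckUser checkout -q REL1_18

# Or ask clone for that branch directly:
git clone -q -b REL1_18 remote-repo CheckUser-direct
```

Either way the working copy ends up on REL1_18; the one-step form just saves the separate checkout.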
>I'm okay with state, but not at all with resolution code. From the
>reporter's perspective the issue is not fixed when it is reported
>upstream. For example, if our Gerrit is broken, the issue is not resolved
>until a fix is applied in our installation.
+1
I think that the state is the best way to go
On 11 July 2012 14:16, Andre Klapper wrote:
> On Mon, 2012-07-02 at 09:23 -0700, Rob Lanphier wrote:
>> I've dabbled with the idea of turning "upstream" into a state,
>> actually, since the state is often inconsistent (sometimes it stays
>> open, sometimes it's "resolved/later"), and it's not entir
On Wed, Jul 11, 2012 at 7:16 AM, Andre Klapper wrote:
> > Barring any changes like that, I'd prefer to keep the keyword, and ask
> > that the new Bug Wrangler help keep the upstream keyword up-to-date.
> > For issues that do have the keyword, it's handy shorthand that has
> > saved me some time pa
Hi Mark,
We're telling people who make multiple commits to the same repo to squash
them before submitting. Yesterday I got around 550 emails from multiple commits
by you.
Could you please look into how that can be reduced?
Thanks.
--
Siebrand Mazeland
M: +31 6 50 69 1239
Skype: siebrand
Op
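For the archives, one way to do the requested squashing, sketched on a throwaway repository (the usual interactive route is `git rebase -i`; a soft reset shown here gives the same result non-interactively):

```shell
#!/bin/sh
# Sketch: three separate commits collapsed into one before submitting.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q .
git config user.name demo
git config user.email demo@example.org
git commit -q --allow-empty -m "base"

for n in 1 2 3; do
    echo "change $n" >> notes.txt
    git add notes.txt
    git commit -q -m "change $n"
done

# Squash the last three commits into a single one (equivalent to
# marking the later two as "squash" in `git rebase -i HEAD~3`):
git reset --soft HEAD~3
git commit -q -m "one combined change"
```

The file contents are unchanged; only the history is collapsed, so pushing afterwards generates one review (and one notification) instead of three.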
Replies inline.
On Jul 11, 2012, at 6:23 AM, jmccl...@hypergrove.com wrote:
>
> When you say "Whether some Wikipedia's output is
> semantically correct is important, but (afaik) has *zero* relationship
> with Wikidata. And as such is not relevant here" then I feel compelled
> to point out that a
On Mon, 2012-07-02 at 09:23 -0700, Rob Lanphier wrote:
> I've dabbled with the idea of turning "upstream" into a state,
> actually, since the state is often inconsistent (sometimes it stays
> open, sometimes it's "resolved/later"), and it's not entirely clear
> what it should be. Another solution w
On Wed, Jul 11, 2012 at 06:32:21AM -0400, Denny Vrandečić wrote:
> Achim,
>
> MZ was trying to be helpful and constructive. The two of you, and many
...
Sorry, you were too fast -- or was I too slow? ;)
Cheers,
Achim
Hi
I think I should give your words, MZMcBride, a second chance, and not only
take back my response to your (probably well-intentioned) comments; I apologize.
My aim is to have on wikimedia/wikipedia (/wiki-whatever sounds appropriate)
1) a version-control environment (as we have for article-, talk-
Achim,
MZ was trying to be helpful and constructive. The two of you, and many
others here, have the same goal: providing more and more people with
more and more possibilities to collaboratively work on more and more
artefacts and types of artefacts that will be useful to a bigger and
bigger audience
Krinkle,
When you say "Whether some Wikipedia's output is
semantically correct is important, but (afaik) has *zero* relationship
with Wikidata. And as such is not relevant here" then I feel compelled
to point out that an ontology is most certainly envisioned -- wikidata
is implementing the SMW
On Wed, Jul 11, 2012 at 12:06 AM, wrote:
> No link to it in the footer of http://en.wikipedia.org/ etc. pages.
Should we have a link to every project of ours, and all the language
variants as well, in the footer?
Hi
On Tue, Jul 10, 2012 at 09:27:16PM -0400, MZMcBride wrote:
> Achim Flammenkamp wrote:
> > Yes, me. It is fine to edit -- no problems with it. YOU seem to have a
> > seriously personal problem/prejudgment of this kind of text. Did you ever
> > wonder why XML/SVG should be human readable?
>
> We
On Jul 10, 2012, at 10:29 PM, jmccl...@hypergrove.com wrote:
> In short it is either
>
> * no wikipedias
> can be considered part of the semantic web
>
> * or all wikipedias stand
> at the center of the semantic web
>
>
No. A conclusion like that seems to conflict with what wikidata
On 10/07/12 19:28, Daniel Kinzler wrote:
* We will split the squid cache by language, using a cookie that specifies the
user's language preference (for both logged in users and anons). The same URL is
used for all language versions of the page (this keeps purging simple). The
reasons are:
Uhh.
On 10/07/12 23:14, Daniel Kinzler wrote:
On 10.07.2012 13:58, Antoine Musso wrote:
Have you considered generating a PHP template and just caching that? Then
all hits would be served directly by the very simple PHP template?
Could you elaborate on that idea? I'm not sure I fully understand what y
On 10/07/12 19:28, Daniel Kinzler wrote:
* We will not use the parser cache for data pages (normal wikitext pages on the
wikidata site, e.g. talk pages, will use it). The reasons are:
a) memory for the parser cache is relatively scarce, multiplying the number of
cached objects by the number of la
On 10/07/12 23:58, MZMcBride wrote:
namespace with XML in the textarea is frankly just a terrible idea. XML is a
complete pain in the ass to edit by hand. Does anyone edit SVGs this way
unless forced to?
I do.