Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Mark (Markie)
AFAIK there are remote hands in Amsterdam that can be called upon to do things in
the centre as necessary, like for any other hosting customer, but the need is not
quite at the same level as Tampa due to the size, the servers there, etc. Seoul no
longer operates, so this is not an issue.

regards

mark

On Tue, Feb 24, 2009 at 2:55 PM, Gerard Meijssen
gerard.meijs...@gmail.com wrote:

 Hoi,
 Is there also a Rob in Amsterdam and Seoul?
 Thanks,
   GerardM

 2009/2/24 Aryeh Gregor
 simetrical+wikil...@gmail.com:
 

  On Tue, Feb 24, 2009 at 9:42 AM, Thomas Dalton thomas.dal...@gmail.com
  wrote:
   Is there anyone within minutes of the servers at all times? Aren't
   they at a remote data centre?
 
  Isn't Rob on-site?
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Front-end performance optimization

2009-02-25 Thread Sergey Chernyshev
On Tue, Feb 24, 2009 at 7:31 PM, Aryeh Gregor
simetrical+wikil...@gmail.com
 wrote:

 On Tue, Feb 24, 2009 at 7:17 PM, Sergey Chernyshev
 sergey.chernys...@gmail.com wrote:
  How do we go about doing this? Can it be tied into the Usability project (
  http://usability.wikimedia.org/)?

 It doesn't seem usability-related.


Actually, it is very much related to usability - performance is an important
factor in usability, which is why Google, Yahoo and Amazon have researched
how it affects usage and came up with some very interesting numbers that,
for them, translate into hard cash.

I'm preparing a presentation about web performance for the New York Web
Standards Meetup - it's not ready yet, but I came across a good presentation by
Nicole Sullivan from Yahoo! targeted at designers and UI experts:
http://www.techpresentations.org/Design_Fast_Websites

I understand why http://usability.wikimedia.org/ doesn't have front-end
performance as one of its goals, but I think it should at least be
mentioned to everyone working on the redesign of such a major site.


 If you have commit access, you could just start committing code (probably
 hidden behind
 disabled-by-default config variables to start with until it's tested
 and complete enough).  If you don't have commit access, you could ask for
 it.


Please don't give me this "you need it, you do it" attitude.

I have commit access and am going to work on this, but the motivation behind my
small projects is nothing compared to Wikipedia's, and my setups are much
smaller and more controllable, so I think I'll leave pitching for resources
for this idea to those who need it. That said, I'll be happy to talk about it in
person with Brion or anyone else who wants to do something about it.

BTW, are there any Wikipedia-related events in SF during the week of the Web 2.0
conference (March 30 - April 3)? I'll be in SF for the conference and would
be happy to come by.

Thank you,

Sergey


If you meant "could Wikimedia resources be allocated to this?", then
 Brion is the one to talk to.




-- 
Sergey Chernyshev
http://www.sergeychernyshev.com/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Marco Schuster
2009/2/25 John Doe phoenixoverr...@gmail.com:
 I'd recommend either 10m or 10% of
 the database, whichever is larger, for new dumps to screen out a majority of
 the deletions. What are your thoughts on this process, Brion (and the rest of
 the tech team)?
Another idea: if $revision is
deleted/oversighted/otherwise made invisible, then find out the block
ID for the dump so that only this specific block needs to be
re-created in the next dump run. Or, better: do not recreate the dump
block at all, but only remove the offending revision(s) from it. That should
save a lot of dump preparation time, IMO.
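
Roughly, in (hypothetical) code - this is only a sketch of the idea, assuming
per-block bz2'd XML files and an already-known set of suppressed revision IDs;
it is not the actual Wikimedia dump tooling:

import bz2
import xml.etree.ElementTree as ET

def strip_suppressed_revisions(block_in, block_out, suppressed_rev_ids):
    """Copy one dump block, dropping only the offending <revision> elements."""
    with bz2.open(block_in, "rb") as src:
        tree = ET.parse(src)                    # assumes one block fits in memory
    for page in tree.getroot().iter("page"):    # namespace-free XML assumed, for brevity
        for rev in list(page.findall("revision")):
            if int(rev.findtext("id")) in suppressed_rev_ids:
                page.remove(rev)                # everything else is reused verbatim
    with bz2.open(block_out, "wb") as dst:
        tree.write(dst, encoding="utf-8", xml_declaration=True)

# e.g. strip_suppressed_revisions("block-0042.xml.bz2",
#                                 "block-0042.new.xml.bz2", {123456, 123789})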

Marco

-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer (Managing Directors): Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Front-end performance optimization

2009-02-25 Thread Michael Dale
Sergey Chernyshev wrote:
 Yes, of course - I checked it out and that's why I quoted it in my original
 email.
 My brief overview made me feel that it wasn't enough.

 I just didn't want this to be only in the context of localization, as performance
 is more related to overall user experience than to multilingual support.

 I'll try to summarize what might be the goals of front-end performance
 optimizations.
   

Hmm ... we can use the same script grouping/serving engine for all
JavaScript, whether the script has localized messages or not. In a reply to
that thread I mentioned some details about the path towards MediaWiki-wide
deployment; it involved adding wiki-title support to the script
server, grouping all the style requests, and modifying the skins or the
header output script. Thanks for your summary of why front-end
performance is important - this will help prioritize that patch :)
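
To make the grouping idea concrete, here is a toy sketch in Python (the names,
messages and file layout are made up for illustration; the real script server
lives in MediaWiki and does not look like this):

import json
from pathlib import Path

SCRIPT_DIR = Path("js")        # made-up layout: one file per script
MESSAGES = {                   # made-up localized messages, keyed by language
    "en": {"save": "Save", "cancel": "Cancel"},
    "de": {"save": "Speichern", "cancel": "Abbrechen"},
}

def serve_script_group(script_names, lang="en"):
    """Return one concatenated payload for a whole group of scripts, with the
    localized messages they need prepended as a single JS object."""
    parts = ["var wgGroupedMessages = %s;" % json.dumps(MESSAGES.get(lang, MESSAGES["en"]))]
    for name in script_names:
        parts.append((SCRIPT_DIR / name).read_text(encoding="utf-8"))
    return "\n;\n".join(parts)  # one cacheable request instead of N round trips

# e.g. serve_script_group(["edit.js", "toolbar.js", "preview.js"], lang="de")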


 Yes, I took a look at the project and it seems that UI performance wasn't
 deemed important in this Usability initiative. This is not necessarily good,
 but I can understand why it concentrates on navigation, colors and editing
 UI.
   

The Usability initiative (like any software project) will have to take
performance issues into consideration. Having this script server in
place will benefit that effort as well.

--michael

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Dump processes seem to be dead

2009-02-25 Thread Platonides
Marco Schuster wrote:
 Another idea: if $revision is
 deleted/oversighted/otherwise made invisible, then find out the block
 ID for the dump so that only this specific block needs to be
 re-created in the next dump run. Or, better: do not recreate the dump
 block at all, but only remove the offending revision(s) from it. That should
 save a lot of dump preparation time, IMO.
 
 Marco

That's already done: new dumps reuse the content from the previous ones
when available (enwiki has a hard time with this).
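
Roughly, the reuse works like this (an illustrative sketch only, with made-up
helper names, not the real dump scripts):

def build_text_dump(revision_ids, previous_dump_texts, fetch_text_from_db):
    """previous_dump_texts: rev_id -> text parsed out of the last dump.
    fetch_text_from_db: only called for revisions the old dump does not have."""
    dump, reused, fetched = {}, 0, 0
    for rev_id in revision_ids:
        text = previous_dump_texts.get(rev_id)
        if text is None:              # new revision, or the old dump was unusable here
            text = fetch_text_from_db(rev_id)
            fetched += 1
        else:
            reused += 1
        dump[rev_id] = text
    return dump, reused, fetched      # for enwiki, keeping `fetched` small is the point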


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l