[Wikitech-l] GSOC 14: Proposal for providing support to new media file types (.x3d or collada) in commons

2014-03-20 Thread Umang Sharma
Hi all,
I am Umang Sharma from IIITH (International Institute of Information
Technology - Hyderabad), India, and am interested in working on one of the
projects proposed by the community, i.e. "New media types supported in
Commons", as a GSoC candidate. I have drafted a proposal for the same.

This project has been a long-standing community request and it would be
great if I were given the opportunity to work on this and make some
progress. I have planned a basic outline of how to approach the problem. I
have decided to provide a solution for either the X3D or the COLLADA file
format (both are used for representing computer graphics), and will work on
the other if time permits during my project. However, I would like feedback
on which file format is more in demand currently. Also, if anyone has
recommendations for efficient raster image generation, do tell. Please go
through my proposal and tell me how I can improve it and make it live up to
the expectations of the community.

Link : https://www.mediawiki.org/wiki/User:Umang13/Gsoc14

Regards,
Umang
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mapping our upstream projects

2014-03-20 Thread Quim Gil
Hi,

On Tuesday, March 18, 2014, Faidon Liambotis fai...@wikimedia.org wrote:

 I'm not sure what "embed in our architecture" or "embed in our
 processes" means, could you clarify that?


I have just edited the wiki page to be more precise:

Components
Projects that contribute to the architecture of MediaWiki, key extensions,
and applications. They are included in the downloadable packages of key
Wikimedia projects, or they are identified as required dependencies.

Tools
Projects that contribute to our development process. They are installed in
Wikimedia servers and they are used in the release process of key Wikimedia
projects.


I see for example that the page has a lot of the shiny stuff (e.g. Lua
 is there, but bash/PHP/Python/Ruby are not).


You could have said "I'm impressed by the number of projects listed by
14 unique editors in just a few hours!" ;)

PHP/Python/Ruby already appear multiple times in the Languages column,
showing their relevance to our project from this perspective. They can also
be listed as upstream projects, of course.

Moreover, a few random
 libraries are there but others that we take for granted are not (e.g.
 librsvg is there, but no one thought of gzip or bzip2; unihan vs. all of
 the dozens of fonts that we use, etc.). Not to mention all the hundreds
 of absolutely essential tools that we use for system maintenance that
 no one ever sees or cares about, from Linux to GNU sed, dpkg, apt etc.

 I think this needs to be clarified and/or scoped a bit better, including
 explaining the rationale & motivation behind doing all this work.


Good questions, helpful for editing that page further. Now it says:


Motivation

This is a first step to improve our relations with other communities, to
increase the contributions received, and our influence in the projects that
matter to us.

Our mid term goals include:

* identify the projects where we want to see significant development, to
the point of sending patches as well
* identify the communities where Wikimedia should be regularly active and
heard
* identify the people in the Wikimedia and upstream communities who know
each other and act as bridges
* identify organizations and events we should get in touch with and be part of
* get involved in bigger development efforts regularly, and become a regular
FOSS player


For what it's worth, a uniqued dpkg -l across the production
 infrastructure shows 3276 software packages and personally I'd have a
 very hard time filtering the list based on what fits in the above
 description.


While we don't yet have an algorithm to calculate the relevance of an
upstream project, we do have an opinion about the components where our
attention is more required than elsewhere, even if both are essential, based
on how reliable they are, whether we miss features or not, whether the
upstream maintainers are responsive or not...

While we have Bug 51555 ("librsvg seems unmaintained"), dpkg is probably
doing fine without our specific attention. So far we are relying on human
memory and perhaps arcane resources to know which components matter and
who knows more. 3276 upstream packages means that we need a curated list
telling us a bit more about the communities maintaining the packages that
we would like to improve or see improved.

Sorry for not making these points clearer before. I hope this helps improve
that page further.

Thank you for the quick feedback pointing in the right directions (as usual
with Faidon).


-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

Re: [Wikitech-l] GSOC-2014 proposal for project A system for reviewing funding requests

2014-03-20 Thread Karan Dev
gryllida,

After going over the details you provided, I made changes to my
proposal. Please have a look and provide the necessary feedback.

Link to project's topic:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_reviewing_funding_requests
Link to my proposal:
https://www.mediawiki.org/wiki/User:Karan10/Reviewing_funding_requests

Thanks,

(Karan Dev)


On Wed, Mar 19, 2014 at 4:56 PM, Gryllida gryll...@fastmail.fm wrote:

 Again the usual detail I share on this...
 https://meta.wikimedia.org/wiki/Grants:IdeaLab/Application_scoring_system - a
 formal idea page for it to evolve (IdeaLab is for all ideas, GSoC too,
 not just grants from WMF)
 https://meta.wikimedia.org/wiki/User:Gryllida/sandbox - a spec (linked
 there)

 On Sat, 15 Mar 2014, at 22:34, Karan Dev wrote:
  hi,
  I made my proposal for the project "A system for reviewing funding
  requests", to which I am willing to contribute.
  Here is my proposal link:
  https://www.mediawiki.org/wiki/Reviewing_funding_requests
 
  Link to my user page: https://www.mediawiki.org/wiki/User:Karan10
 
  Please review my proposal so that I can make it better.
 
  Thanks,
 
  (Karan Dev)

[Wikitech-l] GSoC 2014: project proposal for a system for reviewing funding request.

2014-03-20 Thread Rahul Mishra

Hi,


Please ignore my earlier mail; I have made some edits to my draft and am
mailing it again.

I am Rahul Mishra, a final-year undergraduate pursuing my B.Tech at Netaji
Subhash Engineering College, majoring in Computer Science & Engineering.

I am very much interested in the project "A system for reviewing funding
requests" and have proposed a draft titled "A system for reviewing funding
requests".

Please review my draft and give your valuable advice/suggestions, so that
I can further improve my proposal and make it better.

Link to my user page:
https://www.mediawiki.org/wiki/User:Rahulmishra22

Link to the project:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_reviewing_funding_requests

Link to the proposal:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_reviewing_funding_requests




Thank you,
Rahul Mishra.
Dept. of CSE,
NSEC.


Re: [Wikitech-l] GSoC 2014: project proposal for a system for reviewing funding request.

2014-03-20 Thread Rahul Mishra

On 03/20/2014 01:36 PM, Rahul Mishra wrote:

Hi,


Please ignore my earlier mail; I have made some edits to my draft and am
mailing it again.

I am Rahul Mishra, a final-year undergraduate pursuing my B.Tech at Netaji
Subhash Engineering College, majoring in Computer Science & Engineering.

I am very much interested in the project "A system for reviewing funding
requests" and have proposed a draft titled "A system for reviewing funding
requests".

Please review my draft and give your valuable advice/suggestions, so that
I can further improve my proposal and make it better.

Link to my user page:
https://www.mediawiki.org/wiki/User:Rahulmishra22

Link to the project:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_reviewing_funding_requests

Link to the proposal:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_reviewing_funding_requests




Thank you,
Rahul Mishra.
Dept. of CSE,
NSEC.



Just one edit to the above:

Link to the proposal:
https://www.mediawiki.org/wiki/A_System_for_reviewing_Funding_Requests




Re: [Wikitech-l] OAuth upload

2014-03-20 Thread Brad Jorsch (Anomie)
On Wed, Mar 19, 2014 at 3:20 PM, Magnus Manske
magnusman...@googlemail.com wrote:

 Hi Brad,

 I'm sure that's correct, but:

 * When I just sign the OAuth params (no content type, no POST fields), I
 get "The authorization headers in your request are not valid: Invalid
 signature"
 * When I then add the content-type to the header, I get ... the API help
 page, wrapped in the XML tag <error code="help" info=""
 xml:space="preserve"> (even though I am querying JSON...)
 xml:space=preserve (even though I am querying JSON...)

 Apologies if this gets too detailed for the mailing list, it's just very
 frustrating :-(


It's wikitech, code talk should be expected ;)

I've taken the example code at 
https://tools.wmflabs.org/oauth-hello-world/index.php?action=download and
changed it to use a multipart/form-data POST; this new version is at 
https://tools.wmflabs.org/oauth-hello-world/multipart-formdata.php?action=download.
You'll notice only two changes to the code.

The first (and the important one) is that the call to sign_request() from
doApiQuery() only passes the OAuth header fields, not the post fields.

The second is that the call to curl_setopt for CURLOPT_POSTFIELDS from
doApiQuery() passes the $post array directly instead of first converting it
to a string with http_build_query(). This is what makes curl (or PHP's
implementation of it) use multipart/form-data rather than
application/x-www-form-urlencoded.

Besides some config and doc changes, everything else remains the same.
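[Editor's note: as a minimal, language-neutral sketch of the signing rule Brad describes — sign only the OAuth protocol parameters, never the multipart body fields — here is an illustrative HMAC-SHA1 signer in Python, simplified from RFC 5849. Nonce/timestamp generation, query-string parameters, and the Authorization header assembly are omitted; this is not the hello-world tool's actual code.]

```python
import base64
import hashlib
import hmac
import urllib.parse


def sign_request(method, url, oauth_params, consumer_secret, token_secret):
    """HMAC-SHA1 signature over the OAuth protocol parameters only.

    For a multipart/form-data POST the upload fields are deliberately NOT
    part of the signature base string -- only the oauth_* params are.
    (A full RFC 5849 implementation would also merge in query-string
    parameters; skipped here for brevity.)
    """
    enc = lambda s: urllib.parse.quote(str(s), safe="~")  # RFC 3986 encoding
    pairs = sorted((enc(k), enc(v)) for k, v in oauth_params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join([method.upper(), enc(url), enc(param_str)])
    key = (enc(consumer_secret) + "&" + enc(token_secret)).encode()
    digest = hmac.new(key, base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()
```

The upload fields then travel only in the multipart body (in PHP, the `$post` array handed straight to CURLOPT_POSTFIELDS, as above), while only the oauth_* fields feed the signature.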

I hope this helps.

-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation

[Wikitech-l] GSoC - OPW Application

2014-03-20 Thread hiral parikh
Bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=46653
My wiki page: https://www.mediawiki.org/wiki/User:Parikh20

I am applying for GSoC 2014 and Outreach Program for Women with the
following Project:

Project title:
Generic, efficient Localisation Update service
Project link:
https://www.mediawiki.org/wiki/Extension:LocalisationUpdate/LUv2

Any suggestions would be very helpful.

Thanks and Regards,
Hiral Parikh
B.E. (Pursuing)- Information Technology,
L. D. College of Engineering,
Ahmedabad.

Re: [Wikitech-l] [GSoC 2014] UniversalLanguageSelector fonts for Chinese wikis

2014-03-20 Thread xiangquan xiao
Hi,
As the deadline is comming, I come to publicize myself again :)
I have just updated the core part of my proposal [1]. Hope to see any
advice or comments. Thanks a lot!

[1]
https://www.mediawiki.org/wiki/User:Xiaoxiangquan/UniversalLanguageSelector_Fonts_for_Chinese_wikis

*About Chinese Font Tailor*

In the ULS webfonts repository (ULS:/data/fontrepo/fonts), the Autonym font
contains all the characters needed for the ULS UI. It needs only tens of
Chinese characters, so it keeps them and kicks the other tens of thousands
out. That's how Chinese Font Tailor will work, but for general-purpose
text, not only for the UI.

*I think there are two approaches.*
- Aggressive update: whenever a page is updated, scan it to see what
characters are used and then generate the font for it.
Advantage: quick response when visiting
Disadvantage: it seems to affect the external logic

- Lazy update: whenever a page is visited and ULS is called, scan it to see
what characters are used and then generate the font for it. Cache the font
with a timestamp; then we can use it directly in the future, once we find it
up to date after comparing the font's and the page's timestamps.
Advantage: better cohesion
Disadvantage: may cause a noticeable delay on the first visit

I myself suggest the latter, as wiki pages tend to be "write once, read
many".

*Implementation*
- A script that runs on the server will tailor the font file for a specified
page called ABC. The font file will be named ABC_FONT-NAME_TIMESTAMP.ttf.
- Modify the webfonts JS (ULS:/resources/js/) to pack some parameters
needed, such as the page name.
- A PHP script serves the right font for the webfonts' call.
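[Editor's note: the lazy-update scheme above can be sketched in a few lines. This is an illustration only — `cache` and `tailor` are hypothetical stand-ins for the proposal's font cache and for the actual subsetting step (which could, for instance, use fontTools).]

```python
def get_tailored_font(page, font_name, page_mtime, cache, tailor):
    """Lazy update: re-tailor the font only when the cached copy is stale.

    `cache` maps "PAGE_FONT-NAME" -> (timestamp, font_bytes); `tailor` is a
    callback that subsets the font to the characters used on the page.
    """
    key = f"{page}_{font_name}"
    entry = cache.get(key)
    if entry is not None and entry[0] >= page_mtime:
        return entry[1]  # font is still up to date: serve it directly
    font = tailor(page, font_name)  # first visit, or page changed since
    cache[key] = (page_mtime, font)
    return font
```

With this shape, the aggressive variant is the same function called from the page-save hook instead of from the ULS request path.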



Best regards,

[Wikitech-l] [GSoC 2014] MediaWiki Project Proposal

2014-03-20 Thread Nikunj Gupta
Hi,

I am Nikunj Gupta (https://www.mediawiki.org/wiki/User:Mecyborg) from
The LNM Institute of Information Technology, India. I am interested in
working on the project "catalogue for MediaWiki extensions" as a GSoC 2014
candidate and have drafted a proposal for the project.



There are currently about 2000 extensions available on MediaWiki.org
(https://www.mediawiki.org/wiki/Extension_Matrix).
However, it is hard to identify and assess which extension fits a
particular need. Moreover, it is not clear which version of an extension
to take for a particular MediaWiki version. And if we need to find the most
popular or most frequently downloaded extensions, we have to go to a
third-party site like WikiApiary (https://wikiapiary.com), which currently
handles the extensions database.

There is a lot of scope for improvement and creative ideas. The current
plan is to implement a rating system on WikiApiary (http://wikiapiary.org)
and syndicate the data to MediaWiki.org.



*Please provide feedback and suggestions on my proposal. Thanks.*



Bugzilla: https://bugzilla.wikimedia.org/show_bug.cgi?id=46704

Proposal link: https://www.mediawiki.org/wiki/Mecyborg/GSoC2014

Talk page: https://www.mediawiki.org/wiki/Talk:Mecyborg/GSoC2014

Profile: https://www.mediawiki.org/wiki/User:Mecyborg

[Wikitech-l] GSoC 2014 proposal

2014-03-20 Thread Kunal Grover
Hi everyone,
I am interested in a GSoC 2014 project with the Wikimedia Foundation.
The project aims at solving a few critical bugs in the MediaWiki Translate
extension and making it more efficient and better. These are the bugs
included in the project; please tell me if you have any suggestions :)
1. Having the page title optionally selected for translation: in a few
cases the title might not be relevant, and it shouldn't be translated.
2. A bug in core: the requirement to set the content language while
creating a page.
3. The translated page isn't updated when a translation unit page is moved
or deleted.
4. A redesign of the extension's interface on pages and of the language bar.
5. Changes in Special:AggregateGroups: currently it is not possible to
change an aggregate group's description or name on
Special:AggregateGroups. Also, the page should have read-only output for
users with insufficient permissions.

Please check out the detailed explanation of these bugs and the planned
implementation.
I would really appreciate some feedback.

https://www.mediawiki.org/wiki/User:Kunalgrover05/GSoc_Proposal

Thank You.

Kunal Grover
II Year B.Tech
Department of Mechanical Engineering
IIT Madras, Chennai, India.


[Wikitech-l] GSoC Proposal System for reviewing funding requests

2014-03-20 Thread Prashant G
Hello Everyone,

I'm Prashant Gurumukhi, from Nagpur, India. I have made my proposal for the
project "A system for reviewing funding requests". I would like you to
review it, and please tell me if I missed anything. Here's a link to my
proposal:
https://www.mediawiki.org/wiki/User:ImPacific/GSoC_proposal_2014
User page: https://www.mediawiki.org/wiki/User:ImPacific

[Wikitech-l] Sharing mathematical and music notations across wikis

2014-03-20 Thread Eugene Zelenko
Hi!

I think it would be a good idea to introduce support for files in TeX and
ABC/LilyPond (Score extension) formats, so such files could be hosted
on Commons.

This would simplify the maintenance of formulas and music across projects,
as well as allow referring to mathematical and music notations from
Wikidata.

Eugene.


Re: [Wikitech-l] OAuth upload

2014-03-20 Thread Magnus Manske
YES! THANK YOU!

The removal of http_build_query was the missing, secret ingredient. All is
well now in my OAuth world!

Thanks again,
Magnus


On Thu, Mar 20, 2014 at 2:05 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:

 On Wed, Mar 19, 2014 at 3:20 PM, Magnus Manske
 magnusman...@googlemail.comwrote:

  Hi Brad,
 
  I'm sure that's correct, but:
 
  * When I just sign the OAuth params (no content type, no POST fields), I
  get "The authorization headers in your request are not valid: Invalid
  signature"
  * When I then add the content-type to the header, I get ... the API help
  page, wrapped in the XML tag <error code="help" info=""
  xml:space="preserve"> (even though I am querying JSON...)
 
  Apologies if this gets too detailed for the mailing list, it's just very
  frustrating :-(
 

 It's wikitech, code talk should be expected ;)

 I've taken the example code at 
 https://tools.wmflabs.org/oauth-hello-world/index.php?action=download and
 changed it to use a multipart/form-data POST; this new version is at 

 https://tools.wmflabs.org/oauth-hello-world/multipart-formdata.php?action=download
 .
 You'll notice only two changes to the code.

 The first (and the important one) is that the call to sign_request() from
 doApiQuery() only passes the OAuth header fields, not the post fields.

 The second is that the call to curl_setopt for CURLOPT_POSTFIELDS from
 doApiQuery() passes the $post array directly instead of first converting it
 to a string with http_build_query(). This is what makes curl (or PHP's
 implementation of it) use multipart/form-data rather than
 application/x-www-form-urlencoded.

 Besides some config and doc changes, everything else remains the same.

 I hope this helps.

 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation





[Wikitech-l] Separating skins from core MediaWiki (GSoC proposal)

2014-03-20 Thread Bartosz Dziewoński

I realize this has been discussed on this list rather recently (starting with 
Jon's CologneBlue question), and I realize some exploratory work has started 
(or at least was considered), but I'm submitting this anyway. I came up with 
the general idea first (really!), just haven't had time to write it down before.

Several people asked me to reply regarding the CologneBlue thread – consider 
this my response. :)

Comments (here or on the talk page) would be very welcome. I haven't gotten
anyone to formally commit to mentoring this project yet; hopefully that can be
sorted out in time.

https://www.mediawiki.org/wiki/User:Matma_Rex/Separating_skins_from_core_MediaWiki


Project synopsis:

MediaWiki core includes four skins, and allows site administrators to create and install 
additional ones. However, the process is less than pleasant, due to several related 
problems (lack of documentation, more than one correct way to make a skin 
work, directory layout that makes packaging and (un)installation difficult, core skins 
and MediaWiki itself being interdependent, and possibly others).

I intend to solve at least two of the aforementioned issues by devising and 
documenting a saner directory layout for skins (and applying it to the four core 
ones) and then carefully disentangling them from MediaWiki code, removing 
cross-dependencies and making it possible for non-core skins to have the same level
of control over all aspects of the look & feel as core ones currently have. This
would make the lives of both skin creators and site administrators wishing to use a
non-default skin a lot easier.
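[Editor's note: purely as an illustration of what "a saner directory layout" could mean — these file names are hypothetical, not the proposal's actual layout — a self-contained skin might look like:]

```
skins/
  CologneBlue/
    CologneBlue.php          # setup file registering the skin and its modules
    SkinCologneBlue.php      # the Skin/SkinTemplate subclass
    CologneBlue.i18n.php     # interface messages
    resources/               # all CSS/JS the skin ships, via ResourceLoader
      screen.css
      print.css
```

Under such a layout, installing or removing a skin would mean adding or deleting one directory (plus one include line), much like extensions today.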

If everything goes well, the process would be culminated with moving the core 
skins out of core, to separate git repositories. This would require 
coordination with MediaWiki release managers (to have them shipped in the 
release tarballs the way certain extensions are shipped now) and Wikimedia 
Foundation Operations team members (to ensure the deployment of the new system 
on Wikimedia wikis goes smoothly), so it cannot be made a part of my core 
proposal.

--
Matma Rex


[Wikitech-l] [GSOC] A system for reviewing funding requests Details of project

2014-03-20 Thread Kushal Khandelwal
Hi Community

I am Kushal Khandelwal from India, currently pursuing my undergraduate
degree at BITS Pilani, K K Birla Goa Campus.

I am applying for the GSoC project "A system for reviewing funding requests".
Details of the project:
https://meta.wikimedia.org/wiki/Grants:IdeaLab/Application_scoring_system

I have an initial draft proposal ready at:
https://www.mediawiki.org/wiki/User:Kushal124/A_system_for_reviewing_funding_requests_GSOC

I kindly request the community to provide feedback on my proposal.

My user page: https://www.mediawiki.org/wiki/User:Kushal124

I also worked on an initial microtask by providing a fix for Bug 62464
(https://bugzilla.wikimedia.org/show_bug.cgi?id=62464).

Thank you, Bryan Davis (bd808) and mutante, for helping me out with my first
bug fix for the MediaWiki community.

Thanks and Regards
-- 

Kushal Khandelwal


Birla Institute of Technology & Science, Pilani

K K Birla Goa Campus

[Wikitech-l] GSoC proposal: Wikimedia Identities Editor

2014-03-20 Thread Erick Guan
Hi,

I've published a proposal [1] for the Wikimedia Identities Editor. The mentors
will be Alvaro del Castillo and Daniel Izquierdo. MediaWiki Community Metrics
is a Wikimedia project whose goal is to describe how the MediaWiki /
Wikimedia tech community is doing.
But it lacks the management tools to allow community members to
manage their profiles and update their info in the web interface. We should
also provide OAuth login as an option and provide locale settings.

Please help me review the proposal. Feedback is welcome.


[1]: https://www.mediawiki.org/wiki/User:Fantasticfears/GSoC_2014

-- 
Regards,

Erick Guan/管啸 (fantasticfears)

Re: [Wikitech-l] Why is Cologne Blue still in core?

2014-03-20 Thread Jon Robson
This is done. Please review! Thanks to Isarra for getting the new
repositories set up!

https://gerrit.wikimedia.org/r/118345
https://gerrit.wikimedia.org/r/119885
https://gerrit.wikimedia.org/r/119884



On Wed, Mar 12, 2014 at 2:13 PM, Jon Robson jdlrob...@gmail.com wrote:

 Help needed!

 Okay so I made a start to this. I'm probably not going to get time to
 work on this during this week, but if anyone wants to pick this up and
 accelerate it through to completion be my guest. I will love you
 forever. Otherwise I'll continue the work next week.

 https://gerrit.wikimedia.org/r/118345
 https://gerrit.wikimedia.org/r/118347

 On Wed, Mar 12, 2014 at 12:23 PM, Trevor Parscal tpars...@wikimedia.org
 wrote:
  I'm going to start working on some RL modifications to make it possible
 for
  skins outside of core to add skinStyles to other modules, which will help
  with making non-core skins equally capable.
 
  - Trevor
 
 
  On Wed, Mar 12, 2014 at 12:00 PM, Jon Robson jdlrob...@gmail.com
 wrote:
 
  Yes this is an orthogonal conversation. If it's that easy for a core
  change to break a skin outside core, then there are lots of
  fundamentally wrong things with our skin system, one being the fact
  that modules added with OutputPage get added to all skins even if they
  might not be compatible with them. If we want to talk about this I'd
  encourage you to start a new thread.
 
  On Wed, Mar 12, 2014 at 11:57 AM, Matthew Flaschen
  mflasc...@wikimedia.org wrote:
   On 03/11/2014 05:21 PM, Jon Robson wrote:
  
    If people forget they exist I would say that equates to "no one cares
    about them and no one maintains them".
  
  
   Isarra was referring to skins outside of core.
  
   There's a difference between core developers forgetting non-core skins
   exist, and the developers of non-core skins forgetting.
  
   If we had a proper skin API, people wouldn't explicitly need to think
  about
   Foo non-core skin, but they would need to think about managing the
  changes
   to the skin API (the same way we manage other API changes).
  
   Matt Flaschen
  
  
  
 
 
 
  --
  Jon Robson
  * http://jonrobson.me.uk
  * https://www.facebook.com/jonrobson
  * @rakugojon
 



 --
 Jon Robson
 * http://jonrobson.me.uk
 * https://www.facebook.com/jonrobson
 * @rakugojon




-- 
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

[Wikitech-l] GSoC proposal Frontend for Vector skin CSS customizations (Ioannis Protonotarios)

2014-03-20 Thread Ιωάννης Πρωτονοτάριος
Hi all!
This is my proposal announcement for GSoC 2014:

Abstract

Appearance is most important for any site. Unfortunately, while MediaWiki is
very advanced in functionality, it lacks flexibility in terms of appearance.
The Vector skin is great, no doubt about it, but there seems to be no real
alternative: almost all wikis in the world look like Wikipedia. Of course,
one can apply one's own CSS and customize the site's layout, or even create
one's own skin in PHP, but both solutions demand a lot of knowledge and
effort, and in the end nobody really does so, except maybe some big
commercial sites.

My proposal is the creation of a frontend that will help users with little
or no experience at all to easily produce all the CSS needed to change their
wiki's layout.
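[Editor's note: to make the idea concrete — the frontend's job is essentially turning a handful of user choices into a MediaWiki:Vector.css-style stylesheet. A tiny, hypothetical sketch (selectors and defaults are illustrative, not the proposal's actual output):]

```python
def vector_css(background="#ffffff", link_color="#0645ad",
               heading_font="sans-serif"):
    """Render a small Vector-customisation stylesheet from user settings.

    A real frontend would expose many more of the skin's rules; this only
    shows the settings-to-CSS mapping.
    """
    return "\n".join([
        "/* generated Vector customisation */",
        f"body {{ background: {background}; }}",
        f"#content h1, #content h2 {{ font-family: {heading_font}; }}",
        f"a {{ color: {link_color}; }}",
    ])
```

The generated text would then be pasted into (or saved as) the wiki's site-wide CSS page by the user.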

Full proposal wiki page:
https://www.mediawiki.org/wiki/User:Protnet/Frontend_for_Vector_skin_CSS_customizations

It's still a draft but you can get most of the idea. I learnt about GSoC
only a few days ago, by chance, so I didn't have much time to prepare it
properly. Any last-minute help would be most appreciated! And most
importantly, I still haven't found any possible mentors. Please help me with
this!

With regards,
Ioannis Protonotarios
Electrical & Computer Engineer, MSc, now studying education
Athens, Greece

(P.S. I still have the fantasy that I will manage to apply with a second
proposal as well.)

