Re: [Wikitech-l] Module storage is coming

2013-11-08 Thread Tyler Romeo
On Fri, Nov 8, 2013 at 9:45 AM, Antoine Musso  wrote:

> So what is a cache manifest? :D


tl;dr - Cache manifests are made for offline web apps, and Wikipedia is not
an offline web app.

Cache manifests are a new HTML5 feature made specifically for single-page
(or, at the very least, few-paged) offline web apps. You add a special
attribute to the <html> tag of every page in your application. The value of
the attribute is the URL of a manifest file (it has its own MIME type and
everything). This file specifies which resources in your application should
be explicitly cached.

The difference between cache manifests and normal browser caching is that
the browser will never update the cache unless the manifest changes. In
other words, if it has an offline copy, it will always serve it unless the
manifest file changes.

This is useful in cases where you have a web app that is entirely
front-end, i.e., once you download the HTML files you don't need to do
anything else (think something along the lines of a single player game).
That way the files will be permanently cached and the user can view the
website even if the site itself is offline. Most apps in the Chrome Web
Store will use this technique to have their web app stored.

There are multiple reasons it is not used here:

1) Wikipedia is not a single-page app; it is many, many pages, and every
page of the app usually links to the manifest. It would be strange to have
every Wikipedia article a user visits permanently stored in the user's
browser. (Before somebody says "well, just don't put articles in the
manifest": any page that has the manifest attribute is implicitly cached,
regardless of whether it's listed in the manifest.)

2) It doesn't solve the actual problem. The problem here is the combining
of all JS files into one. We combine the files using RL in order to reduce
round-trip time for first-time visitors, but at the same time it increases
what returning visitors have to re-download when updates are made. Cache
manifests do not get around the round-trip time issue, so they don't let us
split up the JS files. And with the JS files still combined, cache manifests
don't have a way to partially update modules. So in the end they are
completely useless here.

See the following links for more information:
https://en.wikipedia.org/wiki/The_cache_manifest_in_HTML5
http://www.html5rocks.com/en/tutorials/appcache/beginner/

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Module storage is coming

2013-11-08 Thread Tyler Romeo
On Fri, Nov 8, 2013 at 11:33 AM, Jon Robson  wrote:

> .. 3) it is a nightmare
> http://alistapart.com/article/application-cache-is-a-douchebag is a good
> read to anyone who is curious to the why.
>

I wouldn't go so far as to say it is a "nightmare". The article you linked
blows things way out of proportion. In reality, cache manifests are just one
of those "cool new features" that people like to use even when they're not a
proper solution for their application.

The ApplicationCache behaves in a fairly well-defined manner, as I explained
above. It's just an additional cache on top of normal HTTP caching that
permanently caches files based on a manifest. From that article, the only
true "gotcha" I would mention is #5, which explains that files not part of
the cache manifest will actually not be loaded, even if you're online. That
aspect is a little unintuitive, but once you know about it, it's not really
a problem. Even more amusing is the second part of the article, which
attempts to use ApplicationCache for caching Wikipedia, which, like I just
said, is exactly *not* what ApplicationCache was meant for.
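
(For what it's worth, gotcha #5 has a standard escape hatch: a NETWORK
section with a wildcard, which tells the browser that resources not listed
in the manifest may still be fetched while online. A minimal sketch:

    CACHE MANIFEST
    app.js

    NETWORK:
    *

That doesn't change my conclusion, but it shows the behavior is by design,
not a bug.)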

In the end I can understand the reason cache manifests exist: for
explicitly offline applications. If your application is not an offline
application, then you should not be using cache manifests in the first
place, because that's not what it's meant for.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Re-implementing PDF support

2013-11-13 Thread Tyler Romeo
On Wed, Nov 13, 2013 at 12:45 AM, Erik Moeller  wrote:

> Most likely, we'll end up using Parsoid's HTML5 output, transform it
> to add required bits like licensing info and prettify it, and then
> render it to PDF via phantomjs, but we're still looking at various
> rendering options.
>

I don't have anything against this, but what's the reasoning? You now have
to parse the wikitext into HTML5 and then render the HTML5 into PDF. I'm
guessing you've found some library that automatically "prints" HTML5, which
would make sense since browsers do that already, but I'm just curious.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Re-implementing PDF support

2013-11-13 Thread Tyler Romeo
On Wed, Nov 13, 2013 at 11:16 AM, Brad Jorsch (Anomie) <
bjor...@wikimedia.org> wrote:

> Yes, phantomjs, as mentioned in the original message.
>
> To be more specific, phantomjs is basically WebKit without a GUI, so
> the output would be roughly equivalent to opening the page in Chrome
> or Safari and printing to a PDF. Future plans include using bookjs or
> the like to improve the rendering.
>

Aha awesome. Thanks for explaining.
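
For anyone else curious, the PhantomJS side of that is only a few lines (a
sketch of its documented webpage API; the URL and paper settings are just
examples, not the actual plan):

    var page = require('webpage').create();
    page.paperSize = { format: 'A4', margin: '1cm' };
    page.open('https://en.wikipedia.org/wiki/PhantomJS', function (status) {
        if (status === 'success') {
            page.render('article.pdf'); // output format inferred from extension
        }
        phantom.exit();
    });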

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Facebook Open Academy

2013-11-13 Thread Tyler Romeo
MediaWiki participates in a number of student competitions and programs as
an open source mentor (such as GSoC, Code-In, etc.). Today I ran into
another one: Facebook's Open Academy Program.

https://www.facebook.com/OpenAcademyProgram

I'm not sure how we would get involved in this program, but I'm sure people
would agree it might be a good thing to become a mentor organization and
have students contribute to MediaWiki as part of a college credit program.

Any thoughts?
*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-14 Thread Tyler Romeo
On Mon, Nov 11, 2013 at 1:51 AM, Tim Starling wrote:

> My concern with this kind of maintainer model is that RFC review would
> tend to be narrower -- a consensus of members of a single WMF team
> rather than a consensus of all relevant experts.
>

I'd also like to point out that we can still implement a maintainer system
without changing the RFC process. There's no requirement that RFC review
and maintainers be the same people. This thread is mainly a discussion
about "Architects" (or the concept thereof), and how we might want to
change the MediaWiki code review structure.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Comet

2013-11-14 Thread Tyler Romeo
On Thu, Nov 14, 2013 at 6:12 PM, Lee Worden  wrote:

> Has anyone messed with this?  Any code I should crib from, or advice or
> cautionary tales?  Also, if it develops into something useful, I could
> split it out for others to use.


I have not messed with it personally, but I think it is a good idea. You
should also know that HTML5 has standardized the Comet model into
server-sent events (SSE). [1] Mozilla also provides a nice tutorial on how
to use it. [2] However, one big catch is that this is not currently
implemented in Internet Explorer or mobile browsers, [3] so you'd have to
provide your own pure-JavaScript fallback for IE support.
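
The client side, at least, is pleasantly small. A minimal sketch, assuming
a hypothetical /events endpoint that responds with Content-Type:
text/event-stream:

    var source = new EventSource('/events');
    source.onmessage = function (event) {
        console.log('got: ' + event.data); // the "data:" field of the message
    };
    source.onerror = function () {
        // fires on connection loss; the browser reconnects automatically
    };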

WebSocket, as others mentioned, is another approach you could use. However,
WebSockets are meant for full-duplex communication, meaning the client is
also talking back to the server, which may or may not be what you want.
Also, using WebSockets means the framing of what is sent over the socket and
what it means is left to you to design, rather than being standardized. Not
to mention that you have to implement WebSockets in PHP or find a reliable
library that will do it for you. And even then, WebSockets are only
supported in IE 10 and later, so you're still a bit screwed in terms of
backwards compatibility.

[1] http://www.w3.org/TR/eventsource/
[2]
https://developer.mozilla.org/en-US/docs/Server-sent_events/Using_server-sent_events
[3] http://caniuse.com/#feat=eventsource

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Comet

2013-11-14 Thread Tyler Romeo
On Thu, Nov 14, 2013 at 10:12 PM, Daniel Friesen  wrote:

> Basically in the end your best bet is probably to just use something
> other than PHP for this.
> The native implementations for both Socket.IO's and SockJS' servers are
> in Node.JS.
> So that's probably the best bet for doing this.
> You can communicate with something in PHP running MW stuff from the
> Node.JS server using another protocol.
> Such as STOMP, ZeroMQ, Thrift, D-Bus, etc... or even simply HTTP calls
> to the webserver/API or execution of PHP processes from the Node.JS server.
>

Agreed on this. PHP isn't meant for continuous processing, regardless of
whether you use SSE or WebSockets. It should also be noted that SockJS has
Tornado and Twisted implementations, if you want to use Python. And, of
course, if your extension needs to be scalable, there is also vert.x, which
works with Java.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki performance analysis

2013-11-15 Thread Tyler Romeo
On Fri, Nov 15, 2013 at 3:20 AM, Yury Katkov  wrote:

> Just don't tell me that the 6th most popular website on Earth don't do
> any load testing! Maybe I don't understand the process and you test
> the software in a completely different way?
>

Isn't it always best just to go with the flow? ;)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Tyler Romeo
On Mon, Nov 18, 2013 at 11:21 AM, Risker  wrote:

> Perhaps more importantly, I don't see any actual argument for *not* using
> nofollow.  We're not here to drive pagerank for other websites, and our
> doing so can be harmful to those sites, or to the article subject.
>

Wikipedia's purpose may not be to drive PageRank, but nonetheless I think
the argument for not using nofollow is pretty clear: why would Wikipedia
want to purposely make search engine results less useful? The question here
is whether spammers are smart enough to get around us and boost their
PageRank artificially, which seems to be the case.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Facebook Open Academy

2013-11-19 Thread Tyler Romeo
On Tue, Nov 19, 2013 at 7:14 PM, Quim Gil  wrote:

> I am having a call with them on Friday morning PST and I will ask them
> these questions. Let me know if you have further questions or you want
> to join the chat.
>

I'm sure you've thought of most of these, but some obvious questions:

   - Would the students be using our existing code review infrastructure or
   is something Facebook specific required?
   - Would Facebook be the mentors for the program, with MediaWiki simply
   providing basic support? Or maybe each student would have a MediaWiki
   mentor as well as somebody from Facebook?
   - What is the syllabus for the class? What are the requirements for
   passing and getting college credit?
   - Do students work on multiple open source projects during the class? Or
   would they devote most of their effort to one project, and only contribute
   to others as an optional thing?
   - The document mentions teams. Are students put together in teams to
   work on stuff? If so how large are the teams, and how does this interact
   with the previous question?

This is literally everything I could think of, so if a question seems
inappropriate feel free to leave it out. Thanks in advance.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Tyler Romeo
Hey everybody,

tl;dr - How do we add 3rd party libs to core: composer, git submodules, or
copying the code?

So I have a question to discuss concerning MW core that I was hoping to get
some feedback on: what is our policy on including third-party libraries in
core?

To clarify, by policy I don't mean what factors do we take into account
when deciding to include a library (although feel free to weigh in on that
if you want to say something), but rather how one would go about doing it.

Here are the possibilities:
1) Use Composer to install dependencies
2) Use git submodules to store a reference to the repository
3) Copy the code and add a note somewhere of where it came from
(If I am missing an option, please enlighten me.)

My opinion on the matter is that option 1 is probably the best, primarily
because Composer was designed specifically for this purpose, and it is
widely used and is unlikely to randomly disappear in the near future. Also,
it makes the incorporation of these libraries trivial, since the autoloader
will be automatically registered using Composer. However, the method is not
without fault. A recent patch to core actually removed our composer.json
file, in hopes of allowing MediaWiki sysadmins to make their own custom
composer.json file so that extensions could be installed that way. Which is
more important: better maintenance of core dependencies, or allowing easier
extension installation? I don't know; that's for us to decide. I'm a bit
conflicted on the matter because I really do want to make extension
installation and management easier, but at the same time making sure the
core itself is easy to use should probably be a higher priority.
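To make the trade-off concrete: with Composer, pulling in a library is a
one-stanza addition to composer.json (a sketch; the package name and version
constraint are only an example), after which "composer install" fetches the
code and generates the autoloader:

    {
        "require": {
            "guzzle/guzzle": "~3.7"
        }
    }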

The next option is similar to Composer in that you keep a reference to some
external code that is downloaded when the user tells it to. However, it
differs from Composer in a number of ways: 1) when packaging tarballs, the
code has to be bundled anyway since submodules are git-specific, and 2) we
have to manage the autoloader manually. Not too bad. If we decide the
Composer option is not viable, I think this would be a good alternative.
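For comparison, the submodule route would look roughly like this (the
repository URL and path are hypothetical):

    git submodule add https://example.org/some-library.git vendor/some-library
    git submodule update --init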

I don't like the final option at all, but it seems to be our current
approach. It's basically the same thing as git submodules except rather
than having a clear reference to where the code came from and where we can
update it, we have to add a README or something explaining it.

Also, just to clarify, this is not an out-of-the-blue request for comment.
I am currently considering whether we might want to replace our
HttpFunctions file with the third-party Guzzle library, since the latter is
very stable, much more functional, and a lot easier to use. However, that is
out of scope for this discussion, so if you have an opinion on whether doing
so is a good/bad idea, please start another thread.
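
(Purely to illustrate "a lot easier to use", here is a sketch against
Guzzle 3's documented API; the URL and endpoint are just examples:

    $client = new Guzzle\Http\Client('https://www.mediawiki.org');
    $response = $client->get('/w/api.php?action=query&meta=siteinfo&format=json')
        ->send();
    $data = json_decode($response->getBody(true), true);

Again, whether to actually do this belongs in another thread.)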

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Tyler Romeo
On Tue, Nov 26, 2013 at 5:28 PM, Antoine Musso  wrote:

> You could start an RFC to identify classes that could be replaced by
> better third-party libraries.  I don't mind.
>
> Symfony (a French PHP framework which is really Spring for PHP) has a
> bunch of reusable components:
>
>  http://symfony.com/components
>
> Among them:
>   Console : could replace a bunch of our Maintenance classes
>   HttpFoundation : what you said, HTTP on rails
>   Routing : do we have a router?
>

Done.
https://www.mediawiki.org/wiki/Requests_for_comment/Third-party_components

I'll continue the search for more third-party components, but right now
those listed seem like the main candidates (although even some of those
listed would be really difficult to do and might not happen).

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Structured logging

2013-12-03 Thread Tyler Romeo
The RFC on third-party components may interest you:
https://www.mediawiki.org/wiki/Requests_for_comment/Third-party_components

If we use the Monolog library, which is used in Symfony and others, we can
avoid having to re-implement an entire logging framework.
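
The basic pattern, as a sketch of Monolog's documented usage (the channel
name, log path, and context variables here are made up):

    use Monolog\Logger;
    use Monolog\Handler\StreamHandler;

    $log = new Logger('runtime');
    $log->pushHandler(new StreamHandler('/var/log/mediawiki.log', Logger::WARNING));
    $log->warning('Slow query', array('sql' => $sql, 'seconds' => $elapsed));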

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Tue, Dec 3, 2013 at 8:32 PM, Bryan Davis  wrote:

> I have just posted a new draft RFC to alter the wfErrorLog() family of
> functions to make logged events carry more data and have a common
> output format [0].
>
> This RFC was started as a part of the current DevOps related work [1]
> being done by the Platform Core team. Ori, Aaron and I would like to
> get feedback from the community on this early stage proposal before
> continuing to work towards a prototype implementation.
>
> We feel that implementing a common and information rich log message
> output standard will be a helpful step towards removing some of the
> mystery from the process of monitoring an active wiki. The current
> variety of log outputs slows down the process of creating log analysis
> tools. The information provided by a given log message is also largely
> in the hands of each developer and varies widely not only from
> component to component but sometimes from commit to commit. The final
> proposal will describe a common output standard that provides a means
> to attach common data to all log messages while still allowing each
> module to add unique data that is needed to diagnose problems at
> runtime and at scale.
>
> [0]:
> https://www.mediawiki.org/wiki/Requests_for_comment/Structured_logging
> [1]: https://www.mediawiki.org/wiki/DevOps_Sprint_2013
>
> Bryan
> --
> Bryan Davis  Wikimedia Foundation
> [[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID
> irc: bd808  v: 415.839.6885 x6855
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 1.22.0rc3 ready for download

2013-12-05 Thread Tyler Romeo
On Tue, Dec 3, 2013 at 4:39 PM, John  wrote:

> VE is a BAD idea, its full of holes and bugs


Those are two separate concepts. Just because something has bugs does not
make the software itself a bad idea.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 1.22.0rc3 ready for download

2013-12-06 Thread Tyler Romeo
On Fri, Dec 6, 2013 at 3:23 AM, Martijn Hoekstra
wrote:

> In my estimation David is pretty capable of making an informed decision on
> the merits of deploying VE to his wikis.
>

I wouldn't make such assumptions without context. Deploying beta software
in a security-sensitive environment is never a good idea. It all depends on
what type of wikis are being run and what is being stored on them.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-08 Thread Tyler Romeo
On Sat, Dec 7, 2013 at 8:09 AM, Jeroen De Dauw wrote:

> Present situation:
>
> * By default nothing is installed
> * Users can choose whatever they want to add, _including_ PHPUnit
>

This overlooks the point that is the topic of this thread: you cannot
currently add dependencies for the MediaWiki core itself because there is
no composer.json file to add them to. If I wanted to add a third-party
library as a dependency to MediaWiki core, how would I do it? (Keep in mind
I'm not talking in the scope of my own installation; I mean adding a
dependency to core itself through Gerrit.)

Right now the only somewhat proper approach to adding new core dependencies
is using Git submodules, which is not the most favorable approach for
reasons I mentioned at the beginning of the thread.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-08 Thread Tyler Romeo
On Sun, Dec 8, 2013 at 6:06 PM, Jeroen De Dauw wrote:

> If the core community actually gets to a point where potential usage of
> third party libraries via Composer is actually taken seriously, this will
> indeed need to be tackled. I do not think we are quite there yet. For
> instance, if we go down this road, getting a clone of MW will no longer be
> sufficient to have your wiki run, as some libraries will first need to be
> obtained. This forces a change to a very basic workflow. People who dislike
> Composer will thus scream murder. Hence I tend to regard this as a moot
> point for now. Though let's pretend this has already been taken care of and
> look at solutions to the technical problem.
>

OK, in that case I will use git submodules for my upcoming third party
library patches.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-08 Thread Tyler Romeo
On Sun, Dec 8, 2013 at 10:43 PM, Nik Everett  wrote:

> Not to be a downer, but isn't this one of the things X Windows did that
> upset people? I'm not arguing that this approach is doomed, just that care
> must be taken.  Honestly I don't know the situation well enough to have a
> super strong opinion.


Agreed on being very hesitant with this. Not *all* of MediaWiki needs to be
a library. A library implies re-use. For example, nobody is going to reuse
the SpecialPage class and its children outside of MediaWiki, so it does not
make sense to turn it into a library.

The only things that could really be separated into libraries are the
DatabaseType hierarchy, the HTMLForm section, maybe the FileRepo / File /
FileBackend product, and maybe the JobQueue product.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Module storage is coming

2013-12-08 Thread Tyler Romeo
I'm sure this has already been taken into consideration, but keep in mind
that code that is executed using eval() in JavaScript is *not* optimized by
the V8 compiler the way normal script resources are.

Considering our scripts do not perform much intensive work AFAIK, it should
not be an issue, but any module that is cached in localStorage will no
longer benefit from V8's compiler optimizations. It may be useful to have
an option that disables the use of localStorage for specific modules.
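
For context, the caching pattern boils down to roughly this (a simplified
sketch, not ResourceLoader's actual code; the key name is made up):

    var key = 'module:mediawiki.util@deadbeef';
    var source = localStorage.getItem(key);
    if (source !== null) {
        $.globalEval(source); // executed via eval, so V8 skips its optimizations
    } else {
        // fetch from load.php as usual, execute it, and then
        // localStorage.setItem(key, source) for the next page view
    }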

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, Dec 4, 2013 at 1:02 AM, Roan Kattouw  wrote:

> On Tue, Dec 3, 2013 at 12:30 AM, Ori Livneh  wrote:
> > We'll gradually enable module storage on all Wikimedia wikis over the
> > course of the next week or two.
> Ori deployed this to the live site earlier today :) . For reference,
> the original post about module storage is archived at [1].
>
> I tweeted an impressive graph from Ganglia [2] that Ori shared on IRC
> a little while after the deployment, and consequently my phone is now
> blowing up as lots of people are retweeting it and replying to it.
> Turns out lots of people are interested in Wikipedia :)
>
> However, while the graph was impressive, there were some caveats:
> * It was of internal traffic into the Apache backends serving bits,
> not the Varnish caches
> * The Varnish traffic (that actually get all the traffic) dropped
> suddenly but not very sharply, and the drop was insignificant compared
> to time-of-day and day-of-week variance
> * The drop occurred because ResourceLoaderLanguageDataModule had a bug
> in its mtime computation, causing it to recache all the time; module
> storage greatly dampened the impact of that bug. The bug was
> identified later that day and fixed by Krinkle [3], in the commit with
> probably the highest commit-message-to-diff ratio of all time.
>
> Although it wasn't "really" responsible for the huge drop we saw in
> the graphs, make no mistake, this is awesome. Thanks Ori, for working
> on this and putting up with my code review slowness and nitpicking :)
>
> One interesting response I got on Twitter [4] said we should avoid
> localStorage in favor of "indexedDB and modern async APIs". I suppose
> we could look into that :)
>
> Roan
>
> [1]
> http://lists.wikimedia.org/pipermail/wikitech-l/2013-November/072839.html
> [2] https://twitter.com/catrope/status/408018210529615872
> [3] https://gerrit.wikimedia.org/r/#/c/99010/
> [4] https://twitter.com/andreasgal/status/408108587320623104
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Module storage is coming

2013-12-09 Thread Tyler Romeo
On Mon, Dec 9, 2013 at 2:15 PM, Gabriel Wicke  wrote:

> Are you sure this is still the case?
> https://code.google.com/p/v8/issues/detail?id=2315 seems to suggest that
> this was fixed in V8 last year.
>

Not sure if it's related, but looking at the bleeding edge compiler.cc code
directly, it explicitly turns off optimizations inside the eval compiler.
[0]

On Mon, Dec 9, 2013 at 5:08 PM, Daniel Friesen 
 wrote:

> Also, are we even using eval anyways?
>
> The module storage stores whole scripts; it should be able to function
> by inserting inline <script> elements containing the downloaded text.
>

It depends. Right now we use $.globalEval(), [1] which will make a global
eval() call, so the optimization caveat would still apply.

Re: [Wikitech-l] OAuth Devlopment Training

2013-12-11 Thread Tyler Romeo
I'll probably try and attend, although it's during the day so there's no
guarantee my boss won't randomly schedule a meeting or something.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Tue, Dec 10, 2013 at 11:43 PM, Aaron Halfaker wrote:

> I'm bummed that I won't be able to join in since this overlaps
> substantially with the Analytics Research & Data showcase that starts @
> 11:30 AM PST.  Would you be interested in recording the presentation for
> those of us who cannot attend?
>
> -Aaron
>
>
> On Tue, Dec 10, 2013 at 6:47 PM, Chris Steipp 
> wrote:
>
> > Hi all,
> >
> > For any developers who have been thinking about connecting their
> > application to MediaWiki, but haven't gotten around to diving in, I'm
> going
> > to have a short training/workshop session next week. I'll give a brief
> > intro to using the version of OAuth that we're running, and walk through
> > some quick demos in php and go. After that, I'm happy to walk any
> developer
> > through getting their app connected, if anyone is struggling with a
> > particular issue.
> >
> > It will be Wed, Dec 18th at 11am PST (1900 UTC). Please let me know if
> > you're interested. We'll probably use a hangout for the session, but if
> > that's not an option for anyone we can use a voice call and etherpad.
> > Either way I'll probably send out invites individually.
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Re-evaluating MediaWiki ResourceLoader's debug mode

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 11:33 AM, Max Semenik  wrote:

> If they look at the URL it will be pretty obvious because all of them
> have debug=false as first parameter.
>

As a case in point, this is how I found out about the debug parameter the
first time I tried doing JavaScript debugging in MediaWiki.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 2:04 PM, Jon Robson  wrote:

> Many a time when I've talked about this, I've hit the argument that Gerrit
> is confusing to some users and is a barrier for development, but this is a
> terrible, unacceptable attitude to have in my opinion. Our end users deserve
> a certain standard of code. I'm aware using a code review process can slow
> things down but I feel this is really essential. I for one greatly benefit
> from having every single piece of my code scrutinized and perfected before
> being consumed by a wider audience. If this is seen as a barrier, someone
> should investigate making it possible to send wiki edits to Gerrit to
> simplify that process.
>

I can definitely understand the reasoning behind this. Right now with both
Gadgets and common.js we are allowing non-reviewed code to be injected
directly into every page. While there is a bit of trust to be had
considering only administrators can edit those pages, it is still a
security risk, and an unnecessary one at that.

I like the idea of having gadgets (and any JS code for that matter) going
through Gerrit for code review. The one issue is the question of where
would Gadget code go? Would each gadget have its own code repository? Maybe
we'd have just one repository for all gadgets as well as common.js
(something like operations/common.js)? I don't think sending wiki edits to
Gerrit is too feasible a solution, so if this were implemented it'd have to
be entirely Gerrit-based.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 3:19 PM, Brian Wolff  wrote:

> One of the primary reasons gadgets/local-js exist is because local
> wiki-admins feel that the mediawiki code review process is unavailable
> to them. I would expect any sort of code review requirement for
> gadgets to meet strong resistance, especially on the smaller wikis.
>
> I also think it would be unenforcable unless one plans to ban personal
>

In this case we should promptly work to fix this issue. To be honest, the
only difficult part of our code review process is having to learn Git if
you do not already know how to use it. If there were a way to submit
patchsets without using Git somehow (maybe some kind of desktop
application), it may make things easier.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mailing list etiquette and trolling

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 5:11 PM, Jeroen De Dauw wrote:

> (I'm now half expecting someone to claim this mail is a troll. Perhaps we
> ought to make a contest out of making the accusation first, at least then
> it will have general amusement value :D)
>

This contest idea sounds exciting. ;)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikimedia Commons Video Uploads waiting (Bug #58155)

2013-12-11 Thread Tyler Romeo
:/ There are only ten videos. Is there some sort of special upload process
that needs to be followed here? Because uploading ten things to Commons
takes all of fifteen minutes.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, Dec 11, 2013 at 5:53 PM, Brian Wolff  wrote:

> On 12/11/13, Antoine Musso  wrote:
> > Le 11/12/13 19:15, Manuel Schneider a écrit :
> >> Hi,
> >>
> >> I have a bunch of videos from our last conference waiting for an upload
> >> to Commons. For this I have filed a bug several days ago:
> >> * https://bugzilla.wikimedia.org/show_bug.cgi?id=58155
> >>
> >> Can someone please take care of this in a timely manner? The conference
> >> is now three weeks ago and soon nobody will be interested in the videos
> >> anymore if we don't provide them near enough to the event.
> >
> > Hello,
> >
> > I have no idea which script should be used to upload those files. If
> > anyone has a link to the step-by-step guide to achieve it, I will be
> > more than happy to execute the commands and proceed with the uploads.
> >
> > cheers,
> >
> > --
> > Antoine "hashar" Musso
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> Normally importImages.php is used I believe. I think you create a
> directory, use curl/wget to get all the files, then run
> importImages.php, making sure to specify the --comment-ext and --user
> option.
>
> --bawolff
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 4:13 PM, Brian Wolff  wrote:

> I guess I should have said without banning [[MediaWiki:Common.js]]. I
> was kind of assuming this proposal meant banning all site wide js
> (Since otherwise what's the point of banning default on gadgets?
> Default on gadgets is just a way to separate common.js into modules
> for easier maintainability)
>

Yeah I think it's pretty much established that globally banning Gadgets is
simply not going to work unless you also ban common.js. If anything it
makes the problem worse, since at least Gadgets allow some organization for
the chaos (and users can turn off gadgets).

Submitting patches is not the problem. Getting them reviewed in a
> timely fashion is a problem. Having power taken out of local wikis
> hands is a problem.
>

I'll be frank. I don't really care. If power is really the issue, then let
people from the wikis have +2 on their wiki's Gerrit project. If the cost
of increasing site-wide security and alleviating developers' pain is a few
upset users who will get over it in a month or two, then so be it.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 7:15 PM, Chad  wrote:

> I'm going to say this one final time, since I'm feeling like a broken
> record today...
>
> We are not going to use Gerrit for gadgets and so forth. It is the
> *wrong* tool for the job. Full stop.
>

Gerrit is a code review tool. Gadgets are code. It is the absolute
*correct* tool for the job. At the very least it is a more proper tool than
wiki software. If Wikipedia wants any semblance of proper software
security, having gadgets stored on-wiki should disappear very quickly.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikimedia Commons Video Uploads waiting (Bug #58155)

2013-12-12 Thread Tyler Romeo
On Wed, Dec 11, 2013 at 7:26 PM, Brian Wolff  wrote:

> The videos are (mostly) over 1 gb, which is our upload limit. Hence
> maintenance script needed.
>

Ah. Thanks for the clarification.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. (upgrade, if you can)

2013-12-16 Thread Tyler Romeo
On Mon, Dec 16, 2013 at 7:08 AM, Arcane 21  wrote:

> I've tried PHP 5.5.0, had to go back to PHP 4.4.9 due to issues with
> deleting and moving pages in MediaWiki.


Is there a bug filed for this? If not, please do so. MediaWiki should be
compatible with newer PHP versions.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP new version(s) 5.5.7 and other versions. (upgrade, if you can)

2013-12-16 Thread Tyler Romeo
On Mon, Dec 16, 2013 at 6:54 PM, Arcane 21  wrote:

> More details can be found on the bug report I submitted on Bugzilla.


For those wondering, it is filed under bug 58532.

https://bugzilla.wikimedia.org/show_bug.cgi?id=58532

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC cluster summary: HTML templating

2013-12-27 Thread Tyler Romeo
It sounds like the two templating RFCs are asking for the exact same thing,
just with different templating engines as the solution. If we want a
comprehensive templating system in PHP core, Mustache is not going to be
enough (I've used it before, and dealing with the lack of control
structures is frustrating unless the templates are really simple). So I'd
support using Twig or something similar to it, like Jinja2. Either way, the
RFCs can definitely be combined.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Thu, Dec 26, 2013 at 9:09 PM, Jon Robson  wrote:

> +1 to creating a new RFC that satisfies the needs of all 3. Although not
> mentioned Kaldari's proposal highlights a big existing need in
> MobileFrontend. Here we use JavaScript templates extensively and are now
> finding situations where we have the same HTML being generated on the
> server and client (for example after an edit the entire page is re rendered
> on the client).
>
> I'm very excited to discuss this and push these RFCs along. Please feel
> free to bring me into any conversations.
> On 26 Dec 2013 18:04, "Rob Lanphier"  wrote:
>
> > Hi folks,
> >
> > This is an attempt to summarize the list of RFCs that are listed under
> > this cluster:
> >
> >
> https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_Clusters#HTML_templating
> >
> > ...and possibly get a conversation going about all of this in advance
> > of the Architecture Summit.
> >
> > The main focus of all of these RFCs is around HTML generation for user
> > interface elements.  This category is not about wikitext templates or
> > anything to do with how we translate wikitext markup
> >
> > "Template Engine" is Chris Steipp's submission outlining the use of
> > Twig.  From my conversations with Chris, it's not so much that he's
> > eager to adopt Twig specifically so much as standardize on
> > *something*, and make sure there's usage guidelines around it to avoid
> > common mistakes.  He's seen many attempts at per-extension template
> > libraries that bloat our code and often start off with big security
> > vulnerabilities.  There are many extensions that use Twig, and it
> > seems to be a popular choice for new work, so Chris is hoping to
> > standardize on it and put some usage standards around it.
> >
> > "HTML templating library" is Ryan Kaldari's submission, promoting the
> > use of Mustache or something like it.  His main motivation is to have
> > a Javascript template library for front-end work, but is hoping we
> > choose something that has both Javascript and PHP implementations so
> > that any PHP template system implementation is compatible with what we
> > use for Javascript templating.
> >
> > "MVC Framework" is Owen Davis's description of Wikia's Nirvana
> > framework, which has been central to all of the user-facing work
> > they've been doing for the past 2-3 years.  As Owen points out in the
> > talk page for this, it's really view-controller rather than full MVC.
> > A big plus of adopting this RFC is that it would make it much more
> > likely that Wikia-developed extensions (of which there are many) would
> > be of greater use to the larger MediaWiki community, and would
> > generally help facilitate greater collaboration between Wikia and the
> > rest of us.
> >
> > "OutputPage Refactor" is another submission from Owen Davis, which
> > isn't really about templating, so much as taking the HTML generation
> > code out of OutputPage.  Wikia has been maintaining a fork of
> > OutputPage for quite some time, so they already have quite a bit of
> > experience with the proposed changes.  This is clustered with the
> > templating proposals, since I imagine the work that gets factored out
> > of OutputPage would need to be factored into whatever templating
> > system we choose.
> >
> > The first three seem somewhat mutually exclusive, though it's clear
> > our task is likely to come up with a fourth proposal that incorporates
> > many requirements of those three.  The OutputPage Refactor proposal,
> > given some fleshing out, may not be controversial at all.
> >
> > Where should we go from here?  Can we make some substantial progress
> > on moving one or more of these RFCs over the next few weeks?
> >
> > Rob
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC cluster summary: HTML templating

2013-12-27 Thread Tyler Romeo
On Fri, Dec 27, 2013 at 11:58 AM, Ryan Kaldari wrote:

> What sort of control structures would you want supported? Keep in mind
> that ideally we want something that is simple enough that it could be
> loaded as a client-side library on mobile, and also be reasonably sure that
> it won't introduce any security issues.


As in any type of control structure. Mustache is explicitly a template
language without control structures. You can technically implement if/else
statements and for loops using Mustache's sections, but you can't have
expressions in your if statements. Template languages like Twig are more
like an actual programming language.

Mustache also does not have template inheritance, which would be really
useful in an extensible templating system.

On a side note, there are a lot of things that both Mustache and Twig do
that are a lot easier and more sensible in the latter. Filters, for example.
In Mustache, using a filter involves using a section and looks exactly the
same as an if statement. Using multiple filters at once involves multiple
sections. In Twig there's a specific syntax that makes filtering a lot
simpler. Also, Twig has PhpStorm support, which is a plus.
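
To make that concrete (sketches, with made-up variable names), an
expression plus a filter is one line in Twig:

    {% if users|length > 3 %}{{ title|upper }}{% endif %}

whereas Mustache can only test whether a section key is truthy, so both the
condition and the uppercasing have to be precomputed in the view data (or
supplied as lambdas):

    {{#showTitle}}{{titleUpper}}{{/showTitle}}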

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 5:03 PM, Thomas Gries  wrote:

> Can you explain this briefly, or send me a pointer ?
> This single info can be a help for him and others.
> (Honestly, I do not know, what a "trusted" account/user is.)
> I am on #mediawiki now
>

There is a special permission that allows specific accounts to not be
affected by IP blocks. It is granted by application on a case-by-case
basis. You can find more information here:
https://en.wikipedia.org/wiki/Wikipedia:IP_block_exemption. I am not sure
whether similar processes exist on other wikis.

As for the original topic, this has been thoroughly discussed before, and
every time I forget what the result of the discussion was. I know for sure
that since MediaWiki is fundamentally centered around knowing users' IP
addresses in order to stop sockpuppets, simply allowing Tor users to edit
will not happen. We need a solution that gives us the accountability of an
IP address without our actually knowing the Tor user's IP address. If that
sounds like a difficult problem, it's because it is. One suggestion was to
use a type of token authentication, where we use RSA blinding in order to
issue anonymous exemption tokens. Another suggestion was to simply abandon
IP blocks, since users can easily enough change their IP addresses anyway.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] $wgDBmysql5

2013-12-30 Thread Tyler Romeo
As the subject implies, there is a variable named $wgDBmysql5. I am
inclined to believe that the purpose of this variable is to use features
only available in MySQL 5.0 or later. The specific implementation affects
the encoding in the database.

https://www.mediawiki.org/wiki/Manual:$wgDBmysql5

Right now MediaWiki installation requirements say you need MySQL 5.0.2 or
later. So my question is: is this configuration variable still necessary,
or should it be deprecated?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 5:23 PM, Thomas Gries  wrote:

> This is, why we (or some core) MediaWiki developers should also attend
> such congresses like the C3 regularly.
>

At times like these, living in the USA is inconvenient.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 6:08 PM, Rjd0060  wrote:

> Shouldn't the discussion *not* be happening on Bugzilla, but somewhere
> where the wider community is actually present?  Perhaps Meta?
>

Well the issue is not whether we want Tor users editing or not. We do. The
issue is finding a software solution that makes it possible.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 6:49 PM, Risker  wrote:

> I disagree fundamentally with your position here. It's technically possible
> for Tor editors to edit; all we have to do is unblock Tor nodes (or for
> them to disable Tor), and they can edit. It is the social and policy-based
> processes that prevent Tor users from using Tor to edit.  I happen to agree
> with those processes (having cleaned up major messes from unblocked Tor
> nodes on enwiki), but it's not a technical problem, really.
>

I'm confused about what exactly you are disagreeing with. The current
consensus is that Tor users should not be able to edit directly. Thus the
issue at hand is that there is currently no technical solution for allowing
Tor users to edit while still being able to block them. If you want to
change the consensus and unblock Tor users from editing, then it is indeed
a social/policy issue.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 7:34 PM, Chris Steipp  wrote:

> I was talking with Tom Lowenthal, who is a tor developer. He was trying to
> convince Tilman and I that IP's were just a form of collateral that we
> implicitly hold for anonymous editors. If they edit badly, we take away the
> right of that IP to edit, so they have to expend some effort to get a new
> one. Tor makes that impossible for us, so one of his ideas is that we shift
> to some other form of collateral-- an email address, mobile phone number,
> etc. Tilman wasn't convinced, but I think I'm mostly there.
>

This is a viable idea. Email addresses are a viable option, considering
they take just as much (if not a little more) effort to change as IP
addresses. We can take it even a step further and only allow email
addresses from specific domains, i.e., we can exclude providers of
so-called "throwaway emails". It probably won't accomplish too much, but in
the end it's all just a means of making things more difficult for vandals.
It will never be impossible.


> We probably don't want to do that work in MediaWiki, but with OAuth, anyone
> can write an editing proxy that allows connections from Tor, ideally
> negotiates some kind of collateral (proof of work, bitcoin, whatever), and
> edits on behalf of the tor user. Individuals can still be held accountable
> (either blocked on wiki, or you can block them in your app), or if your app
> lets too many vandals in, we'll revoke your entire OAuth consumer key.
>

It is definitely outside of core scope, but is it within OAuth scope? If
anything I think it would be some sort of separate extension that relies on
OAuth, but is not actually part of OAuth itself.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2013-12-30 Thread Tyler Romeo
On Mon, Dec 30, 2013 at 9:48 PM, Gregory Maxwell  wrote:

> Digging up an old proposal of mine…


Relevant: https://bugzilla.wikimedia.org/show_bug.cgi?id=3729#c3

I've attempted implementing this proposal before (about a year ago). The
inherent issue, though, is that unless you make it so that each account can
get one and only one token, you're not actually solving the problem: making
sure Tor users have a one-to-one relation to a real IP address (or
collateral, as Chris calls it).

Also, I don't know the specifics, but the fundraising team has a whole
bunch of requirements they go through in order to be certified for security
and whatnot (somebody correct me if I'm wrong). They have a very strict set
of software policies they must follow. Implementing a system like this to
work with donations would be extraordinarily difficult, if it's even
possible.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Captcha filter list

2013-12-31 Thread Tyler Romeo
It's a CAPTCHA, not an article or piece of actual content. If people are
actually getting offended by randomly generated CAPTCHAs I think they need
to find something more worthwhile to complain about.

-- 
Tyler Romeo
On Jan 1, 2014 12:27 AM, "Benjamin Lees"  wrote:

> There's a blacklist that has been included with FancyCaptcha for a few
> months, although I don't know whether it's the same as the one the WMF
> uses.
>
> See https://bugzilla.wikimedia.org/show_bug.cgi?id=21025 and the
> associated
> patches.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $wgDBmysql5

2014-01-01 Thread Tyler Romeo
On Tue, Dec 31, 2013 at 7:56 PM, Platonides  wrote:

> I think it is, for schemas created without it.


Yeah. What I mean is: should it be deprecated for new installations?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $wgDBmysql5

2014-01-01 Thread Tyler Romeo
On Thu, Jan 2, 2014 at 1:20 AM, VP Singh  wrote:

> Donot send me spam


That email was not spam. If you don't want to receive Wikimedia development
emails, you can unsubscribe from the mailing list.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File sources and spam blacklist

2014-01-02 Thread Tyler Romeo
On Thu, Jan 2, 2014 at 10:36 AM, Tuszynski, Jaroslaw W. <
jaroslaw.w.tuszyn...@leidos.com> wrote:

> https://commons.wikimedia.org/wiki/File:Masopust_dr%C5%BE%C3%ADme_14.jpg



This link is now giving me an internal exception

MediaWiki internal error.
> Exception caught inside exception handler.
> Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to
> show detailed debugging information.



*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I'm back from Hacker School

2014-01-04 Thread Tyler Romeo
Welcome back!

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sat, Jan 4, 2014 at 9:55 AM, Aarti K. Dwivedi
wrote:

> Welcome back Sumana! :)
>
> Cheers,
> Aarti
>
>
> On Sat, Jan 4, 2014 at 8:19 PM, Nasir Khan  wrote:
>
> > Welcome back :)
> >
> >
> >
> > *-- **Nasir Khan Saikat* <https://google.com/+NasirKhan>
> > www.nasirkhn.com
> >
> >
> >
> > On Sat, Jan 4, 2014 at 5:38 PM, Peter Coombe  > >wrote:
> >
> > > Hi Sumana, nice to have you back! Hope the sabbatical was a great
> > > experience for you.
> > >
> > > Peter
> > >
> > >
> > > On 4 January 2014 03:48, Sumana Harihareswara 
> > > wrote:
> > >
> > > > Hi! As of yesterday, I'm back after my three-month sabbatical at
> Hacker
> > > > School. I'm in catchup mode so I haven't yet resubscribed to most
> > lists,
> > > > nor quite taken back over Engineering Community Team (Quim Gil is
> still
> > > in
> > > > charge until sometime next week when I feel back up to speed).
> > > >
> > > > Thank you to WMF for the sabbatical program, thanks to my boss Rob
> > > Lanphier
> > > > for his support, and thanks to my team. I was able to walk away
> > > worry-free
> > > > for three great months because I knew that Andre Klapper, Quim Gil,
> and
> > > > Guillaume Paumier had my back. :)
> > > >
> > > > Sumana Harihareswara
> > > > Engineering Community Manager
> > > > Wikimedia Foundation
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> Aarti K. Dwivedi
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Shahyar Ghobadpour joins Wikimedia Core features team as Software Engineer

2014-01-06 Thread Tyler Romeo
On Mon, Jan 6, 2014 at 2:17 PM, Terry Chay  wrote:

> Shahyar G(whatever)


XD

Welcome aboard!

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-13 Thread Tyler Romeo
How does this compare to the PECL runkit extension? Also have you
benchmarked it against Scribunto? Because Scribunto does kind of the same
exact thing except just with a different programming language (and
Scribunto uses a native interpreter rather than one written in PHP).

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Mon, Jan 13, 2014 at 3:58 AM, Pavel Astakhov  wrote:

> Hi! I would like to discuss an idea.
>
> In MediaWiki is not very convenient to docomputingusing the syntax of the
> wiki. We have to use several extensions like Variables, Arrays,
> ParserFunctions and others. If there are a lot of computing, such as data
> processing received from Semantic MediaWiki, the speed of page construction
> becomes unacceptable. To resolve this issue have to do another extension
> (eg Semantic Maps displays data from SMW on Maps). Becomes a lot of these
> extensions, they don't work well with each other and these time-consuming
> to maintain.
>
> I know about the existence of extension Scribunto, but I think that you
> can solve this problem by another, more natural way. I suggest using PHP
> code in wiki pages, in the same way as it is used for html files. In this
> case, extension can be unificated. For example, get the data from
> DynamicPageList, if necessary to process, and transmit it to display other
> extensions, such as Semantic Result Formats.This will give users more
> freedom for creativity.
>
> In order to execute PHP code safely I decided to try to make a controlled
> environment. I wrote it in pure PHP, it is lightweight and in future can be
> included in the core. It can be viewed as an extension Foxway. The first
> version in branch master. It gives an idea of what it is possible in
> principle to do and there's even something like a debugger. It does not
> work very quickly and I decided to try to fix it in a branch develop. There
> I created two classes, Compiler and Runtime.
>
> The first one processes PHP source code and converts it into a set of
> instructions that the class Runtime can execute very quickly. I took a part
> of the code from phpunit tests to check the performance. On my computer,
> pure PHP executes them on average in 0.0025 seconds, and the class Runtime
> in 0.05, it is 20 times slower, but also have the opportunity to get even
> better results. I do not take in the calculation time of class Compiler,
> because it needs to be used once when saving a wiki page. Data returned
> from this class is amenable to serialize and it can be stored in the
> database. Also, if all the dynamic data handle as PHP code, wiki markup can
> be converted into html when saving and stored in database. Thus, when
> requesting a wiki page from the server it will be not necessary to build it
> every time (I know about the cache). Take the already prepared data (for
> Runtime and html) and enjoy. Cache is certainly necessary, but only for
> pages with dynamic data, and the lifetime of the objects in it can be
> greatly reduced since performance will be higher.
>
> I also have other ideas associated with the use of features that provide
> this realization. I have already made some steps in this direction and I
> think that all of this is realistic and useful.
> I'm not saying that foxway ready for use. It shows that this idea can work
> and can work fast enough. It needs to be rewritten to make it easier to
> maintain, and I believe that it can work even faster.
>
> I did not invent anything new. We all use the html + php. Wiki markup
> replaces difficult html and provides security, but what can replace the
> scripting language?
>
> I would like to know your opinion: is it really useful or I am wasting my
> time?
>
> Best wishes. Pavel Astakhov (pastakhov).
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-13 Thread Tyler Romeo
On Mon, Jan 13, 2014 at 12:06 PM, Daniel Friesen  wrote:

> Actually, not to nitpick... ok, no, yeah I'm going to nitpick.
> Rewriting MediaWiki in any well accepted programming language besides
> PHP would have an extremely good chance of making it faster (well
> perhaps with the exception of our parser).
> I probably wouldn't pick Lua specifically as a target for rewriting, I'd
> probably Python or Node.js, bonus points if you use a flavor of Python
> that works async like gevent, Twisted, Tornado, etc...
>

This is not true. *Maybe* Python would be a good language to implement
MediaWiki in, but recommending something like node.js for a website that
needs to scale as big as Wikipedia will just not work (not to mention it's
a terrible decision to voluntarily program in Javascript).

PHP's stateless per-request design is done on purpose, and it is not worse
than a global state design; it's just different. PHP replaces global state
with things like job queues and caches, and it tends to work pretty well.
If anything it makes development easier because, not surprisingly, managing
global state is incredibly difficult.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Tyler Romeo
On Tue, Jan 14, 2014 at 11:07 AM, Chad  wrote:

> What version is available via PEAR? Installing via that is no more
> manual than apt.
>

Also don't forget composer as well.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-01-17 Thread Tyler Romeo
On Fri, Jan 17, 2014 at 1:21 PM, Erik Moeller  wrote:

> I tested the existing process by creating a new riseup.net email
> account via Tor, then requesting account creation and a global
> exemption via stewa...@wikimedia.org. My account creation request was
> granted, but for exemption purposes, I was requested to go through the
> process for any specific wiki I want to edit. In fact, the account was
> created on Meta, but not exempted there.
>

I feel like a much better experiment would be to:

1) Do what you just did
2) Request Tor access on a specific wiki
3) Edit for a while and become an established editor
4) Then ask for a global exemption

If anything it is good for stewards to not randomly grant global exemptions
to anybody who walks in off the street.

If anything I would try testing out the enwiki-specific exemption process
and see how that works out for you.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Review Milestone reached

2014-01-18 Thread Tyler Romeo
On Fri, Jan 17, 2014 at 10:03 PM, bawolff  wrote:

> Aaron Schulz has become the first person to have approved (+2'ed) >=
> 1000 patchsets to mediawiki core
>

Wow it takes real skill to out-review the l10-bot. I'm going to have to
step up my game. ;)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to retrieve the page, execute some time expensive operation and edit the page ONLY if it wasn't changed meanwhile

2014-01-31 Thread Tyler Romeo
On Fri, Jan 31, 2014 at 8:38 PM, Mark Holmquist wrote:

> http://restpatterns.org/HTTP_Headers/If-Unmodified-Since
>
> It should be passed a value that matches a Last-Modified value that we
> get from the previous "fetch page contents" API call.


Unfortunately, MediaWiki refuses to obey the HTTP spec. Hence the reason
why we don't have E-Tag support either.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)

2014-02-01 Thread Tyler Romeo
On Sat, Feb 1, 2014 at 12:12 PM, Jan Zerebecki
wrote:

> What would be involved for someone to misuse _without_ Tor?:
> 1) Find a free WiFi spot you haven't used before.
> 2) Create account (no need to enter any Email).
> 3) Abuse and repeat if you get banned.
> I tried 1 and 2, obviously not doing 3. It worked on my first try. I
> wouldn't have thought it would be that easy.
>

Your scenario is based on the premise that Wikipedia vandals care enough
about vandalizing Wikipedia that they would get in their car (assuming
they're old enough to have a license), drive to the nearest Starbucks,
vandalize Wikipedia, and then drive somewhere else when they are blocked.

Most vandals don't put that much effort into it. I would argue that abusing
free WiFi hotspots is actually harder than abusing Tor, because it involves
physically moving from location to location, as opposed to pressing a
button and resetting my identity. Keep in mind we are not trying to
permanently block vandals, because that's impossible. We're just trying to
make vandalism difficult enough so it is no longer worthwhile.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-04 Thread Tyler Romeo
On Wed, Feb 5, 2014 at 2:20 AM, MZMcBride  wrote:

> Ultimately, account security is a user's prerogative. [...] Banks and even
> e-mail
> providers have reason to implement stricter authentication requirements.
>

This is conflicting logic. If it is the user's job to enforce their own
account security, what reason would banks or email providers have to
require long passwords? If somebody guesses a user's password and empties
their bank account, the bank could care less, since it is the customer's
fault for not making sure their password is long enough.

Rather account security, and security in general, is a combination of both
administrative oversight and user awareness. It is the system
administrators' responsibility to try and make up for the fact that users
are not security experts, and thus cannot be expected to take every
possible measure to ensure the safety of their account. Accordingly it is
our responsibility to set a password policy that ensures that users will
not do something stupid, as all users are inclined to do.

Of course, it is still valid that a Wikimedia wiki account is "nearly
valueless". However, that is probably more of a personal opinion than it is
a fact. I'm sure a very heavy Wikipedia editor, who uses his/her account to
make hundreds of edits a month but isn't necessarily an administrator or
other higher-level user, sees their account as something more than a
throwaway that can be replaced in an instant. Sure there is nothing of
monetary value in the account, and no confidential information would be
leaked should the account become compromised, but at the same time it has a
personal value.

For example, MZMcBride, what if your password is "wiki", and somebody
compromises your account, and changes your password and email. You don't
have a committed identity, so your account is now unrecoverable. You now
have to sign up for Wikipedia again, using the username "MZMcBride2". Of
course, all your previous edits are still accredited to your previous
account, and there's no way we can confirm you are the real MZMcBride, but
at least you can continue to edit Wikipedia... Obviously you are not the
best example, since I'm sure you have ways of confirming your identity to
the Wikimedia Foundation, but not everybody is like that. You could argue
that if you consider your Wikipedia account to have that much value, you'd
put in the effort to make sure it is secure. To that I say see the above
paragraph.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-05 Thread Tyler Romeo
On Wed, Feb 5, 2014 at 4:12 AM, Nathan Larson wrote:

> What if all of the email addresses that a user has ever used were to be
> stored permanently? Then in the event of an account hijacking, he could say
> to WMF, "As your data will confirm, the original email address for user Foo
> was f...@example.com, and I am emailing you from that account, so either my
> email account got compromised, or I am the person who first set an email
> address for user Foo." The email services have their own procedures for
> sorting out situations in which people claim their email accounts were
> hijacked.
>

This is definitely something to consider, but I feel like it would involve
changing our privacy policy. Or at the very least it would cause some
controversy related to that.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-06 Thread Tyler Romeo
On Thu, Feb 6, 2014 at 3:26 PM, Brian Wolff  wrote:

> Well if we are going to go down that road, requring public/private key
> pairs would also be more secure. However i doubt either would be acceptable
> to users.
>

Actually, I think it might be better if we just have people come on down to
the San Francisco office and show their government ID. Then we have Tim or
Brion log them in personally. ;)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-06 Thread Tyler Romeo
On Thu, Feb 6, 2014 at 4:54 PM, Derric Atzrott  wrote:

> Actually to be honest, if I could login to Mediawiki with a public/private
> keypair I would actually really enjoy that.  Certainly it shouldn't be the
> default, but in a very non-joking way, I would support an initiative to add
> that as an option.


You mean kind of like this?
https://www.mediawiki.org/wiki/Extension:SSLClientAuthentication

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Core unit tests passing under HHVM 2.4

2014-02-07 Thread Tyler Romeo
Can we maybe add this into Jenkins somehow? It'd be kind of nice if we
could make sure from now on that no patches break unit tests in HHVM.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Fri, Feb 7, 2014 at 3:07 PM, Ori Livneh  wrote:

> Core's unit tests are passing integration tests on Travis CI under the
> latest release of HHVM:
> https://travis-ci.org/wikimedia/mediawiki-core/builds/18445085
>
> ---
> Ori Livneh
> o...@wikimedia.org
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Communication of decisions (was Re: Should MediaWiki CSS prefer non-free fonts?)

2014-02-16 Thread Tyler Romeo
On Sun, Feb 16, 2014 at 6:36 PM, Greg Grossmeier  wrote:

> See also: The general rule among many engineering departments at WMF is
> "If it didn't happen on the list (or somewhere similarly public and
> indexable) it didn't happen."
>
> The team I most recently heard champion that rule was the Mobile Team.
>

Agreed on this. Even on Gerrit, it is hard to keep track of possible
changes and decisions being made. The mailing list is an important medium
for any significant discussion and announcements concerning MediaWiki.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC on PHP profiling

2014-02-18 Thread Tyler Romeo
On Tue, Feb 18, 2014 at 7:02 AM, Ori Livneh  wrote:

> "A tracing infrastructure that relies on active collaboration from
> application-level developers in order to function becomes extremely
> fragile, and is often broken due to instrumentation bugs or omissions,
> therefore violating the ubiquity requirement. This is especially important
> in a fast-paced development environment such as ours."
>

I tend to agree. I'm really not a big fan of wfProfileIn/wfProfileOut.
Among it's many issues:

   - People often forget to call wfProfileOut (although this can be fixed
   by using ProfileSection)
   - It hurts readability (also can be fixed by ProfileSection, although
   only in cases where the entire function is being profiled)
   - It makes code completely dependent on MediaWiki, thus eliminating the
   possibility of separating code out into separate modules
   - It provides no more information than xhprof would (and yes, xhprof is
   meant for production use.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-18 Thread Tyler Romeo
On Tue, Feb 18, 2014 at 12:48 PM, Trevor Parscal wrote:

> I propose we drop support for PHP 5.3 soon, if possible.


Agreed, but right now both Debian oldstable and Ubuntu LTS are running on
PHP 5.3. I'm pretty sure (last time I checked) that both reach their EOL
sometime this summer, like in July or something. Once that happens we can
safely stop supporting 5.3 with the next MediaWiki release.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-18 Thread Tyler Romeo
On Tue, Feb 18, 2014 at 1:14 PM, Bryan Davis  wrote:

> Ubuntu Server LTS versions have 5 years of support, so 12.04 will not
> be EOL until April of 2017. PHP 5.3 will be EOL in July of 2014. I'm
> sure that 3 year difference will be a major pain point for the Ubuntu
> security team.
>

OK, so Ubuntu Server LTS will EOL in April 2017. Additionally, MediaWiki
1.23 LTS (our next release) is planned to EOL in May 2017. With that in
mind, I think it's fair to say that once 1.23 is released we will have the
opportunity to increase our PHP requirement.

I strongly recommend we do so. A list of nice things about 5.4 that we'd
definitely use:

   - Array literals
   - $this support in closures
   - Class member access based on expression
   - Class member access after instantiation
   - Function return value array dereferencing
   - The JsonSerializable interface
   - Improved parse_url() behavior

Of course there is traits as well, but that's more of an actual new
feature, and it will be a while before MediaWiki starts using traits
everywhere.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Special page transclusion as a rudimentary API in Scribunto/Lua modules

2014-02-18 Thread Tyler Romeo
On Wed, Feb 19, 2014 at 1:26 AM, MZMcBride  wrote:

> There are likely others. What can be done to address this issue?


Only way I can think of is to improve the Lua <-> PHP API so that users can
make the queries directly.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] We're in: GSoC 2014 and FOSS OPW round 8

2014-02-25 Thread Tyler Romeo
Well I finally got around to actually listing my new possible mentor
project:

https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Removing_inline_CSS.2FJS_from_MediaWiki

Quim, if you could maybe give it a look and see if it is doable for
students, then we can list it for GSoC.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Tue, Feb 25, 2014 at 12:16 AM, Rahul Maliakkal wrote:

> Great work Quim again!!!
>
> Regards,
> Rahul
>
>
> On Tue, Feb 25, 2014 at 12:59 AM, Harsh Kothari
> wrote:
>
> > Hi Quim
> >
> > Wow its awesome news.
> >
> > Kudos to you Quim :)
> >
> > Cheers
> > Harsh
> >
> >
> >
> >
> > On Tue, Feb 25, 2014 at 12:31 AM, Quim Gil  wrote:
> >
> > > Google has just announced the organizations accepted in Google Summer
> of
> > > Code 2014. Wikimedia is one of them. Please fasten your belts.
> > >
> > > http://www.google-melange.com/gsoc/org/list/public/google/gsoc2014
> > >
> > > https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014
> > >
> > > We will organize in parallel our participation in FOSS Outreach Program
> > > for Women round 8, just like we did last year.
> > >
> > > https://www.mediawiki.org/wiki/FOSS_Outreach_Program_for_Women/Round_8
> > >
> > > Please forward the news to potential candidates in your circles.
> > >
> > > I don't think we need to beat the numbers of last year (21 participants
> > > in total) but we can do better at completing projects merged in our
> > > repositories, and we should keep the trend of increasing diversity in
> > > projects, gender and geography.
> > >
> > > --
> > > Quim Gil
> > > Technical Contributor Coordinator @ Wikimedia Foundation
> > > http://www.mediawiki.org/wiki/User:Qgil
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> >
> >
> >
> > --
> > Harsh Kothari
> > Intern at Google Summer of Code,
> > Wikimedia Foundation
> > Follow Me : harshkothari410 <https://twitter.com/harshkothari410/>
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-03-01 Thread Tyler Romeo
On Sat, Mar 1, 2014 at 4:35 PM, Jeroen De Dauw  wrote:
> And for what?
> The old method is just forwarding to the new one. All this hassle for a
> rename?

If you read the commit, the purpose is that this function is planned
to be removed entirely. The end goal is to get SpecialPage to inherit
ContextSource (or hopefully have it as a trait once we move to 5.4),
but it was impossible to do that since SpecialPage decided to
implement getTitle() differently.

-- 
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Tyler Romeo
Hi everybody,

I cannot believe I have to say something about this, but I guess it's no
surprise.

Wikipedia has a notorious policy against edit warring, where users are
encouraged to discuss changes and achieve consensus before blindly
reverting. This applies even more so to Gerrit, since changes to software
have a lot bigger effect.

Here's a nice example:
https://gerrit.wikimedia.org/r/114400
https://gerrit.wikimedia.org/r/117234
https://gerrit.wikimedia.org/r/117247

Some key points to note here:
* The revert commit was not linked to on the original commit
* The time between the revert patch being uploaded and +2ed was a mere two
minutes
* All the reviewers on the revert patch were also reviewers on the original
patch

This is unacceptable behavior, and is extremely disrespectful to the
developers here. If you are going to revert a patch for reasons other than
a blatant code review issue (such as a fatal error or the likes), you
should *at the very least* give the original patch reviewers time to
understand why the patch is being reverted and give their input on the
matter. Otherwise it defeats the entire point of the code review process
and Gerrit in the first place.

The argument being made in this specific case is that the change broke the
workflow of mobile, and that the revert was announced on mobile-l. This is
not sufficient for a number of reasons:

1) not everybody is subscribed to mobile-l, so you cannot expect the
original reviewers to see or know about it
2) this is an issue with MobileFrontend, not MediaWiki core
3) code being merged does not automatically cause a deployment, and if code
being deployed breaks something in production, it is the operations team's
job to undeploy that change

Overall, the lesson to take away here is to be more communicative with
other developers, especially when you are negating their changes or
decisions.

Thanks in advance,
*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Tyler Romeo
On Thu, Mar 6, 2014 at 4:15 PM, Steven Walling wrote:

> If your patch causes a serious UX regression like this, it's going to get
> reverted. The core patch involved was being deployed to Wikimedia sites /
> impacting MobileFrontEnd users today. If we had more time in the deployment
> cycle to wait and the revert was a simple disagreement, then waiting would
> be appropriate. It is obvious in this case no one tested the core change on
> mobile. That's unacceptable.
>

You quoted my email, but didn't seem to read it. Changes to MediaWiki core
should not have to take into account extensions that incorrectly rely on
its interface, and a breakage in a deployed extension should result in an
undeployment and a fix to that extension, not a revert of the core patch.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Tyler Romeo
On Thu, Mar 6, 2014 at 6:34 PM, Brion Vibber  wrote:

> Is there anything specific in the communications involved that you found
> was problematic, other than a failure to include a backlink in the initial
> revert?
>

I think this entire thing was a big failure in basic software development
and systems administration. If MobileFrontend is so tightly coupled with
the desktop login form, that is a problem with MobileFrontend. In addition,
the fact that a practically random code change was launched into production
an hour later without so much as a test... That's the kind of thing that
gets people fired at other companies.

But apparently I'm the only person that thinks this, so the WMF can feel
free to do what it wants.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Tyler Romeo
On Thu, Mar 6, 2014 at 9:21 PM, Jon Robson  wrote:

> I wonder in future if it might be practical useful for test failures like
> this to automatically revert changes that made them or at least submit
> patches to revert them that way it's clear how and when things should
> be reverted.
>

Or, at the very least, changes are not deployed until these tests pass. I
agree that running expensive browser tests on every Gerrit change is
unnecessary and performance-intensive, but we can expect it to be OK to run
it before new deploys.

Rob said:

> I wholeheartedly disagree with this.  Changes to core should definitely
> take into account uses by widely-deployed extensions (where
> "widely-deployed" can either mean by installation count or by end-user
> count), even if the usage is "incorrect".  We need to handle these things
> on a case by case basis, but in general, *all* of the following are options
> when a core change introduces an unintentional extension incompatibility:
> 1.  Fix the extension quickly
> 2.  Revert the change
> 3.  Undeploy the extension until its fixed to be compatible with core


I don't think you see the problem here. Consider this case as an example (I
agree that this is case-by-case, so let's limit the scope to this one).
You're forgetting that the original patch fixes a bug. In fact, it fixes a
pretty serious UX bug in my opinion (and many others who supported merging
this patch).

So to summarize, #3 is obviously not an option. For #2, are we supposed to
block core development, and let this bug persist indefinitely, because of a
much less serious bug in an extension? That really only leaves #1, but
apparently the vast minority of opponents of the original patch decided it
was a good idea to jump right over and skip to #2.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Tyler Romeo
On Thu, Mar 6, 2014 at 10:13 PM, George Herbert wrote:

> Based on the timing description here, it seems more like "Either rush 1 or
> rush 2".
>

This is also not true. Something does not have to be reverted in Gerrit in
order for it to be undeployed from production. If there was any timing
issue to consider here, I would say after a few days we'd have to reach a
solution.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-07 Thread Tyler Romeo
On Fri, Mar 7, 2014 at 4:49 PM, Matthew Flaschen wrote:

> Yes, it does.  Unless the entire branch has a serious problem (500s or
> major caching problems, etc.), we don't generally switch the entire branch
> back.
>
> That means the only option is fix or revert a commit.  The general rule is
> to do changes in master before cherry-picking to the branch.
>

What you're saying is that the software development process for MediaWiki
is so tightly coupled with the operations deployment process, that
development has to be held up because of problems in operations. That's a
problem.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-07 Thread Tyler Romeo
On Fri, Mar 7, 2014 at 5:29 PM, Greg Grossmeier  wrote:

> Or a benefit, giving third-party users confidence that the core they use
> has a quick feedback loop with real users and is thoroughly tested.
>
> It's all about perspective.
>
> From these conversations, your perspective seems to be (and please
> correct me if I'm wrong) that what WMF does with deployed code should
> have no bearing on MediaWiki core/master. And that all of our massive
> testing infrastructure equally shouldn't touch core/master.
>

There is a difference between deploying code on a test cluster and
deploying it to all production servers.


> Our "massive testing infrastructure" includes the Beta Cluster, Jenkins,
> SauceLabs, and the group0 wikis (test.wikipedia, mw.org etc). And, let's
> be realistic, the Wikimedia community as a whole. Code is never really
> tested until real users interact with it.
>

Sure, but immediately deploying untested changes to all users is a reckless
method of having real users test something.


> I probably mischaracterized your perspective, but I kinda wanted to make
> a point that this is all done with quality in mind, not the other way
> around.
>
> Do you have a recommendation on how we would 'decouple' this while also
> keeping the same short feedback loop and testing rigor that we do have
> (and intend to increase, both in rigor and in speed of feedback)?
>

I think some of the things mentioned here are good solution. The biggest
problem here is that this patch was launched almost completely untested. It
should have been caught long before it was put into production.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-07 Thread Tyler Romeo
On Fri, Mar 7, 2014 at 5:39 PM, George Herbert wrote:

> With all due respect; hell, yes, development comes in second to operational
> stability.
>
> This is not disrespecting development, which is extremely important by any
> measure.  But we're running a top-10 worldwide website, a key worldwide
> information resource for humanity as a whole.  We cannot cripple
> development to try and maximize stability, but stability has to be priority
> 1.  Any large website's teams will have the same attitude.
>
> I've had operational outages reach the top of everyone's news
> source/feed/newspaper/broadcast.  This is an exceptionally unpleasant
> experience.
>

If you really think stability is top priority, then you cannot possibly
think that the current deployment process is sane.

Right now you are placing the responsibility on the developers to make sure
the site is stable, because any change they merge might break production
since it is automatically sent out. If anything that gives the appearance
that the operations team doesn't care about stability, and would rather
wait until things break and revert them.

It is the responsibility of the operations team to ensure stability. Having
to revert something because that's the only way production will be stable
is not a proper workflow.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Tyler Romeo
On Fri, Mar 7, 2014 at 7:04 PM, Ryan Lane  wrote:

> Yes! This is a _good_ thing. Developers should feel responsible for what
> they build. It's shouldn't be operation's job to make sure the site is
> stable for code changes. Things should go more in this direction, in fact.
>

If you want to give me root access to the MediaWiki production cluster,
then I'll start being responsible for the stability of the site.

Tell me something, what about the developers of MariaDB? Should they be
responsible for WMF's stability? If they accidentally release a buggy
version, are they expected to revert it within hours so that the WMF
operations team can redeploy? Or will the operations team actually test new
releases first, and refuse to update until things start working again?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Tyler Romeo
On Sat, Mar 8, 2014 at 8:38 PM, Marc A. Pelletier  wrote:

> The answer is: no, obviously not.  And for that reason the MariaDB
> developers are not allowed to simply push their latest code on our
> infrastructure with a simple +2 to code review.
>

Yes, and my point is that MediaWiki developers shouldn't be able to do that
either! Receiving a +2 should not be the only line between a patch and
deployment. Changes should be tested *before* deployment. Nobody is saying
that developers are not responsible for writing good and working code, but
there needs to be guards against things like this happening.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Tyler Romeo
On Sat, Mar 8, 2014 at 9:48 PM, Ryan Lane  wrote:

> Wikimedia uses deployment branches. Just because someone +2/merges into
> master doesn't mean it immediately shows up on Wikimedia servers. It needs
> to go into a deployment branch, then it needs to get deployed by a person.
> Also, we use a gating model, so tests are required to pass before something
> is merged. I believe there are some tests that are essential, but take too
> long to run, so they aren't gating, but the situation isn't as dire as you
> claim.
>

OK, then how did this change get deployed if it "broke" tests?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Tyler Romeo
On Sat, Mar 8, 2014 at 10:15 PM, Ryan Lane  wrote:

> The jenkins report says it passed tests, hence why it was deployed. If
> there's other tests that aren't reporting to gerrit or if there's a test
> that needs to be added, maybe that's a post-mortem action to track?
>

Yep. Hence the reason I think maybe we should work on something like the
stuff mentioned in this thread. Maybe we should be running the mobile tests
before deployment or something. I'm not sure what the exact solution is,
but I think that would be a step in the right direction.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-10 Thread Tyler Romeo
On Mon, Mar 10, 2014 at 11:50 AM, Chris McMahon wrote:

> The problem is not that the change broke tests.  The problem is that the
> change broke account creation for Mobile view on a production wiki.
>
> Let me repeat: the change broke account creation on a production wiki.
>

It's been repeated multiple times, but I'll say it again: it is disputed as
to whether account creation was "broken". It is just a question of design
and user experience. No functionality was actually broken.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-10 Thread Tyler Romeo
On Mon, Mar 10, 2014 at 2:01 PM, Brandon Harris wrote:

> This is a fairly limited view.  The functionality was *broken*.  It failed
> to work in the way it was expected to work.  That’s what “broken” means.


I'm not going to bother repeating myself. I recommend re-reading this
thread for an explanation of how it is disputed as to whether this patch
broke anything.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Vagrant Cloud

2014-03-14 Thread Tyler Romeo
Or, you can use the actually libre as in free speech solution: use
Vagrant's docker.io provisioner [0] and integrate it with OpenStack [1] so
that you can basically deploy your vagrant instances into Labs.

[0] http://docs.vagrantup.com/v2/provisioning/docker.html
[1]
http://blog.docker.io/2013/06/openstack-docker-manage-linux-containers-with-nova/

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Fri, Mar 14, 2014 at 10:43 PM, Greg Grossmeier wrote:

> But, obviously not as well integrated.
>
> --
> Sent from my phone, please excuse brevity.
> On Mar 14, 2014 7:38 PM, "Greg Grossmeier"  wrote:
>
> > 
> > > Vagrant Cloud is a new service from Hashicorp that strives to make it
> > easy
> > > to share Vagrant boxes and to collaborate on provisioned instances
> > > together. It's in very early beta, but I have started poking around a
> > > little, and it looks interesting. I don't quite know yet how it will
> work
> > > with our extensive custom plugin architecture, but I suspect that this
> is
> > > not an insurmountable problem. Hashicorp is also committed to a
> freemium
> > > model that makes the software stack free (as in speech) and the basic
> > tier
> > > of cloud services free (as in beer), so it may be possible for us to
> have
> > > tighter integration with their service without compromising our values.
> >
> > There's also a totally libre solution called pagekite
> > https://pagekite.net/
> >
> > I've used it with vagrant and it Just Works(TM).
> >
> > They have a Free for FOSS tier:
> > https://pagekite.net/signup/?more=bw#fff
> >
> >
> > Just for completeness's sake :)
> >
> > --
> > | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
> > | identi.ca: @gregA18D 1138 8E47 FAC8 1C7D |
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-30 Thread Tyler Romeo
Allow me to just put this out there: using handlerbars or mustache or
anything similar to it is a *terrible* idea for MediaWiki. {{this is
escaped}} and {{{this is not escaped}}}. The only difference is an extra
brace on each side, and considering how many developers here are also
familiar with writing wikitext, the probability of an accidental security
vulnerability would increase significantly.

If we were to use a string-based templating engine (and looking at the
progress of gwicke's work, it's more likely we'll go DOM-based), we'd want
something that, at the very least, does not give the opportunity for a
screwup like this.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sun, Mar 30, 2014 at 5:23 AM, Nuria Ruiz  wrote:

> >>  is
> >>vulnerable, if something is set to "1234 onClick=doSomething()"
>
> > >$html = Html::element( 'div', array( 'class' => $anything ),
> $anythingElse
> >> I see. Sorry but where I disagree is that the "quote me this
> replacement"
> >> is a lawful case for the template engine.
>
> >I'm not sure I understand what you're saying here. Do you mean
> >makesafeString in your example shouldn't quote the text, but should
> instead
> >remove space characters?=
>
> What I am saying is that the parsing and escaping scheme we need is much
> simpler if you disallow the use case of passing the template engine
> something that is not data.
>
> Let me explain as this as it has to do more with correctness that with
> security per se:
> A template engine objective is to separate data from markup. In your
> example you are passing the template 'class="anything"' or
> 'onclick="something"' neither "class" nor "onclick" are data. Thus what I
> am arguing is that the template engine should not support the use case of
> "add any random attribute or javascript to my html element with the right
> set of quotes" as a "lawful" use case. The template engine should not be
> expected to parse and insert code and "onclick" is code.
>
> With a new template engine our main objective should be to separate data
> from markup, not supporting an style of coding that includes "onClick"
> attributes mixed with HTML which was prevalent years ago or css classes
> mixed with controller code.
>
> On my experience reducing use cases for template engine to just data
> handling while having specific functions that deal with links and
> translations simplifies the escaping problem greatly as you do not need
> context aware escaping. You can "js-escape" any piece of data sent to the
> engine cause you know you do not support the use case of sending javascript
> to be inserted.
>
>
> On Wed, Mar 26, 2014 at 6:41 PM, Chris Steipp 
> wrote:
>
> > On Wed, Mar 26, 2014 at 10:30 AM, Nuria Ruiz 
> wrote:
> >
> > > >Additionally, how you escape a plain parameter like class vs. an
> > > >href vs. a parameter that is inserted into a url vs. an id attribute
> are
> > > >all different escaping strategies.
> > > Urls in the template engine need to be handled on their own, sure. But
> > what
> > > template engine does not work in this fashion? There are three separate
> > > "entities" you normally deal with when doing replacement: translations,
> > > urls and plain attributes.
> > >
> > >
> > > >$html = Html::element( 'div', array( 'class' => $anything ),
> > $anythingElse
> > > I see. Sorry but where I disagree is that the "quote me this
> replacement"
> > > is a lawful case for the template engine.
> >
> >
> > I'm not sure I understand what you're saying here. Do you mean
> > makesafeString in your example shouldn't quote the text, but should
> instead
> > remove space characters?
> >
> >
> > > The line above is doing a lot
> > > more than purely templating and on my opinion it does little to
> separate
> > > data and markup. Which is the very point of having a template engine.
> > >
> > > But if you consider that one a lawful use case, you are right. The
> > example
> > > I provided does not help you.
> > >
> > >
> > > On Wed, Mar 26, 2014 at 6:15 PM, Chris Steipp 
> > > wrote:
> > >
> > > > On Wed, Mar 26, 2014 at 9:44 AM, Daniel Friesen
> > > > wrote:
> > > >
> > > > > On 2014-03-26, 9:32 AM, Nuria Ruiz wrote:
> >

Re: [Wikitech-l] MediaWiki on Google App Engine

2014-04-05 Thread Tyler Romeo
On Sat, Apr 5, 2014 at 11:47 PM, Rusty Burchfield
wrote:

> What are your goals for running on App Engine rather than a service
> like Compute Engine or EC2?
>

App Engine is a different type of service than EC2. With App Engine you
basically don't have to handle your own system administration.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SSL Cert Change?

2014-04-10 Thread Tyler Romeo
On Thu, Apr 10, 2014 at 3:25 PM, Derric Atzrott <
datzr...@alizeepathology.com> wrote:

> I just had Certificate Patrol in Firefox let me know that the SSL cert for
> Wikimedia.org was changed?  Does anyone know anything about that?  Are
> multiple
> certificates in use?
>

Probably due to the Heartbleed issue. There's another thread on this
mailing list explaining that WMF has reset all user tokens and is reissuing
SSL certificates.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Config class and 1.23

2014-04-18 Thread Tyler Romeo
I agree. I was going to attempt to fix the newest patch, but until the
semester ends I won't have a lot of time (and it seems neither does the
current patch owner).

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Fri, Apr 18, 2014 at 12:16 PM, Aaron Schulz wrote:

> I'd suggest a revert from the branch, yes.
>
>
>
> --
> View this message in context:
> http://wikimedia.7.x6.nabble.com/Config-class-and-1-23-tp5026223p5026236.html
> Sent from the Wikipedia Developers mailing list archive at Nabble.com.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Config class and 1.23

2014-04-18 Thread Tyler Romeo
On Fri, Apr 18, 2014 at 7:17 PM, Legoktm wrote:

> I would like to have Config make it into 1.23, mainly since it's an LTS,
> which would allow more extensions to take advantage of it without breaking
> backwards-compatability.


I don't mind getting the Config class into 1.23. However, at this moment,
Legoktm's patch is not merged, and while it'd be nice to get it merged
before the release, as we all know the MediaWiki review process is not
always as fast as we'd like it to be.

I'd recommend reverting the merged Config patch, and then backporting
Legoktm's patch when it's finished and merged.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] dnschain

2014-04-29 Thread Tyler Romeo
This isn't really relevant to MediaWiki, and the proposal is so ridiculous
I can only assume it is some sort of joke project.

For others seeing this thread, I found all the good quotes for you:

> DNSChain "stops the NSA"

> .dns is a meta-TLD because unlike traditional TLDs, it is not meant to
globally resolve to a specific IP [...] you cannot register a meta-TLD
because you already own them!

I think ICANN might take issue with that. (Also, a good read of RFC 3686 is
necessary here.)

> // hijack and record all HTTPS communications to this site
> function do_TLS_MITM(connection) {
> if (
> // let's not get caught by "pinning", shall we?
> isPinnedSite(connection.website, connection.userAgent)
> // never hijack those EFF nuisances, they're annoying
> || isOnBlacklist(connection.ip)
> // hijack only 5% of connections to avoid detection
> || randomIntBetween(1, 100) > 5
> )
> {
> return false;
> }
> return mitm_and_store_in_database(connection);
> }

I'd *love* to see the implementation of "mitm_and_store_in_database".

Also, fun to note that the entire application is written in CoffeeScript.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, Apr 30, 2014 at 1:41 AM, James Salsman  wrote:

> Would someone please review this DNS proposal for secure HTTPS?
>
> https://github.com/okTurtles/dnschain
> http://okturtles.com/other/dnschain_okturtles_overview.pdf
> http://okturtles.com/
>
> It is new but it appears to be the most correct secure DNS solution for
> HTTPS security at present. Thank you.
>
> Best regards,
> James Salsman
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] recent changes stream

2014-05-04 Thread Tyler Romeo
Just wondering, but has any performance testing been done on different
socket.io implementations? IIRC, Python is pretty good, so I definitely
approve, but I'm wondering if there are other implementations are are more
performant (specifically, servers that have better parallelism and no GIL).

For example, Erlang with Cowboy is supposed to be a good
socket.ioimplementation, and it is truly parallel, but I've never
worked with it so
I cannot say for sure.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sun, May 4, 2014 at 10:23 PM, Ori Livneh  wrote:

> Hi,
>
> Gerrit change Id819246a9 proposes an implementation for a recent changes
> stream broadcast via socket.io, an abstraction layer over WebSockets that
> also provides long polling as a fallback for older browsers. Comment on <
> https://gerrit.wikimedia.org/r/#/c/131040/> or the mailing list.
>
> Thanks,
> Ori
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help! Phabricator and our code review process

2014-05-05 Thread Tyler Romeo
OK, so I'm sorry if this information is duplicated anywhere, but between
the Project Management Tools review page, the Phabricator RFC, the various
sub-pages of the RFC, and the content on the Phabricator instance itself,
it would take me at least a couple of hours to organize my thoughts. So
I'll just ask directly:

Phabricator still does not work directly with Git, right? Or has that been
implemented since I last checked? If not, what is the planned workaround
for Phabricator? The default workflow is to use arcanist to merge the code
into Git directly. Does that handle merge conflicts? What is the rebase
process?

It's not that I'm opposed to the new system. I'm just confused as to what
the new workflow would actually be.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Mon, May 5, 2014 at 12:02 PM, Quim Gil  wrote:

> Hi, please check this draft plan for the next steps in the Phabricator RfC
> at
>
> https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator/Plan
>
> This aims to be a starting point for the next round of discussion to be
> held online and at the Wikimedia hackathon in Zürich this weekend. Edits,
> questions, and feedback welcome.
>
>
> On Friday, May 2, 2014, C. Scott Ananian  wrote:
>
> >
> > [cscott] James_F: I'm arguing for a middle path. devote *some*
> > resources, implement *some* interoperability, decide at *some later*
> > point when we have a more functional instance.
> >
>
> This is basically the same as "Decide now on a plan identifying the the
> blockers, commit resources to fix them, proceed with the plan unless we get
> stuck with a blocker." We have identified blockers, but we are not seeing
> any that could not be solved with some work (from the very active upstream
> and/or ourselves).
>
> We need a RfC approval to go confidently from http://fab.wmflabs.org to a
> production-like Wikimedia Phabricator. If that happens, the Platform
> Engineering team will commit resources to plan, migrate, and maintain the
> Phabricator instance that will deprecate five or more tools.
>
> The Labs instance has been set up and is being fine-tuned basically on a
> volunteer basis, which says a lot about Phabricator's simplicity of
> administration and maintenance. As it is now, it is good enough to run
> simple projects with a short-term deadline, e.g.
>
> Chemical Markup for Wikimedia Commons
> http://fab.wmflabs.org/project/view/26/ (a GSoC project -- hint, hint)
>
> Analytics-EEVS
> http://fab.wmflabs.org/project/board/15/
>
> Please play with it and provide feedback. Other contributors critical of
> Phabricator are doing this, and it is proving extremely helpful for
> everybody.
>
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Affiliation in username

2014-05-07 Thread Tyler Romeo
One interesting idea might be what Reddit does:

For a moderator of a subreddit, whenever they make a post it just appears
normally. However, after posting they can choose to "officiate" it. All
that does is highlight their username in a different color, indicating they
are acting in their capacity as a moderator rather than as a regular user.

This idea could be applied to edits in core, and maybe to posts in Flow. WMF
employees in a special user group could make an edit, and then press a
button on the history page to mark it as an "official" edit.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, May 7, 2014 at 4:22 PM, Jared Zimmerman <
jared.zimmer...@wikimedia.org> wrote:

> Affiliations change, and user names are quite difficult to change; this
> sounds like something that would be good for a structured profile, not for
> a user name.
>
>
>
> *Jared Zimmerman * \\  Director of User Experience \\ Wikimedia Foundation
>
> M : +1 415 609 4043 | Twitter: @JaredZimmerman <
> https://twitter.com/JaredZimmerman>
>
>
>
> On Sat, Apr 19, 2014 at 4:17 PM, Gryllida  wrote:
>
> > On second thought, do we want to add an optional "affiliation" field to
> > the signup form, so the affiliation goes at the end of the username in
> > parentheses?
> >
> > - DGarry (WMF)
> > - Fred (DesignSolutionsInc)
> > - David (MIT)
> > - ...
> >
> > So the signup form would look like this:
> >
> >  -
> > | |
> > | [ Username preview in large green font ]|
> > | |
> > | Username:   |
> > |  ___|
> > | Password:   |
> > |  ___|
> > | Password 2: |
> > |  ___|
> > | Email (optional):   |
> > |  ___|
> > | Affiliation (optional; if your editing is related to work): |
> > |  ___|
> > | |
> >  -
> >
> > I.e.
> >
> >  -
> > | |
> > | [ "Gryllida (FOO)" in large green font ]|
> > | |
> > | Username:   |
> > |  _Gryllida__|
> > | Password:   |
> > |  ___|
> > | Password 2: |
> > |  ___|
> > | Email (optional):   |
> > |  ___|
> > | Affiliation (optional; if your editing is related to work): |
> > |  _FOO___|
> > | |
> >  -
> >
> > Gryllida.
> >
> >
> > On Sun, 23 Feb 2014, at 1:25, rupert THURNER wrote:
> > > hi,
> > >
> > > could wmf please extend the mediawiki software in the following way:
> > > 1. it should know "groups"
> > > 2. allow users to store an arbitrary number of groups with their
> > > profile
> > > 3. allow to select one of the "group"s joined to an edit when saving
> > > 4. add a checkbox "COI" to an edit, meaning "potential conflict of
> > > interest"
> > > 5. display and filter edits marked with COI in a different color in
> > > history views
> > > 6. display and filter edits done for a group in a different color in
> > > history views
> > > 7. allow members of a group to receive notifications of changes on the
> > > group page, or when a group is mentioned in an edit/comment/talk page

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-16 Thread Tyler Romeo
I feel like the ideal situation would be to:

1) Only allow Phabricator login with a Wikimedia account; and
2) When logging into Wikimedia, allow login with Google, GitHub, etc.

Unfortunately, achieving that means deploying the OpenID
extension, which is definitely not ready yet.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Fri, May 16, 2014 at 1:47 PM, C. Scott Ananian wrote:

> On Fri, May 16, 2014 at 9:23 AM, Quim Gil  wrote:
> > On Friday, May 16, 2014, Petr Bena  wrote:
> >
> >> Yes. Support as many providers as possible, google at least, I
> >> basically don't even want to use any more web services with own login
> >> unless I have to. single login FTW
> >
> > I wonder why a user without a Wikimedia account or a GitHub account would
> > need to log in to Wikimedia Phabricator.
>
> To report or comment on a bug?
>
> To anonymously report an issue?
>  --scott
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-17 Thread Tyler Romeo
On Sat, May 17, 2014 at 2:26 PM, Steven Walling wrote:

> Obviously with Google and Facebook as options we don't
> stand to gain a lot in terms of technical contributions.
>

This isn't necessarily true. I know that I personally would prefer to be
able to log in with my Google account, because it's what I use for
everything.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What should be the recommended / supported way to do skins? (A proposal.)

2014-05-20 Thread Tyler Romeo
On Tue, May 20, 2014 at 5:25 PM, Bartosz Dziewoński wrote:

> tl;dr Let's start putting all skins files in a single directory, and let's
> use a grown-up structure with one class per file + separate init code for
> them. Okay?


I wouldn't consider this change to be truly revolutionary. What would
really be a great restructuring of the skinning system is if I could make a
skin by just writing a couple of HTML templates (probably using Gabriel's
DOM-based templating language), throwing them in a directory, and then
telling MediaWiki where the directory is.
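
To make that concrete, here is a purely hypothetical sketch of what
registering such a skin could look like. Nothing below is an existing
MediaWiki API; the setting name, paths, and template names are all invented:

    <?php
    // Hypothetical: point MediaWiki at a directory of plain HTML templates.
    // $wgSkinTemplateDirs is an invented setting, not a real configuration
    // variable in any MediaWiki release.
    $wgSkinTemplateDirs['mytemplateskin'] = "$IP/skins/MyTemplateSkin/templates";

    // The idea: MediaWiki would discover e.g. page.html, header.html, and
    // footer.html in that directory and register the skin automatically.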

However, staying on the topic that you brought up, I do agree with keeping
skins in a separate directory from extensions. It implies a bit of control
on our part over how skins need to be structured, whereas if they were
extensions, we could not place any requirements on their structure.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Square bounding boxes + typesafe enums

2014-05-21 Thread Tyler Romeo
On Wed, May 21, 2014 at 12:48 PM, Brion Vibber wrote:

> From my read, SplEnum is a bit janky -- you still have to define integer
> constants and may have to pass them in.
>

Also, SplEnum is not built-in. It's a PECL extension, so we'd have to
require users to install it alongside MediaWiki.
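
For what it's worth, plain PHP can get most of the way there without the
extension. A minimal sketch of the usual pattern (the class and value names
are invented for illustration):

    <?php
    // Sketch of a typesafe enum in plain PHP 5.x; no PECL extension needed.
    // Each value is an interned singleton instance, so type hints reject
    // bare integers and strings.
    final class BoxMode {
        private static $values = array();
        private $name;

        private function __construct( $name ) {
            $this->name = $name;
        }

        public static function of( $name ) {
            if ( !isset( self::$values[$name] ) ) {
                self::$values[$name] = new self( $name );
            }
            return self::$values[$name];
        }

        public function getName() {
            return $this->name;
        }
    }

    // Callers must pass the enum type, not a magic constant:
    function renderThumb( BoxMode $mode ) { /* ... */ }
    renderThumb( BoxMode::of( 'square' ) );
    // renderThumb( 3 ); // would be a catchable fatal error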

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adding external libraries to core (SwiftMailer)

2014-05-27 Thread Tyler Romeo
This may be true, but as I mentioned during the IRC RFC discussion on
third-party components, it's a case-by-case decision, and I think in this
case SwiftMailer is a fairly reliable library and definitely much better
than what we have implemented now.
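
For context, typical SwiftMailer 5 usage looks roughly like this (the SMTP
host and addresses are placeholders; this is a sketch of the library's API,
not the proposed MediaWiki integration):

    <?php
    // Sketch of sending a message with SwiftMailer 5; assumes the library
    // is available via a Composer autoloader.
    require_once __DIR__ . '/vendor/autoload.php';

    // Placeholder SMTP endpoint.
    $transport = Swift_SmtpTransport::newInstance( 'smtp.example.org', 25 );
    $mailer = Swift_Mailer::newInstance( $transport );

    $message = Swift_Message::newInstance( 'Password reset' )
        ->setFrom( array( 'wiki@example.org' => 'Example Wiki' ) )
        ->setTo( array( 'user@example.org' ) )
        ->setBody( 'Someone requested a password reset for your account.' );

    // send() returns the number of successful recipients and records
    // failures in the passed array.
    $failedRecipients = array();
    $mailer->send( $message, $failedRecipients );
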
-- 
Tyler Romeo
0xC86B42DF

From: Tony Thomas 01tonytho...@gmail.com
Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
Date: May 27, 2014 at 12:13:16
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject:  Re: [Wikitech-l] Adding external libraries to core (SwiftMailer)  

So, if we can have the swiftmailer repo added, this would be excellent. It
might get worse otherwise, as after this shift, MW can 'only' send emails
through swift. A hard dependency, as you quoted earlier.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deleting config directory confusion

2014-05-31 Thread Tyler Romeo
delete it optionally for purpose X?

You can use the config script to perform database upgrades later on if you
upgrade MediaWiki versions. Most people just run the command-line script
(php maintenance/update.php from the wiki root) instead, but for some
people that is not an option.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new auth/debug/Zero RfCs, + adding more office hours

2014-06-02 Thread Tyler Romeo
* https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication
"With many more entry points and the need for inter-service
authentication, a service-oriented architecture requires a stronger
authentication system."
This is literally the same thing as AuthStack except more generic and without 
any code or plan for implementation.

Also, on the same note, I was told previously that the AuthStack RFC was too 
big and needed to be split up, because I tackled both 
authentication/authorization *and* session management and logout in the same 
proposal. Since this RFC has the same problem, it should be split up 
accordingly.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
I would also like to express my disappointment at third party users being
thrown under the bus once again. Several people have been putting a lot of
effort into supporting third party users, so it really saddens me that this
is dismissed as an irrelevance by some so easily.
Third-party users were not thrown under the bus. Unfortunately, the solution
you are looking for in terms of extension installation is not yet achievable
with the current tools. That is just the unfortunate truth. We are not going
to misuse libraries and hack together MediaWiki just so extension
installation can be *slightly* easier.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
This sort of behaviour towards non-WMF extension developers is
interesting, and if your objective is to alienate volunteer developers (as
with the attitude above), then you're on the right path.

Some people forget that users do not always choose MW because of its
software but because it provides some extensions that people outside
of the WMF cluster find useful.
Considering I *am* a non-WMF extension developer, I don't see how your
argument is relevant.

And as I literally just said in my previous email, the goal was not to
disadvantage third-party extension developers. Composer is simply not meant
to be used in the way it was shoe-horned into MediaWiki. I'm not going to
re-explain this every time, because it has already been covered in multiple
places on Bugzilla and on this mailing list.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-04 Thread Tyler Romeo
Strawman. I happen to have discussed the situation and the approach with
the main people behind Composer in person, as well as gone over details
with contributors on IRC. They did not seem to share your opinion.
Since we’re throwing out logical fallacies: argumentum ab auctoritate.

As I've *already explained*…

Here is the problem: we need composer.json and composer.lock to be version 
controlled so that MediaWiki core can manage its own dependencies rather than 
using git submodules or hard-copying source trees into the repository, which 
everybody agrees are not sustainable solutions. However, third parties want to 
be able to install extensions via Composer, which, while not the purpose of 
Composer, is technically feasible. These two ideas conflict.

Here is the solution: rather than using Composer as a package management system 
when, in reality, it is a dependency management system, we use Composer to 
properly maintain core, and then do one of the following: 1) implement our own 
extension installation system from scratch, or 2) change and/or extend Composer 
so that it supports a plugin system. I personally recommend the latter, and 
there are upstream bug reports open concerning it.
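
To illustrate the conflict, imagine a version-controlled core composer.json
along these lines (the exact package names and versions here are made up for
the example):

    {
        "require": {
            "php": ">=5.3.2",
            "swiftmailer/swiftmailer": "5.1.*",
            "some/parsing-library": "~1.2"
        }
    }

As soon as a third party adds an extension to that same "require" block,
their local composer.json diverges from the one core ships, and every core
upgrade turns into a manual merge. That is exactly why a separate extension
installer, or a proper Composer plugin mechanism, is the cleaner path.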

-- 
Tyler Romeo
0xC86B42DF

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
