Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Brian Wolff
On Jan 14, 2014 8:20 PM, "Nathan Larson"  wrote:
>
> On Tue, Jan 14, 2014 at 6:22 PM, Matthew Flaschen
> wrote:
>
> > That's not the case.  There are components for software the WMF does not
> > use.  This ranges from major projects like Semantic MW to one-off
> > extensions that WMF does not have a use for (e.g. Absentee Landlord) to
> > tools that work *with* MW but are not part of it (e.g. Tools Labs tools,
> > Pywikibot).
>
>
> I guess it would depend on making the scope of the bug broad enough that it
> would seem useful for more than just one site. E.g. one could put
> "implement the functionality needed for Inclupedia". That functionality
> could be reused for any number of sites, just like SMW's code. On the other
> hand, if someone were to say "Switch configuration setting x to true on
> Inclupedia" that would be of little interest to non-Inclupedia users, I
> would think. I assume that Bugzilla is not intended as the place for all
> technical requests for the entire wikisphere?
>

Yeah, I think shell-type requests for non-WMF wikis have traditionally been
out of scope for our Bugzilla (possible exception: AcaWiki). OTOH, I don't
know if anyone has ever really asked to use our Bugzilla in such a manner,
so maybe opinions differ.

-bawolff

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Nathan Larson
On Tue, Jan 14, 2014 at 6:22 PM, Matthew Flaschen
wrote:

> That's not the case.  There are components for software the WMF does not
> use.  This ranges from major projects like Semantic MW to one-off
> extensions that WMF does not have a use for (e.g. Absentee Landlord) to
> tools that work *with* MW but are not part of it (e.g. Tools Labs tools,
> Pywikibot).


I guess it would depend on making the scope of the bug broad enough that it
would seem useful for more than just one site. E.g. one could put
"implement the functionality needed for Inclupedia". That functionality
could be reused for any number of sites, just like SMW's code. On the other
hand, if someone were to say "Switch configuration setting x to true on
Inclupedia" that would be of little interest to non-Inclupedia users, I
would think. I assume that Bugzilla is not intended as the place for all
technical requests for the entire wikisphere?

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Matthew Flaschen

On 01/13/2014 01:59 PM, Nathan Larson wrote:

> I can't exactly post a bug to MediaZilla saying "Create Inclupedia" and
> then have a bunch of different bugs it depends on, because non-WMF projects
> are beyond the scope of MediaZilla.


That's not the case.  There are components for software the WMF does not 
use.  This ranges from major projects like Semantic MW to one-off 
extensions that WMF does not have a use for (e.g. Absentee Landlord) to 
tools that work *with* MW but are not part of it (e.g. Tools Labs tools, 
Pywikibot).


It's true everything on Bugzilla is related to MediaWiki or the WMF in 
some way, though (some more direct than others).


Matt Flaschen



Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-13 Thread Nathan Larson
On Mon, Jan 13, 2014 at 2:27 PM, Brian Wolff  wrote:

> I'd say that http://getwiki.net/-GetWiki:1.0 was similar to your "superset"
> concept (minus the merging part).


Yeah, per WikiIndex, "Instead of red links,
GetWiki uses green links to point to articles which do not exist locally.
When the user follows such a link, GetWiki tries to dynamically fetch it
from the wiki designated as an external source (in Wikinfo's case, the
English Wikipedia), renders and displays the article text. A local copy is
created only if the page is edited. Effectively, Wikinfo therefore provides
a transparent 'wrapper' around Wikipedia pages which have not yet been
copied."
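
A rough sketch of that green-link behavior in Python, using the standard
MediaWiki action API (the local_store mapping here is just a hypothetical
stand-in for the wiki's own page table, not GetWiki's actual code):

import requests

WIKIPEDIA_API = "https://en.wikipedia.org/w/api.php"

def fetch_remote_wikitext(title):
    # Ask the remote wiki (English Wikipedia here) for the current
    # wikitext of a page; return None if the page does not exist there.
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "titles": title,
        "format": "json",
        "formatversion": "2",
    }
    page = requests.get(WIKIPEDIA_API, params=params).json()["query"]["pages"][0]
    if page.get("missing"):
        return None
    return page["revisions"][0]["slots"]["main"]["content"]

def render_page(title, local_store):
    # GetWiki-style lookup: a locally created or forked copy wins;
    # otherwise fall through transparently to the remote wiki.
    if title in local_store:
        return local_store[title]
    return fetch_remote_wikitext(title)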

That sounds pretty easy to implement, compared to what is contemplated for
Inclupedia. If a page had been deleted from Wikipedia, presumably it
wouldn't have been accessible on Wikinfo unless someone had created a fork
of that page on Wikinfo prior to its deletion from Wikipedia. That's a
major piece of Inclupedia (and Deletionpedia) functionality that GetWiki
lacked.

Also, a high-traffic live mirror would be contrary to the live mirrors
policy, as the load on WMF servers would increase at the same rate as the
wiki's readership. A site that only polled WMF servers for changes, and
stored local copies of those changes, would not have that problem. On the
contrary, it might take load off of WMF servers, if some readers were to
retrieve mirrored Wikipedia pages from Inclupedia that they would otherwise
have retrieved from Wikipedia.
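
Something like the following would be the skeleton of that polling approach,
using list=recentchanges from the standard MediaWiki action API (the
store_revision callback and the polling interval are assumptions, not a
settled design):

import time
import requests

WIKIPEDIA_API = "https://en.wikipedia.org/w/api.php"

def poll_recent_changes(since):
    # Ask the remote wiki which pages have changed since the given timestamp.
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcstart": since,
        "rcdir": "newer",
        "rcprop": "title|ids|timestamp",
        "rclimit": "500",
        "format": "json",
        "formatversion": "2",
    }
    return requests.get(WIKIPEDIA_API, params=params).json()["query"]["recentchanges"]

def sync_loop(store_revision, last_seen):
    # store_revision() is a hypothetical callback that writes each change
    # into the mirror's own database; only this one client ever polls the
    # WMF servers, while the mirror's readers are served locally.
    while True:
        for change in poll_recent_changes(last_seen):
            store_revision(change)
            last_seen = change["timestamp"]
        time.sleep(60)  # polling interval is a guess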

The approach used by GetWiki of combining a Wikipedia mirror with locally
stored forks is probably more suitable for either (1) a general
encyclopedia with a non-neutral viewpoint (e.g. a Sympathetic Point of
View), or (2) a site that wishes to have a narrower focus than that of a
general encyclopedia. E.g., if you ran a site like Conservapedia
(conservative bias) or Tampa Bay Wiki (narrow focus), while you were
building up the content, you might want to be able to wikilink to articles
non-existent on your local wiki, such as "Florida", and dynamically pull
content from Wikipedia for users who click on those links. In the case of
Conservapedia, Wikipedia's "Florida" content might be considered better
than nothing, pending the creation of a forked version of that content that
would be biased conservatively. In the case of Tampa Bay Wiki, the content
of the "Florida" article might be sufficient for their purposes, so they
could just keep serving the mirrored content from Wikipedia forever.

If a site were to aspire to be a general encyclopedia with a neutral point
of view, it would be better to discourage or disable, as much as possible,
forking of Wikipedia's articles. Once an article is forked, it will require
duplication of Wikipedians' labor in order to keep the content as
well-written, comprehensive, well-researched, and in general as
high-quality and up-to-date as Wikipedia's coverage of the subject. It
would be better to instead mirror those articles, and have users go to
Wikipedia if they want to edit them. If users edit articles that have been
deleted from Wikipedia, on the other hand, there is no forking going on,
and therefore no duplication of labor. Unnecessary duplication of
Wikipedians' labor tends to demoralize and distract users from building up
complementary content; therefore, NPOV general encyclopedia community
builders should consider it anathema.

GetWiki and Wikinfo don't appear to have prospered. It would seem that
GetWiki was created in 2004 and that Wikinfo abandoned it in March 2007 and
switched to MediaWiki. According to Wikipedia, "In 2013, the content was
removed without explanation". I'm glad I didn't devote too much labor to
creating content on Wikinfo. GetWiki.net itself has had no activity since
2012.

The main problem with an NPOV general encyclopedia's forking Wikipedia is
that, for most people, there's not a very compelling reason to do it. They
deem the quality of the product Wikipedia offers, within the purview of its
coverage (viz. NPOV notable topics), to be good enough that it's not
worthwhile to create a fork. Wikipedia's shortcoming, from the standpoint of
inclusionists, is not so much insufficient quality of articles as
insufficient quantity of topics covered. Forking is more suitable for those
who find the quality insufficient to meet their particular needs.

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-13 Thread Brian Wolff
> People sometimes ask, "Hasn't this already been done?" It would seem that
> it hasn't, which is why so much of the implementing code has to be designed
> and developed rather than borrowed or reverse-engineered. In some ways, the
> closest project to this one may have been the various proprietary sites
> that used the Wikimedia update feed service to stay continuously up-to-date
> with Wikipedia, but to my knowledge none of them used MediaWiki as their
> engine, and their inner workings are a mystery. Those also tended to be
> read-only rather than mass collaborative sites.

I'd say that http://getwiki.net/-GetWiki:1.0 was similar to your "superset"
concept (minus the merging part).

>
> there will inevitably arise completely Inclupedia-specific matters that
> need to be dealt with in a different venue. Presumably, it'll be necessary
> to create a whole new infrastructure of bug reporting, mailing lists, IRC
> channels, etc. But, I want to get it right from the beginning, since this
> is an opportunity to start from scratch (e.g. maybe there is a better code
> review tool than Gerrit?) I have created Meta-Inclu as a venue for project
> coordination.

Be careful here - while it's important to have a bug tracker, etc.,
concentrating too much on support infrastructure and not enough on the
actual issue at hand is a way that new projects sometimes fail. These types
of things also tend not to be needed by small, just-starting-out projects in
the same way that large projects need them. Of course, every project is
different and you are in the best position to evaluate your project's needs.

[Wikitech-l] Inclupedia: Developers wanted

2014-01-13 Thread Nathan Larson
TLDR: I seek fellow developers with whom to collaborate on creating the
largest and most inclusive wiki in the world, Inclupedia.
http://meta.inclumedia.org/wiki/Inclupedia

Inclupedia is a project to make available, on an OpenEdit wiki, pages
deleted from Wikipedia for notability reasons, as well as articles created
from scratch on Inclupedia. Thus, it will combine elements of Deletionpedia
and Wikinfo, as well as the various Wikipedia mirrors. It will be, in other
words, a supplement to, and an up-to-date mirror of, Wikipedia content.

Inclupedia seeks to accomplish the entire inclusionist agenda through
technical means, rather than through a political solution
that would require persuading deletionists and moderates to change their
wiki-philosophies. Inclupedia will have no notability requirements for
articles, and will let people post (almost) whatever they want in
userspace. It will also, however, learn from the failures of the various
Wikipedia forks that could not sustain much activity because they had no
way of becoming a comprehensive encyclopedia without duplicating
Wikipedians' labor.

Complete, seamless, and continuous integration with Wikipedia is required.
That is what will enable Inclupedia to be different from those ill-fated
aspirants to the throne whose abandoned, rotting carcasses now litter the
wikisphere. Inclupedia aspires to be the largest and most inclusive wiki in
the world, since the set of Inclupedia pages (and revisions) will always be
a superset of the set of
Wikipedia pages (and revisions).

People sometimes ask, "Hasn't this already been done?" It would seem that
it hasn't, which is why so much of the implementing code has to be designed
and developed rather than borrowed or reverse-engineered. In some ways, the
closest project to this one may have been the various proprietary sites
that used the Wikimedia update feed service to stay continuously up-to-date
with Wikipedia, but to my knowledge none of them used MediaWiki as their
engine, and their inner workings are a mystery. Those also tended to be
read-only rather than mass collaborative sites.

The core of what needs to be done is (1) developing a bot (or bots) to pull
post-dump data from Wikipedia and push it to Inclupedia, (2) developing the
capability to merge received data into Inclupedia without losing any data
or suffering inconsistencies, e.g. those resulting from collisions with
existing content (as might happen in mirrored page moves whose destinations
already exist on Inclupedia), and (3) developing all the other capabilities
involved in running a site that's both a mirror and a supplement, e.g.
locking mirrored pages against editing by Inclupedians (unless there will
be forking/overriding capability).
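
To make item (2) concrete, here is a minimal, hypothetical sketch of one
collision check: before applying a mirrored page move, see whether the
destination title is already occupied by a locally created page (the
local_pages/mirrored_pages mappings and the rename convention are
assumptions, not a settled design):

def apply_mirrored_move(old_title, new_title, local_pages, mirrored_pages):
    # local_pages and mirrored_pages stand in for Inclupedia's own tables,
    # mapping titles to page records.
    if new_title in local_pages:
        # Collision: a local (non-mirrored) page already sits at the target.
        # One possible policy: shift the local page to a disambiguated title
        # rather than overwrite it, so no data is lost.
        local_pages[new_title + " (Inclupedia)"] = local_pages.pop(new_title)
    # Now the mirrored page can follow Wikipedia's move.
    mirrored_pages[new_title] = mirrored_pages.pop(old_title)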

I can't exactly post a bug to MediaZilla saying "Create Inclupedia" and
then have a bunch of different bugs it depends on, because non-WMF projects
are beyond the scope of MediaZilla. Some bugs (e.g. bug
59618)
concerning Inclupedia-reliant functionality are already in MediaZilla, but
there will inevitably arise completely Inclupedia-specific matters that
need to be dealt with in a different venue. Presumably, it'll be necessary
to create a whole new infrastructure of bug reporting, mailing lists, IRC
channels, etc. But I want to get it right from the beginning, since this is
an opportunity to start from scratch (e.g. maybe there is a better code
review tool than Gerrit?). I have created Meta-Inclu as a venue for project
coordination.

Mostly, I would like help with design decisions, code review, etc. It's
such a big project that it seems almost overwhelming to contemplate doing
it singlehandedly, but it's probably doable if there are a few people involved
who can bounce ideas off one another, provide moral support, etc. So, if
you are interested, feel free to email back or create an account at
Meta-Inclu, and we can begin discussing the details of implementation.
http://meta.inclumedia.org/

If there were to be insufficient volunteer support for implementing this
wiki, then the next step might be to try to get funding to pay developers.
There's no guarantee that such funding would be obtainable, though, or that
it wouldn't come with significant strings attached that would conflict
with the basic principles and vision of the site, or lead to a lot of (what
I might consider) undesirable technical decisions being made. But we do
what we have to do to make what we are passionate about a reality, to the
extent that's possible given the resources at hand. Thanks,

-- 
Nathan Larson 