[Wikitech-l] (no subject)

2021-08-18 Thread Siddhi Bhanushali
Can I get the GitHub link for Wikimedia?
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

Re: [Wikitech-l] (no subject)

2021-03-11 Thread Soham Parekh
Hi Yasharth,

We're excited about your interest in Wikimedia for this year's Google Summer of
Code. Vidhi Mody and I will be mentoring this project. You can reach
out to us on Zulip. We will be creating a separate stream for Google Summer
of Code 2021 soon, and we can take it from there.

Regards,
Soham Parekh




Re: [Wikitech-l] (no subject)

2021-03-11 Thread Vidhi Mody
Hi Yasharth,

We're excited about your interest in Wikimedia for this year's Google Summer of
Code. Soham Parekh and I will be mentoring this project. You can reach
out to us on Zulip. We will be creating a separate stream for Google Summer
of Code 2021 soon, and we can take it from there. I'll leave the link here
for your reference: https://m.mediawiki.org/wiki/Outreach_programs/Zulip

Regards,
Vidhi Mody



[Wikitech-l] (no subject)

2021-03-10 Thread Yash Dubey
Hi all, I am Yasharth Dubey, a second-year undergraduate studying CSE at
IIIT Dharwad, India. I have good experience in web development and would
like to contribute; I look forward to participating in GSoC 2021 with this
organization.

I am very much interested in the project
Upgrade WebdriverIO to the latest version for all repositories
and I look forward to working with you for a long time.


[Wikitech-l] (no subject)

2021-02-26 Thread ธิชาเมฆภัทร มหิดล
Thichapat Somjai
Thailand


[Wikitech-l] (no subject)

2020-12-24 Thread KKIO OPOP
OK


[Wikitech-l] (no subject)

2020-08-09 Thread public bot
94107

Re: [Wikitech-l] (no subject)

2019-12-16 Thread Bartosz Dziewoński

On 2019-12-16 10:46, Egbe Eugene wrote:

After writing to the VE surface from a Gadget, the page saves with just
empty '' tags with no text. I could be missing something in my
config for VE or use of the model. Please can I get a solution for this?

The sample data and the segment which writes to the model is found here[1].
[1] https://etherpad.wikimedia.org/p/sample_VE_model


Each item in the data array must be one character. I don't know how this 
ends up with '', but I get various exceptions when trying to 
type (or do anything with the editor) after running your snippet, so 
clearly something breaks terribly.


The corrected version would be like this:

var data = [
{type: 'mwHeading', attributes: {level: 2}},
'H', 'e', 'a', 'd', 'i', 'n', 'g', ' ', '2',
{type: '/mwHeading'}
];

Or, more convenient to write (using the "spread syntax" for arrays):

var data = [
{type: 'mwHeading', attributes: {level: 2}},
...'Heading 2'.split( '' ),
{type: '/mwHeading'}
];



--
Bartosz Dziewoński


[Wikitech-l] (no subject)

2019-12-16 Thread Egbe Eugene
Hi All,

After writing to the VE surface from a Gadget, the page saves with just
empty '' tags and no text. I could be missing something in my VE config
or in my use of the model. Can anyone suggest a solution?

The sample data and the segment that writes to the model are found here[1].


--
Eugene233

[1] https://etherpad.wikimedia.org/p/sample_VE_model

[Wikitech-l] (no subject)

2019-09-25 Thread WEILUN TEOH
gmane.org/

[Wikitech-l] (no subject)

2019-08-18 Thread YAZED sd
Create an email message

[Wikitech-l] (no subject)

2019-08-18 Thread YAZED sd


[Wikitech-l] (no subject)

2019-07-04 Thread Eduardo Luke


[Wikitech-l] (no subject)

2019-05-11 Thread Aayush Sharma


[Wikitech-l] (no subject)

2019-05-04 Thread sellyrachmayanti24
 






[Wikitech-l] (no subject)

2018-10-26 Thread יגאל חיטרון
Hello. Please pay attention to https://phabricator.wikimedia.org/T208074.
It looks serious: some interwiki links do not appear at all, across all
Wikipedia projects.
Igal (User:IKhitron)

[Wikitech-l] (no subject)

2018-10-25 Thread Liza Rustanto
-- 
*FUSION ENTREPRISES GROUPS PTY LTD*
2/14 Bloomfield Avenue
MARIBYRNONG  VIC  3032
Phone:  (+61) 448 86 96 46
Fax:  (+61)3 8679 0330
Email:  i...@fusionentreprises.com.co

Re: [Wikitech-l] (no subject)

2018-08-08 Thread Brion Vibber
I wanted to comment quickly on one thing that was called out.

Closing "valid tasks" may be appropriate depending on the task and the
context. Closed is a valid state for a task and may well be most
appropriate.

There is a reason for this, and I want to be clear:

Tasks are not isolated platonic constructs; they are tools for people to
use in their work on software. If a task and the discussion on it is
unconstructive, then closing it sounds fine.


Now that's just off the top of my head; I'm unfamiliar with the particular
case you presumably are citing.

But again I have to stress that this is not a hobby project, this is a
working environment for dozens of people building and maintaining the
software that the Wikimedia community has entrusted them with.

That's not to say volunteers aren't welcome: rather, that's to say that
volunteers are expected to behave themselves just as we are when they
choose to work alongside us.

Not sure about the language references; they don't seem relevant to the
patterns of behavior under discussion.

-- brion


Re: [Wikitech-l] (no subject)

2018-08-08 Thread MZMcBride
Brion Vibber wrote:
>I would advise you generally to treat wikitech-l like a professional
>workspace, which it is for those of us who are employees of WMF or WMDE.

I think there's a big schism that you're pointing to here: why do you
think it's appropriate for you or anyone else to impose your particular
U.S. workplace standards on a global community of volunteers? For many
users, wikitech-l, Phabricator, and other venues are specifically not
workplaces, they're places for technical work on hobby projects.

>If your corporate HR department would frown at you making a statement
>about people's character or motivations with derogatory language, think
>twice about it. Treat people with respect.

Sure, treat people with respect. As a colleague of Greg Varnum, have you
privately messaged him to let him know that closing valid tasks is
inappropriate? Have you told him that gaslighting users into thinking that
an obvious bug is an intentional design choice is unacceptable behavior?
Have you discussed with Greg V. that un-assigning yourself from a task and
attempting to orphan it is bad practice?

Or are you simply focused on trying to shame and silence volunteers who
are critical of Wikimedia Foundation Inc. employees?

Regarding the word fuck generally, I've been here long enough to remember
commits such as .
There are also many commits and tasks that use similar language. As the
English Wiktionary notes, "what the fuck" is a common interjection:
. I do not
think it's a phrase that should be commonly used on Phabricator, but at
times it can be appropriate, _as your code commit from 2008 notes_, to
underscore the severity of a particular issue. What Greg did was really
bad and is compounded, in my opinion, by his continued silence and the
lack of resolution to the issue of German text appearing on an English
landing page. Saying what Greg V. did was confusing and bad, even
forcefully, is not the real issue here.

For what it's worth, here's Daniel using the same language in 2016:
. And MatmaRex using
the same language in the same year:
. A quick search of Phabricator
for "what the fuck", "fuck", or "wtf" shows that none of them are rare.

MZMcBride




Re: [Wikitech-l] (no subject)

2018-08-08 Thread Brion Vibber
Hi Dennis.

I would advise you generally to treat wikitech-l like a professional
workspace, which it is for those of us who are employees of WMF or WMDE.

If your corporate HR department would frown at you making a statement about
people's character or motivations with derogatory language, think twice
about it. Treat people with respect.

Thanks!

-- brion



[Wikitech-l] (no subject)

2018-08-08 Thread Dennis During
What kind of snowflake environment is this? Or do we use the alleged
presence of snowflakes as a weapon against opposing views?


>
>

[Wikitech-l] (no subject)

2018-06-17 Thread Liza Rustanto
-- 
FUSION ENTREPRISES GROUPS PTY LTD
2/14 Bloomfield Avenue
MARIBYRNONG VIC  3032
Phone:  (+61) 448 86 96 46
Fax:  (+61)3 8679 0330
Email:  i...@fusionentreprises.com.co

[Wikitech-l] (no subject)

2018-06-10 Thread Liza Rustanto
-- 
FUSION ENTREPRISES GROUPS PTY LTD
2/14 Bloomfield Avenue
MARIBYRNONG VIC  3032
Phone:  (+61) 448 86 96 46
Fax:  (+61)3 8679 0330
Email:  i...@fusionentreprises.com.co

[Wikitech-l] (no subject)

2018-06-01 Thread Liza Rustanto
-- 
FUSION ENTREPRISES GROUPS PTY LTD
2/14 Bloomfield Avenue
MARIBYRNONG VIC  3032
Phone:  (+61) 448 86 96 46
Fax:  (+61)3 8679 0330
Email:  i...@fusionentreprises.com.co

[Wikitech-l] (no subject)

2018-05-14 Thread Liza Rustanto
-- 
Liza Rustanto

[Wikitech-l] (no subject)

2018-01-12 Thread Sonali Patro
Hi Everyone,

I am an undergraduate student from India, and I plan on taking part in
Google Summer of Code through your organization. I am interested in
frontend development, and I also like working in C++. Could you please help
me get started?

Regards,

Sonali Patro

Contact: 00917381809123/0097455579707

[Wikitech-l] (no subject)

2016-03-27 Thread イノウエナオキ


[Wikitech-l] (no subject)

2016-02-16 Thread Josephine Lim
Hi all,
I've been working on a project to improve the categorization of
pictures in the Upload to Commons Android app
 as part of the Outreachy
Dec '15 program, which is soon drawing to an end. To summarize, 3 new
features have been implemented in this app:

1. If a picture with geolocation is uploaded, nearby category
suggestions are offered (based on the categories of other Commons
images with similar coordinates)

2. If a picture with no geolocation is uploaded, nearby category
suggestions are offered based on the user's current location. This is
optional and only works if enabled in Settings.

3. Category search (when typing in the search field) has been made
more flexible, whereas previously this was done solely by prefix
search. E.g. now searching for 'latte' should be able to return 'iced
latte'.
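
As a hypothetical illustration of that difference (not the app's actual
code; the category names and function names here are made up), a prefix
search only matches from the start of a category name, while a substring
search also matches in the middle:

```javascript
// Illustrative sketch only: prefix vs. substring category matching.
const categories = ['Iced latte', 'Latte art', 'Latte'];

// Old behaviour: match only from the start of the name.
function prefixSearch(query, cats) {
  const q = query.toLowerCase();
  return cats.filter((c) => c.toLowerCase().startsWith(q));
}

// New behaviour: match anywhere in the name, so 'latte' finds 'Iced latte'.
function substringSearch(query, cats) {
  const q = query.toLowerCase();
  return cats.filter((c) => c.toLowerCase().includes(q));
}

console.log(prefixSearch('latte', categories));    // [ 'Latte art', 'Latte' ]
console.log(substringSearch('latte', categories)); // all three names
```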

The latest version of the app is v1.11 and can be downloaded at
.
Please feel free to leave feedback or bug reports at
.

I have had an amazing time working on this app as part of the
Outreachy program, and I greatly appreciate all the support and help
that the WMF community has given me. :)


Cheers!


-- 

Regards,
Josephine

[Wikitech-l] (no subject)

2016-01-28 Thread イノウエナオキ


[Wikitech-l] (no subject)

2015-10-08 Thread Purodha Blissenbach

In a simple message-text typo fix:
https://gerrit.wikimedia.org/r/#/c/243332/
the PHPUnit test in the job mwext-testextension-zend failed with these
messages:

+ php phpunit.php --with-phpunitdir 
/srv/deployment/integration/phpunit/vendor/phpunit/phpunit --log-junit 
/mnt/jenkins-workspace/workspace/mwext-testextension-zend/log/junit-phpunit-allexts.xml 
--testsuite extensions

18:43:29 You MUST install Composer dependenciesRecording test results
18:43:29 ERROR: Publisher 'Publish JUnit test result report' failed: No 
test report files were found. Configuration error?

18:43:29 [PostBuildScript] - Execution post build scripts.
18:43:29

I have no clue what to do to fix it.

Purodha



Re: [Wikitech-l] (no subject)

2015-10-08 Thread Antoine Musso
On 08/10/2015 21:04, Purodha Blissenbach wrote:
> In a simpe message text typo fix:
> https://gerrit.wikimedia.org/r/#/c/243332/
> in the job
> mwext-testextension-zend
> the test
> PHPUnit
> failed with these messages:
> 
> + php phpunit.php --with-phpunitdir
> /srv/deployment/integration/phpunit/vendor/phpunit/phpunit --log-junit
> /mnt/jenkins-workspace/workspace/mwext-testextension-zend/log/junit-phpunit-allexts.xml
> --testsuite extensions
> 18:43:29 You MUST install Composer dependenciesRecording test results
> 18:43:29 ERROR: Publisher 'Publish JUnit test result report' failed: No
> test report files were found. Configuration error?
> 18:43:29 [PostBuildScript] - Execution post build scripts.
> 18:43:29
> 
> I have no clue what do to fix it.

Hello,

The tests were broken by the previous human-made change:
https://gerrit.wikimedia.org/r/#/c/136152/

That change was not passing tests and was force-merged. The end result is
that the extension tests are now broken.

The error message is missing a newline and should read as:

 You MUST install Composer dependencies


From composer.json that means:

 guzzlehttp/guzzle: ~3.8
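
Spelled out as a composer.json fragment, that requirement would look
something like this (an illustrative sketch, not the extension's full file):

```json
{
    "require": {
        "guzzlehttp/guzzle": "~3.8"
    }
}
```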


The Jenkins job mwext-testextension-zend does not install Composer
dependencies; instead it clones mediawiki/vendor.git, which is the repo
holding dependencies for the Wikimedia cluster.

So I guess we need to change it to mwext-testextension-zend-composer, which
does run Composer.

It is definitely worth a task in Phabricator for history purposes. But
the change itself is straightforward.


-- 
Antoine "hashar" Musso



Re: [Wikitech-l] (no subject)

2015-10-08 Thread Antoine Musso

Purodha kindly filed a task in Phabricator:

  https://phabricator.wikimedia.org/T115061

:)

-- 
Antoine "hashar" Musso



[Wikitech-l] (no subject)

2015-02-19 Thread Angela lum neh
Hello everyone, I'm Angela. I'm happy to be part of this mailing list.

Re: [Wikitech-l] (no subject)

2015-02-19 Thread Ryan Kaldari
Welcome to the list Angela! Let us know if there's anything we can help
with. Also you may want to join the IRC channels #wikimedia-dev and
#wikimedia-tech. Cheers!

Ryan Kaldari


[Wikitech-l] (no subject)

2015-01-02 Thread Gaurav Singhal
Mail received

[Wikitech-l] (no subject)

2014-07-24 Thread Thomas Mulhall
Hi, should we upgrade jQuery UI to version 1.10.4? Even though we recently
upgraded to version 1.9.2, we could upgrade to 1.10.4 in MediaWiki 1.25. The
main difference is that it removes Internet Explorer 6 support; as long as
Internet Explorer 6 users can still edit and view pages, it won't matter to
them. Here is the changelog: jqueryui.com/changelog/1.10.0/

[Wikitech-l] (no subject)

2014-07-10 Thread Thomas Mulhall
Hi, we are upgrading jQuery Cookie from an early alpha of version 1.1 to 1.2.
Please start updating your code to be compatible with jQuery Cookie 1.2. There
is just one deprecation to note: $.cookie('foo', null) is now deprecated; use
$.removeCookie('foo') to delete a cookie instead. We are upgrading slowly
towards version 1.4.1, one step at a time, because it is a major change and
removes a lot of things.
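
A minimal sketch of the migration, using a tiny stand-in object for the
plugin API (the real plugin reads and writes document.cookie in a browser;
the fake cookie "jar" here is only for illustration):

```javascript
// Stand-in for the jQuery Cookie plugin API, for illustration only.
const jar = {};
const $ = {
  cookie(name, value) {
    if (value === undefined) return jar[name];        // read
    if (value === null) { delete jar[name]; return; } // 1.1 delete form, deprecated in 1.2
    jar[name] = value;                                // write
  },
  removeCookie(name) { delete jar[name]; },           // replacement added in 1.2
};

$.cookie('foo', 'bar');
// Deprecated since 1.2:  $.cookie('foo', null);
$.removeCookie('foo');                                // preferred from 1.2 on
console.log($.cookie('foo')); // undefined
```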

Re: [Wikitech-l] (no subject)

2014-07-10 Thread Bartosz Dziewoński
You should resend this email from a different address and with a subject;
Gmail thinks it's spam, so almost no one will ever see it.

-- Matma Rex



Re: [Wikitech-l] (no subject)

2014-07-10 Thread Gerard Meijssen
Hoi,
I use GMAIL and I see it plenty fine.
Thanks,
GerardM



Re: [Wikitech-l] (no subject)

2014-07-10 Thread Tyler Romeo
On Thu, Jul 10, 2014 at 5:59 AM, Gerard Meijssen gerard.meijs...@gmail.com
wrote:

 I use GMAIL and I see it plenty fine.


I also use Gmail, and it says the only reason it wasn't sent to spam is that
I have a filter sending all wikitech emails to my inbox.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] (no subject)

2014-07-10 Thread Gerard Meijssen
Hoi,
no filters for me..
Thanks,
 GerardM




Re: [Wikitech-l] (no subject)

2014-07-10 Thread Petr Kadlec
On Thu, Jul 10, 2014 at 2:32 PM, Gerard Meijssen gerard.meijs...@gmail.com
wrote:

 Hoi,
 no filters for me..
 Thanks,

 GerardM


Congratulations, we are all happy to hear that. Just FYI: The subjectless
e-mail fell into the spam folder for me at Gmail, too. And I guess we could
all agree sending e-mails with no subject to a large mailing list is just
not a great idea anyway. So… do we need to spend any more e-mails debating
this?

Thanks.
-- [[cs:User:Mormegil | Petr Kadlec]]

[Wikitech-l] (no subject)

2014-03-09 Thread Maduranga Siriwardena
Hi all,
I'm Maduranga Siriwardena, a third-year Computer Science and Engineering
undergraduate at the University of Moratuwa, Sri Lanka. I'm currently doing
my internship at WSO2, an open-source middleware company.

While going through the project ideas, the idea "A system for reviewing
funding requests" impressed me. But as I'm new to the Wikimedia projects, I
don't have much idea how to proceed. Please describe the requirements of
the project and anything else needed to get started.


Thank you
-- 
Maduranga Siriwardena
Undergraduate
University of Moratuwa, Faculty of Engineering
Department of Computer Science and Engineering

Re: [Wikitech-l] (no subject)

2014-01-21 Thread Randall Farmer
 For external uses like XML dumps integrating the compression
 strategy into LZMA would however be very attractive. This would also
 benefit other users of LZMA compression like HBase.

For dumps or other uses, 7za -mx=3 / xz -3 is your best bet.

That has a 4 MB buffer, compression ratios within 15-25% of
current 7zip (or histzip), and goes at 30MB/s on my box,
which is still 8x faster than the status quo (going by a 1GB
benchmark).
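
For reference, the xz invocation being discussed looks like this (a sketch;
the file name is a stand-in for a real dump):

```shell
# Create a stand-in for a dump file, then compress it with preset -3,
# the ~4 MB-dictionary setting discussed above. -k keeps the original.
printf '<page>example revision text</page>\n' > sample-dump.xml
xz -3 -k sample-dump.xml
# Decompress to stdout to verify round-tripping:
xz -dc sample-dump.xml.xz
```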

Trying to get quick-and-dirty long-range matching into LZMA isn't
feasible for me personally and there may be inherent technical
difficulties. Still, I left a note on the 7-Zip boards as folks
suggested; feel free to add anything there:
https://sourceforge.net/p/sevenzip/discussion/45797/thread/73ed3ad7/

Thanks for the reply,
Randall



On Tue, Jan 21, 2014 at 2:19 PM, Randall Farmer rand...@wawd.com wrote:

  For external uses like XML dumps integrating the compression
  strategy into LZMA would however be very attractive. This would also
  benefit other users of LZMA compression like HBase.

 For dumps or other uses, 7za -mx=3 / xz -3 is your best bet.

 That has a 4 MB buffer, compression ratios within 15-25% of
 current 7zip (or histzip), and goes at 30MB/s on my box,
 which is still 8x faster than the status quo (going by a 1GB
 benchmark).

 Re: trying to get long-range matching into LZMA, first, I
 couldn't confidently hack on liblzma. Second, Igor might
 not want to do anything as niche-specific as this (but who
 knows!). Third, even with a faster matching strategy, the
 LZMA *format* seems to require some intricate stuff (range
 coding) that be a blocker to getting the ideal speeds
 (honestly not sure).

 In any case, I left a note on the 7-Zip boards as folks have
 suggested: 
 https://sourceforge.net/p/sevenzip/discussion/45797/thread/73ed3ad7/

 Thanks for the reply,
 Randall



[Wikitech-l] (no subject)

2013-07-31 Thread Tyler Romeo
Hey all,

Mozilla made an announcement yesterday about a new framework called Minion:

http://blog.mozilla.org/security/2013/07/30/introducing-minion/
https://github.com/mozilla/minion

It's an automated security testing framework for use in testing web
applications. I'm currently looking into how to use it. Would there be any
interest in setting up such a framework for automated security testing of
MediaWiki?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] (no subject)

2013-07-31 Thread Ori Livneh
On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Hey all,

 Mozilla made an announcement yesterday about a new framework called Minion:

 http://blog.mozilla.org/security/2013/07/30/introducing-minion/
 https://github.com/mozilla/minion

 It's an automated security testing framework for use in testing web
 applications. I'm currently looking into how to use it. Would there be any
 interest in setting up such a framework for automated security testing of
 MediaWiki?


Looks interesting! Sounds like something for the QA list:
https://lists.wikimedia.org/mailman/listinfo/qa

Re: [Wikitech-l] (no subject)

2013-07-31 Thread Chris Steipp
On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo tylerro...@gmail.com wrote:
 Hey all,

 Mozilla made an announcement yesterday about a new framework called Minion:

 http://blog.mozilla.org/security/2013/07/30/introducing-minion/
 https://github.com/mozilla/minion

 It's an automated security testing framework for use in testing web
 applications. I'm currently looking into how to use it. Would there be any
 interest in setting up such a framework for automated security testing of
 MediaWiki?

I'm definitely interested in seeing if we can leverage something like
this. I'm not sure where it would fit alongside our current automated
testing, but I think it would be valuable to at least take a closer
look. And it's nice to see they're supporting ZAP and skipfish,
although unless they allow for more detailed configurations, both take
ages to completely scan a MediaWiki install.

If you get it running, please share your experience.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2016
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


Re: [Wikitech-l] (no subject)

2013-07-31 Thread Tyler Romeo
OK, so after a bit of trouble I managed to get it working on my Vagrant
instance.

Here's a brief summary of what I learned:
* It uses a MongoDB backend with Python and Flask as a front-end
* There are plugins that implement certain tests (e.g., nmap, skipfish)
* Plans are combinations of plugins, basically a test plan
* Sites are added into groups, and are then assigned plans
* Finally, you run plans on the frontend and they're run by a celery job
queue
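
Schematically, those relationships could be modeled like this (a toy sketch of my own, not Minion's actual API or schema):

```python
from dataclasses import dataclass, field

@dataclass
class Plan:
    """A plan is a named combination of plugins, i.e. a test plan."""
    name: str
    plugins: list  # ordered plugin names, e.g. ["nmap", "skipfish"]

@dataclass
class Group:
    """Sites are added into groups, and groups are assigned plans."""
    name: str
    sites: list = field(default_factory=list)
    plans: list = field(default_factory=list)

def expand_jobs(group):
    """Flatten a group into (site, plugin) jobs for a job queue
    (celery, in Minion's case) to execute."""
    return [(site, plugin)
            for site in group.sites
            for plan in group.plans
            for plugin in plan.plugins]
```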

From the looks of it, I don't think this would be particularly useful for
individual developers, because many of the tests require a full TLS setup
and whatnot.

What might be useful is to have a security instance running MediaWiki with
a similar setup to the actual en-wiki, and then have Minion running on an
instance and have it run the tests that way. Unfortunately, I don't know
how we would manage users (since it doesn't have LDAP integration) or when
we would run these tests (I'd imagine there wouldn't be a need to run them
on every change).

Thoughts?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Wed, Jul 31, 2013 at 2:39 PM, Chris Steipp cste...@wikimedia.org wrote:

 On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo tylerro...@gmail.com
 wrote:
  Hey all,
 
  Mozilla made an announcement yesterday about a new framework called
 Minion:
 
  http://blog.mozilla.org/security/2013/07/30/introducing-minion/
  https://github.com/mozilla/minion
 
  It's an automated security testing framework for use in testing web
  applications. I'm currently looking into how to use it. Would there be
 any
  interest in setting up such a framework for automated security testing of
  MediaWiki?

 I'm definitely interested in seeing if we can leverage something like
 this. I'm not sure where it would fit alongside our current automated
 testing, but I think it would be valuable to at least take a closer
 look. And it's nice to see they're supporting ZAP and skipfish,
 although unless they allow for more detailed configurations, both take
 ages to completely scan a MediaWiki install.

 If you get it running, please share your experience.

  *-- *
  *Tyler Romeo*
  Stevens Institute of Technology, Class of 2016
  Major in Computer Science
  www.whizkidztech.com | tylerro...@gmail.com



Re: [Wikitech-l] (no subject)

2013-07-31 Thread Greg Grossmeier
quote name=Tyler Romeo date=2013-07-31 time=16:21:50 -0400
 What might be useful is to have a security instance running MediaWiki with
 a similar setup to the actual en-wiki, and then have Minion running on an
 instance and have it run the tests that way. Unfortunately, I don't know
 how we would manage users (since it doesn't have LDAP integration) or when
 we would run these tests (I'd imagine there wouldn't be a need to run them
 on every change).

Tyler: mind reporting this as an enhancement bug in deployment-prep?
Include things like what is needed to get it working etc.

Might be something we could get running against the beta cluster,
perhaps.

Greg

-- 
| Greg Grossmeier    GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg    A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] (no subject)

2013-07-31 Thread Tyler Romeo
On Wed, Jul 31, 2013 at 5:00 PM, Greg Grossmeier g...@wikimedia.org wrote:

 Tyler: mind reporting this as an enhancement bug in deployment-prep?
 Include things like what is needed to get it working etc.

 Might be something we could get running against the beta cluster,
 perhaps.


Sure thing: https://bugzilla.wikimedia.org/show_bug.cgi?id=52354

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

[Wikitech-l] Encoding subject encoding bike shed

2013-03-21 Thread Mark A. Hershberger
On 03/21/2013 11:45 AM, Luke Welling WMF wrote:
 On the email title sidetrack, it should not create a 4th way.

The pedant in me says there are at least two more ways -- different
capitalization for UTF-8.  But your subject line shows another way.

My client displays all of the subjects the same.

Jasper,
Mine:  =?utf-8?q?Gerrit_Wars=E2=84=A2=3A_a_strategy-guide?=
Yours: =?windows-1252?q?Gerrit_Wars=99=3A_a_strategy-guide?=
MZ's:  Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
Ori:   Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide

Maybe mailman doesn't understand when the encoding doesn't start at the
first character since those are the ones that don't display correctly.

-- 
http://hexmode.com/

[We are] immortal ... because [we have] a soul, a spirit capable of
   compassion and sacrifice and endurance.
-- William Faulkner, Nobel Prize acceptance speech


Re: [Wikitech-l] Encoding subject encoding bike shed

2013-03-21 Thread Luke Welling WMF
Heh, if clients randomly change character sets then I guess there are a
very large number of possible values.

Given that RFC 2047 came out in 1996, it's reasonable for people to use
non-ASCII characters in subject lines: the means to do it in a
compatible way has been around for 17 years.

Luke


On Thu, Mar 21, 2013 at 12:04 PM, Mark A. Hershberger m...@everybody.orgwrote:

 On 03/21/2013 11:45 AM, Luke Welling WMF wrote:
  On the email title sidetrack, it should not create a 4th way.

 The pedant in me says there are at least two more ways -- different
 capitalization for UTF-8.  But your subject line shows another way.

 My client displays all of the subjects the same.

 Jasper,
 Mine:  =?utf-8?q?Gerrit_Wars=E2=84=A2=3A_a_strategy-guide?=
 Yours: =?windows-1252?q?Gerrit_Wars=99=3A_a_strategy-guide?=
 MZ's:  Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a strategy-guide
 Ori:   Gerrit =?utf-8?Q?Wars=E2=84=A2=3A_?=a strategy-guide

 Maybe mailman doesn't understand when the encoding doesn't start at the
 first character since those are the ones that don't display correctly.

 --
 http://hexmode.com/

 [We are] immortal ... because [we have] a soul, a spirit capable of
compassion and sacrifice and endurance.
 -- William Faulker, Nobel Prize acceptance speech


[Wikitech-l] (no subject)

2012-04-12 Thread Leonard Wallentin

 
http://mouseworker.com/test.old/wp-content/plugins/extended-comment-options/epffs.html?ezaj=aak.maampzazar=efz.aamaazmzaf=vuem
 


[Wikitech-l] (no subject)

2012-04-04 Thread Valentina Faussone
http://schools18.com/wp-content/plugins/extended-comment-options/tpfvk.html


[Wikitech-l] (no subject)

2012-04-04 Thread Valentina Faussone
http://beta2.thenextlevelinc.ca/components/com_virtuemart/shop_image/ps_image/epay_images/fjgvkd.html


[Wikitech-l] (no subject)

2012-04-03 Thread Valentina Faussone
http://ryanwharvey.com/tommy.old/wp-content/plugins/extended-comment-options/jrklre.html


[Wikitech-l] (no subject)

2012-04-02 Thread Valentina Faussone
http://billheidinger.com/wp-content/plugins/extended-comment-options/02efpk.html


[Wikitech-l] (no subject)

2011-09-16 Thread William Nelson
http://sellmydiamonds.com/administrator/components/com_rsform/tmp/arccre.htm


[Wikitech-l] (no subject)

2011-09-16 Thread William Nelson
http://zoefrench.com/wp-content/themes/victorian/img/clreefg.htm


[Wikitech-l] (no subject)

2011-09-12 Thread William Nelson
http://insideoutsidebendigo.stewartmedialabs.com/wp-content/plugins/extended-comment-options/cndkk.htm


[Wikitech-l] (no subject)

2010-03-08 Thread William Nelson



  


[Wikitech-l] (no subject)

2010-03-08 Thread William Nelson
3161 w.cheryl dr phoenix az


  


Re: [Wikitech-l] (no subject)

2010-03-08 Thread Thomas Dalton
Do you have a reason for sending a blank email and then an email with
a random address in it to wikitech-l?

On 8 March 2010 22:12, William Nelson eyelikepi...@yahoo.com wrote:
 3161 w.cheryl dr phoenix az







[Wikitech-l] (no subject)

2010-01-28 Thread 李琴
Hi all,
  I have built a local wiki. Now I want its data to stay consistent with
Wikipedia, so one task I have to do is fetch the updates from Wikipedia.
I get the URLs by parsing the recent-changes RSS feed
(http://zh.wikipedia.org/w/index.php?title=Special:%E6%9C%80%E8%BF%91%E6%9B%B4%E6%94%B9&feed=rss)
and then get the full HTML content of the edit box by following each URL
and clicking 'edit this page'.
(e.g.
http://zh.wikipedia.org/w/index.php?title=%E8%B2%A1%E7%A5%9E%E5%88%B0_(%E9%81%8A%E6%88%B2%E7%AF%80%E7%9B%AE)&diff=12199398&oldid=prev
and its edit interface is
http://zh.wikipedia.org/w/index.php?title=%E8%B2%A1%E7%A5%9E%E5%88%B0_(%E9%81%8A%E6%88%B2%E7%AF%80%E7%9B%AE)&action=edit
.)  However, I encounter two problems during this work.
Firstly, sometimes I cannot open a URL taken from the RSS feed, and I
don't know why. Is that because I visit too frequently and my IP address
has been blocked, or because the network is too slow?
  If the reason is the former, how often may I visit a page of
Wikipedia? Is there a rate limit?
Secondly, as mentioned above, I want to download all the HTML of the
edit-box content from Wikipedia; sometimes this works, but other times I
can only download part of it. What is the reason?

Thanks

vanessa

Re: [Wikitech-l] (no subject)

2010-01-28 Thread Tei
On 28 January 2010 15:06, 李琴 q...@ica.stc.sh.cn wrote:
 Hi all,
  I have  built a LocalWiki.   Now I want the data of it to keep consistent
 with the
 Wikipedia and one work I should do is to get the data of update from
 Wikipedia.
 I get the URLs through analyzing the RSS
 (http://zh.wikipedia.org/w/index.php?title=Special:%E6%9C%80%E8%BF%91%E6%9B%B4%E6%94%B9feed=rss)
 and get all HTML content of the edit box by analyzing
 these URLs after opening an URL and clicking the ’edit this page’.

 That’s because I visit it too frequently and my IP address is prohibited
 or the network is too slow?

李琴, well... that's web scraping, which is a fragile technique, one with
lots of errors that generates lots of traffic.

One thing a robot must do is read and follow the
http://zh.wikipedia.org/robots.txt file (you should probably read it
too).
As a general rule of the Internet, a rude robot will be banned by the
site admins.

It would also be a good idea to announce your bot as a bot in the
User-Agent string. Good bot behavior means reading the website roughly
the way a human would. I don't know, maybe 10 requests a minute? I don't
know this Wikipedia site's rules about it.

What you are experiencing could be automatic or manual throttling,
triggered when an abusive number of requests is detected from your IP.

Wikipedia does provide full dumps of its wikis, but they are probably
unusable for you, since they are gigantic :-/ -- trying to rebuild
Wikipedia on your PC from a snapshot would be like summoning Cthulhu in
a teapot. But... I don't know, maybe the zh version is smaller, or your
resources powerful enough. Still, what you have built carries severe
overhead (a waste of resources), and there must be better ways to do
it...
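
A minimal sketch of that polite-robot behavior, using only the Python
standard library (the robots rules below are made-up examples, not
zh.wikipedia.org's actual file):

```python
import time
from urllib.robotparser import RobotFileParser

# Made-up example rules; a real bot would fetch
# http://zh.wikipedia.org/robots.txt and parse that instead.
EXAMPLE_RULES = [
    "User-agent: *",
    "Allow: /w/api.php",
    "Disallow: /w/",
]

def make_checker(rules):
    """Build a robots.txt checker from a list of rule lines (no network)."""
    rp = RobotFileParser()
    rp.parse(rules)
    return rp

class Throttle:
    """Space requests at least min_interval seconds apart (~10/minute here)."""
    def __init__(self, min_interval=6.0):
        self.min_interval = min_interval
        self.last = 0.0

    def wait(self):
        delay = self.min_interval - (time.monotonic() - self.last)
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()

# A bot would call checker.can_fetch(agent, url) and throttle.wait()
# before each request, and send a self-identifying User-Agent such as
# "MyLocalWikiBot/0.1 (contact: you@example.org)" (placeholder name).
checker = make_checker(EXAMPLE_RULES)
throttle = Throttle()
```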



-- 
--
ℱin del ℳensaje.


Re: [Wikitech-l] (no subject)

2010-01-28 Thread Marco Schuster
On Thu, Jan 28, 2010 at 5:02 PM, Tei oscar.vi...@gmail.com wrote:

 On 28 January 2010 15:06, 李琴 q...@ica.stc.sh.cn wrote:
  Hi all,
   I have  built a LocalWiki.   Now I want the data of it to keep
 consistent
  with the
  Wikipedia and one work I should do is to get the data of update from
  Wikipedia.
  I get the URLs through analyzing the RSS
  (
 http://zh.wikipedia.org/w/index.php?title=Special:%E6%9C%80%E8%BF%91%E6%9B%B4%E6%94%B9feed=rss
 )
  and get all HTML content of the edit box by analyzing
  these URLs after opening an URL and clicking the ’edit this page’.
 
  That’s because I visit it too frequently and my IP address is prohibited
  or the network is too slow?

 李琴 well.. thats webscrapping, that is a poor tecnique, one with lots
 of errors that generate lots of trafic.

 One thing a robot must do is read and follow  the
 http://zh.wikipedia.org/robots.txt file ( probably you sould read it
 too)
 As a general rule of Internet, a  rude robot will be banned by the
 site admins.

 It would be a good idea to anounce your bot as a bot in the user_agent
 string .  Good bot beavior is one that read a website like a human.  I
 don't know,  like 10 request minute?.  I don't know about this
 Wikipedia site rules about it.

 What you are suffering could be  automatic or manual throttling, since
 is detected a abusive number of request from your IP.

 Wikipedia seems to provide fulldumps of his wiki, but are unusable
 for you, since are giganteous :-/, trying to rebuilt wikipedia on your
 PC with a snapshot would be like summoning Tchulu in a teapot. But.. I
 don't know, maybe the zh version is smaller, or your resources
 powerfull enough.  One feels that what you have built has a severe
 overload (wastage of resources) and there must be better ways to do
 it...

Indeed there are. What you need:
1) the Wikimedia IRC live feed - last time I've looked at it, it was at
irc://irc.wikimedia.org/ and then each project had its own channel.
2) A PHP IRC bot framework - Net_SmartIRC is well-written and easy to get
started with
3) the page source you can EASILY get either in rendered form
http://zh.wikipedia.org/w/index.php?title=TITLE&action=render or in raw
form http://zh.wikipedia.org/w/index.php?title=TITLE&action=raw (this is
the page source).

Marco

-- 
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de

Re: [Wikitech-l] (no subject)

2010-01-28 Thread Platonides
李琴 wrote:
 Hi all,
   I have  built a LocalWiki.   Now I want the data of it to keep consistent 
 with the 
 Wikipedia and one work I should do is to get the data of update from 
 Wikipedia.   
 I get the URLs through analyzing the RSS
 (http://zh.wikipedia.org/w/index.php?title=Special:%E6%9C%80%E8%BF%91%E6%9B%B4%E6%94%B9feed=rss)
  
 and get all HTML content of the edit box by analyzing 
 these URLs after opening an URL and clicking the ’edit this page’.  
 (eg: 
 http://zh.wikipedia.org/w/index.php?title=%E8%B2%A1%E7%A5%9E%E5%88%B0_(%E9%81%8A%E6%88%B2%E7%AF%80%E7%9B%AE)diff=12199398oldid=prev
  
 and its edit interface is 
 http://zh.wikipedia.org/w/index.php?title=%E8%B2%A1%E7%A5%9E%E5%88%B0_(%E9%81%8A%E6%88%B2%E7%AF%80%E7%9B%AE)action=edit
  
 .   However, I encounter two problems during my work.
 Firstly, sometimes I can’t open a URL which is from the RSS and I don’t 
 know why.   
 That’s because I visit it too frequently and my IP address is prohibited 
 or the network is too slow?
   If the reason is the former, how often can I visit a page of Wikipedia?   
 Is there a timeout?
 Secondly, just as mentioned before
 I want to download all HTML of the content in the edit box from Wikipedia, 
 however, 
 I can do sometimes but other times I just can download part of it, what’s 
 the reason?
 
 Thanks
 
 vanessa

Using the api or special:export you can request several pages per http
request, which is nicer to the system. You should also add a maxlag
parameter.
Obviously you must put a proper User-Agent, so that if your bot causes
issues you can be contacted/banned.

Wikimedia Foundation offers a live feed to keep the wikis up-to-date,
check http://meta.wikimedia.org/wiki/Wikimedia_update_feed_service
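
As a sketch of the batched API approach (the parameter names are
standard api.php ones; the bot name and contact address are
placeholders):

```python
from urllib.parse import urlencode

API = "https://zh.wikipedia.org/w/api.php"

def build_revision_query(titles, maxlag=5):
    """Build one api.php request that fetches the current wikitext of
    several pages at once, which is nicer to the system than one page
    per HTTP request."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": "|".join(titles),  # several titles per request
        "format": "json",
        "maxlag": maxlag,            # back off politely when servers lag
    }
    return API + "?" + urlencode(params)

# Placeholder identity so admins can contact (or ban) the bot if it
# causes issues:
HEADERS = {"User-Agent": "LocalWikiSyncBot/0.1 (contact: you@example.org)"}
```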



Re: [Wikitech-l] (no subject)

2010-01-23 Thread Petr Kadlec
On 23 January 2010 07:21, 李琴 q...@ica.stc.sh.cn wrote:
 I changed the $wgRCLinkLimits = array( 50, 100, 250, 500 );
 to $wgRCLinkLimits = array( 500, 1000, 5000, 1 ); and 'rclimit'   =
 1.

Just to be sure: do not _change_ the line in DefaultSettings.php!
Override the setting by adding the second quoted command into your
LocalSettings.php. Otherwise, the change will be overwritten every
time you update to a newer version of MediaWiki.
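
For reference, the override is just a couple of lines at the bottom of
LocalSettings.php. A minimal sketch (the array values and the
$wgDefaultUserOptions line are illustrative assumptions, not
recommendations):

```php
// LocalSettings.php is loaded after DefaultSettings.php, so settings
// here take precedence and survive MediaWiki upgrades.
$wgRCLinkLimits = array( 500, 1000, 5000, 10000 );  // illustrative values

// If the per-user default should change too (assumption based on the
// 'rclimit' line quoted earlier in the thread):
$wgDefaultUserOptions['rclimit'] = 500;
```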

-- [[cs:User:Mormegil | Petr Kadlec]]


[Wikitech-l] (no subject)

2010-01-22 Thread 李琴
hi all,
I built a local wiki, and I want to set the recentchange limit to 
500|1000|5000|1.
I changed the $wgRCLinkLimits = array( 50, 100, 250, 500 );
to $wgRCLinkLimits = array( 500, 1000, 5000, 1 ); and 'rclimit'   = 
1.

Is this right? Or is there something more to do?

Thanks

vanessa


Re: [Wikitech-l] (no subject)

2009-09-05 Thread Platonides
Mike.lifeguard wrote:
 Simple matter of coding, then? :-)
 
 This sort of thing would help with some of our external antispam tools.
 Currently we rely on parsing edits manually to see when links are added
 - some realtime and machine-readable format for notifications of such
 edits would be great.
 
 -Mike

File a bug?
This probably depends on bug 17450 and should block 16599.




[Wikitech-l] (no subject)

2009-07-03 Thread Thibaut DEVERAUX
What about a search engine which can find information in those mailing
lists? This is not a complete solution, but it is a way to let users
find information.



Original messages :

Ever met a developer who likes writing doc? :)

 and a lot of the docs have never been read by a developer. That being
 said, using FlaggedRevs we might be able to deliver more solid docs
 on MW.org by flagging docs at like two levels. One could be like a basic
 has been looked over for glaring errors and basic readability and
 a second could be has been thoroughly reviewed and is considered
 the doc on the given subject.

Perhaps we could start by getting developers to thoroughly review
documentation?

You're proposing a technical solution to a people problem. The problem
is not that the site can't display the fact that a developer vouches
for the quality of documentation. The problem is that there are no
processes for getting developers to review documentation and vouch for
it.


Re: [Wikitech-l] (no subject)

2009-07-03 Thread Thibaut DEVERAUX
Sorry for the double mail; can we imagine a search engine that can look
into:
1 - The mailing lists where users can ask questions
2 - The main forums
3 - The main sites with documentation (mediawiki.org and sites such as
semantic-mediawiki)
4 - ...

Google can do 2 and 3; I don't know about 1.

There are also approaches that would take more work than a simple
personalised Google search.

2009/7/3 Thibaut DEVERAUX thibaut.dever...@gmail.com

 What about a search motor wich can find information into thoose mailings ?
 This is not a solution but this is a way to make the users able to find
 information.



 Original messages :

 Ever met a developer who likes writing doc? :)

  and a lot of the docs have never been read by a developer. That being
  said, using FlaggedRevs we might be able to deliver more solid docs
  on MW.org by flagging docs at like two levels. One could be like a basic
  has been looked over for glaring errors and basic readability and
  a second could be has been thoroughly reviewed and is considered
  the doc on the given subject.

 Perhaps we could start by getting developers to thoroughly review
 documentation?

 You're proposing a technical solution to a people problem. The problem
 is not that the site can't display the fact that a developer vouches
 for the quality of documentation. The problem is that there are no
 processes for getting developers to review documentation and vouch for
 it.
