Re: [Wikisource-l] Do we have tools for offline collaboration?

2018-03-25 Thread Yann Forget
FYI, Zoé on the French Wikisource works offline, and then copy-pastes the
proofread text back to Wikisource.
Judging by the results, she has quite a good process: fast, with good quality.
You might want to ask her how she works:
https://fr.wikisource.org/wiki/Sp%C3%A9cial:Contributions/Zo%C3%A9

Regards,

Yann


2018-03-24 20:28 GMT+05:30 mathieu stumpf guntz <
psychosl...@culture-libre.org>:

> Hello,
>
> A person at a local Wikisource workshop asked me whether we could download all
> the material of a specific work to proofread it offline, i.e. download both the
> pictures and the OCRed text. Additionally, I think it would be good to provide
> a tool that at least shows the plain text and the pictures side by side.
>
> So, are you aware of anything close to such a tool? :)
>
> Cheers
>
> ___
> Wikisource-l mailing list
> Wikisource-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikisource-l
>
>


-- 
Jai Jagat 2020 Grand March Coordinator
https://www.jaijagat2020.org/
+91-62 60 140 319
+91-74 34 93 33 58


[Wikisource-l] 200-Year-Old Journal Reveals Rare American Sunspot Records

2017-12-08 Thread Yann Forget
https://amp.space.com/39021-200-year-old-journal-rare-sunspot-records.html

This would be a nice addition to Wikisource.

Yann


Re: [Wikisource-l] quickstatements for missing editions

2017-11-02 Thread Yann Forget
Hi,

Wikidata is a list of facts, so I don't see how any other license would be
appropriate.
This is similar to the copyright on patents and facts that some commercial
entities have tried to impose; I am obviously against that.
I am also against copyright restrictions on databases, such as have existed in
some countries, so I can't advocate putting one on Wikidata.
The objective, like that of the rest of Wikimedia, is to allow the widest
possible use of knowledge. CC0 for Wikidata fits this objective quite well.

Regards,

Yann

2017-11-01 14:39 GMT+01:00 mathieu stumpf guntz <
psychosl...@culture-libre.org>:

> Hey everyone,
>
> I am seizing the opportunity of this planned import to make you aware that I
> have started a research project on Wikiversity about Wikidata and its license:
>
> https://fr.wikiversity.org/wiki/Recherche:La_licence_CC-
> 0_de_Wikidata,_origine_du_choix,_enjeux,_et_prospections_sur_les_aspects_
> de_gouvernance_communautaire_et_d%E2%80%99%C3%A9quit%C3%A9_contributive
>
> Admittedly, a driving force behind the launch of this project is an
> intuitive aversion to CC0, and the willingness of the Wikidata team to launch
> their lexicographical solution with, without, or even against the Wiktionary
> communities. But as intuition is never as useful as when it feeds hypotheses
> for rational inquiry whose conclusions might reject it, I thought it preferable
> to set up such a research project so I could stand on a firmer understanding of
> the implications of this choice.
>
> Also, whatever one might feel ethically about this topic is one thing; the
> legal issues are a really different matter. So far, I haven't found any
> evidence of a serious inquiry into letting people mass-import data into
> Wikidata. But hopefully I'll soon be given links showing such an inquiry was
> indeed performed. Not requiring a source and evidence of a free license
> covering imported data is a great way to put not only the Wikidata project,
> but anyone who reuses its data, at risk of massive legal infraction.
>
> I welcome any source that you might judge valuable for the research project
> mentioned above: anything about how the license was chosen, the opinion of
> the community regarding this choice, the ins and outs of reusability, which
> notable partnerships were eased or prevented by the license, how external
> reusers do or do not give back in one form (a better curated or enlarged data
> set) or another (technical advice, institutional promotion, funds…), and
> anything else you think worth mentioning regarding the Wikidata license. It
> would be kind to check that it is not already in the list of sources I have
> gathered so far; see https://fr.wikiversity.org/
> wiki/Recherche:La_licence_CC-0_de_Wikidata,_origine_du_choix,_enjeux,_et_
> prospections_sur_les_aspects_de_gouvernance_communautaire_
> et_d%E2%80%99%C3%A9quit%C3%A9_contributive/Wikidata_:_les_
> origines_du_choix_de_CC-0#Notes_et_r.C3.A9ferences
>
> Also, let me know if you would be interested in a translation. So far I'm
> writing it in my native language to speed up the draft, and I don't
> necessarily expect huge interest in the subject beyond myself. But if
> people show interest, or even would like to contribute, I can switch to
> Esperanto, or even to the less likely demand of an English version. ;)
>
> Inquirely,
> psychoslave
>
> On 31/10/2017 at 16:14, Thomas Pellissier Tanon wrote:
>
> Hello Sam,
>
> Thank you for this nice feature!
>
> A few months ago I created a prototype Wikisource-to-Wikidata import tool for
> the French Wikisource, based on the schema.org annotations I added to the main
> header template (I definitely think we should move from our custom microformat
> to this schema.org markup, which could be much more structured). It's not yet
> ready, but I plan to move it forward in the coming weeks. The beginning of a
> frontend to add to your Wikidata common.js is here:
> https://www.wikidata.org/wiki/User:Tpt/ws2wd.js
> We should probably find a way to merge the two projects.
>
> Cheers,
>
> Thomas
>
>
> On 31 Oct 2017 at 15:10, Nicolas VIGNERON wrote:
>
> 2017-10-31 13:16 GMT+01:00 Jane Darnell:
> Sorry, I am much more of a Wikidatan than a Wikisourcerer! I was referring to
> items like this one: https://www.wikidata.org/wiki/Q21125368
>
> No need to be sorry, that is actually a good question and this example is
> even better (I totally forgot this kind of case).
>
> For now, it is probably better to deal with it by hand (and I'm not sure what
> this tool can even do in this case).
>
> Regards, ~nicolas

Re: [Wikisource-l] IA Upload queue

2017-09-12 Thread Yann Forget
Hi,

The upload limit on Commons is 4 GB, so there should not be any issue.
If the script can't upload files bigger than 100 MB, it should be fixed.
See how other tools work as an example (e.g. video2commons).

Regards,

Yann


2017-09-12 14:03 GMT+02:00 Sam Wilson :

> Yes, this happens when the resultant DjVu file is larger than Commons will
> allow. I think 100 MB is the limit?
>
> I'm not sure how to get around this. Perhaps we resize the images smaller?
> But we don't want to do that every time, so perhaps we have to generate the
> DjVu, see how big it is, and if it's too big resize and build it again?
> Would that work?
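The generate-measure-rebuild loop suggested above could be sketched roughly as follows. This is only a sketch: `make-djvu` is a hypothetical encoder wrapper (the real ia-upload tool may invoke entirely different binaries with different options), and the 100 MB figure is simply the limit discussed in this thread, not a verified Commons setting.

```python
import os
import subprocess

COMMONS_LIMIT = 100 * 1024 * 1024  # assumed 100 MB limit discussed in this thread

def over_limit(size_bytes, limit=COMMONS_LIMIT):
    """Decide whether a generated DjVu must be rebuilt at lower quality."""
    return size_bytes > limit

def encode_djvu(jp2_dir, out_path, scale_percent=100):
    """Hypothetical wrapper around a DjVu encoder; ia-upload's real
    pipeline is likely different in both tool and options."""
    subprocess.run(
        ["make-djvu", jp2_dir, out_path, "--scale", str(scale_percent)],
        check=True,
    )

def build_within_limit(jp2_dir, out_path):
    """Generate once at full quality; if the result is too big,
    rebuild once with downscaled images, as proposed in the thread."""
    encode_djvu(jp2_dir, out_path)
    if over_limit(os.path.getsize(out_path)):
        encode_djvu(jp2_dir, out_path, scale_percent=50)
    return os.path.getsize(out_path)
```

This avoids downscaling every book while still keeping oversize results under the cap after at most one rebuild.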
>
> We could make the over-size DjVu available for download, and then the user
> could use a different method to upload to Commons (is there such a method?).
>
> Suggestions welcome!
>
> A related issue is https://phabricator.wikimedia.org/T161396
> I can't find a ticket yet for the request-too-large problem, but I
> remember seeing one; anyway, I'll create it again, and perhaps Community
> Tech can look into it.
>
> There's also the slight possibility that IA can start creating DjVus
> again! Which would be brilliant, but I haven't heard anything about that
> since Wikimania.
>
> —Sam.
>
>
> On Tue, 12 Sep 2017, at 04:47 PM, Ilario Valdelli wrote:
>
> [2017-08-02 08:00:49] LOG.CRITICAL: Client error: `POST
> https://commons.wikimedia.org/w/api.php` resulted in a `413 Request Entity
> Too Large` response (truncated...)
>
>
>
> Either the program is not able to process huge files, or the disk space is
> simply exhausted.
>
>
>
> Kind regards
>
>
>
>
>
>
> *From: *Andrea Zanni 
> *Sent: *12 September 2017 10:37
> *To: *discussion list for Wikisource, the free library
> 
> *Subject: *[Wikisource-l] IA Upload queue
>
>
>
> Dear all,
>
> could someone help us understand whether we have an issue here?
> https://tools.wmflabs.org/ia-upload/commons/init
>
> Some librarians uploaded books months ago,
>
> but they were never processed.
>
> Is the tool working, or does it simply never signal when it fails?
>
>
>
>
> 
>
>
>
>
>


Re: [Wikisource-l] Proofread status colour buttons

2017-09-08 Thread Yann Forget
Hi,

It is fixed now.

Thanks to all,

Yann

2017-09-08 1:41 GMT+02:00 Sam Wilson :

> Seems like it's a widespread problem! :-(
>
> The bug is being tracked here: https://phabricator.wikimedia.org/T175304
>
>
> On Fri, 8 Sep 2017, at 07:10 AM, Bodhisattwa Mandal wrote:
>
> Hi,
>
> The proofread status colour buttons in normal edit mode have not been
> appearing in Wikisource projects for the last few hours.
>
> Any idea, what's happening?
>
>
>
>
>
>
>
>
>


[Wikisource-l] Scans from the Library of Geneva

2017-08-25 Thread Yann Forget
Hi, The first scans arrived:
https://commons.wikimedia.org/wiki/Category:Scans_by_the_Library_of_Geneva

The index pages are here:
https://fr.wikisource.org/wiki/Cat%C3%A9gorie:Scans_de_la_biblioth%C3%A8que_de_Gen%C3%A8ve

Reminder: this is a project supported by Wikimedia CH (Switzerland).

Regards,

Yann


Re: [Wikisource-l] March hangout

2017-03-24 Thread Yann Forget
Hi, Is there a log or a summary of the last meeting?

Regards,

Yann

2017-03-23 1:23 GMT+01:00 Sam Wilson :

> Sunday, 26 March 2017 at 12:00 (UTC) to 13:00 (UTC)
> https://meta.wikimedia.org/wiki/Wikisource_Community_
> User_Group/March_2017_Hangout
>
> Add things to the list of things-to-talk-about, if you want to talk
> about things. Or just come and talk about things.
>
> :-)
>
>


Re: [Wikisource-l] The conversion from PDF to DJVU loses too much quality

2017-01-25 Thread Yann Forget
2017-01-25 8:40 GMT+01:00 Sam Wilson :

>
> On Wed, 25 Jan 2017, at 03:27 PM, Andrea Zanni wrote:
>
> On Wed, Jan 25, 2017 at 1:45 AM, Sam Wilson  wrote:
>
>
> Yann, do you mean you're getting good quality DjVu generated from the PDF?
> Or from the original scan Jpegs?
>
> AFAIU, Yann is using ABBYY FineReader to generate a DjVu and then uploads
> it directly to Commons, i.e. outside of our ia-upload tool.
>
> Ah, okay. So if it could be done in the tool, that'd be nicer.
>
> Yes, it is a question of settings.

> Aubrey: when you say directly use the PDF, you mean for the tool to copy
> that across to Commons and not create a DjVu?
>
>
> Yes.
> If the DjVu quality is much lower than the PDF's, there's no reason to use
> the DjVu over the PDF :-(
>
DjVu has two advantages over PDF: better compression, so smaller files for
the same content, and better management of the text layer.
However, if the compression is too high, the quality is not good. It is a
question of a compromise between quality and size.

Yann


> Are we saying that we *never* want to use the IA PDF? That if there's a
> DjVu we use it, and if there isn't we generate our own DjVu from the JP2
> and djvu.xml files? Or should the tool user make this call and we give them
> a drop-down list of "PDF only", "Generate DjVu from PDF", and "Generate
> DjVu from original scans" with a note about the last of these being higher
> quality but slower?
>
> I think I'm in favour of just generating a high-quality DjVu and making it
> simpler for the end user. But we want to be flexible too. jayantanth
> mentioned that he'd like to be able to just upload the PDF, for example.
>
>
>
>
> I can have a look at adding that feature perhaps? (Anyone else working on
> this?)
>
>
> Please ;-)
>
>
> I can try!  :-)
>
> Aubrey
>
>
>
>
>


Re: [Wikisource-l] Indic Wikisource Update November 2016

2016-11-12 Thread Yann Forget
Hi,

2016-11-03 8:36 GMT+01:00 mathieu stumpf guntz <
psychosl...@culture-libre.org>:

> I guess that the "100 livres en 100 jours" (100 books in 100 days) challenge
> helps somewhat. The goal is to process a whole new book every day. No
> preparation work in advance is allowed. Missing the goal on a single day
> resets the counter.
>
> On 03/11/2016 at 01:46, Sam Wilson wrote:
>
> Yes, I agree! :-) There're so many smallish things that I reckon can go a
> long way towards making Wikisources bigger and better.
>
> And it keeps surprising me how many people within the Wikimedia movement
> aren't familiar with how Wikisource works — and are amazed when they're
> shown! :-) It really does seem that we're not very good at advertising
> ourselves. (Well, one doesn't like to blow one's own trumpet, does one?)
>
> Talking of stats, what is French Wikisource doing that's so successful at
> getting things proofread and validated?
> https://tools.wmflabs.org/phetools/graphs/Wikisource_-_
> proofread_pages_per_day.png
> https://tools.wmflabs.org/phetools/statistics.php?diff=30
>
> —sam
>
Yes, the 100 books in 100 days challenge helps a bit, but growth comes
mainly from Zoé, who is correcting all the volumes of the "Revue des Deux
Mondes", and from the partnership with the Bibliothèque et Archives nationales
du Québec (Quebec National Library and Archives), thanks to the leadership of
Ernest.
See https://fr.wikisource.org/wiki/Wikisource:BAnQ and
http://www.banq.qc.ca/activites/wiki/wiki-source.html

Regards,

Yann


Re: [Wikisource-l] Large djvu files from IA

2016-03-06 Thread Yann Forget
Hi,

2016-03-05 19:15 GMT+01:00 Bodhisattwa Mandal :
> Hi Nemo,
>
>> Looks like they scrapped the DjVu and "Text PDF" format, I don't know
>> whether intentionally. You should ask in their forum:
>> https://archive.org/iathreads/post-new.php?forum=texts (be kind!).
>
> Just saw this forum post in IA. It seems IA is going to stop creating djvu
> files soon.
> https://archive.org/post/1053214/djvu-files-for-new-uploads

As IA is probably the biggest source of scans for Wikisource, this is worrying.
I would expect an important change like this to be announced well in advance,
but it wasn't.
So we need to rewrite the BUB tool completely from scratch.
I would also suggest uploading as many DjVu files as possible while they are
still there.
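Finding the items that still carry a DjVu could be scripted against the Internet Archive's advancedsearch endpoint. The endpoint and `output=json` are real; the exact `format:DjVu` filter syntax and the predictable download URL are my assumptions about IA's index and layout, so treat this as a sketch.

```python
from urllib.parse import urlencode

IA_SEARCH = "https://archive.org/advancedsearch.php"

def djvu_search_url(collection, rows=50, page=1):
    """Build a query for items in a collection that still have a DjVu.
    The format:DjVu filter is an assumption about IA's search index."""
    params = {
        "q": f"collection:{collection} AND format:DjVu",
        "fl[]": "identifier",
        "rows": rows,
        "page": page,
        "output": "json",
    }
    return IA_SEARCH + "?" + urlencode(params)

def identifiers(response):
    """Extract item identifiers from a decoded advancedsearch response."""
    return [doc["identifier"] for doc in response["response"]["docs"]]

def djvu_url(identifier):
    """Assumed download location of an item's DjVu derivative."""
    return f"https://archive.org/download/{identifier}/{identifier}.djvu"
```

Paging through `djvu_search_url(...)` and fetching each `djvu_url(...)` would let a bot mirror the remaining DjVus before they disappear.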

> Regards,
>
> --
> Bodhisattwa

Thanks for this important info.

Yann



Re: [Wikisource-l] Large djvu files from IA

2016-02-26 Thread Yann Forget
Hi, Admins on Commons can upload files larger than 100 MB via upload-by-url.
There is currently a timeout issue, so the limit is not a fixed figure.
200 MB files work fine most of the time. I even managed once to upload a
300 MB file. If you need help, please post a list of files on my Commons
talk page.
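For files past the single-request limits, MediaWiki's chunked upload is another possible route. The sketch below follows the API's chunked-upload flow as I understand it (stash chunks, then publish by `filekey`); it assumes an already-authenticated `requests.Session` and a CSRF token obtained beforehand, and the 50 MB chunk size is an arbitrary choice, not a documented server limit.

```python
import os

API = "https://commons.wikimedia.org/w/api.php"
CHUNK_SIZE = 50 * 1024 * 1024  # arbitrary; keeps each POST well under 413 territory

def chunk_count(filesize, chunk_size=CHUNK_SIZE):
    """Number of POSTs needed to stash a file of the given size."""
    return -(-filesize // chunk_size)  # ceiling division

def upload_chunked(session, path, filename, token):
    """Stash a large file chunk by chunk, then commit it under `filename`.
    `session` must be a logged-in requests.Session; `token` a CSRF token."""
    import requests  # imported here so the pure helper above has no dependency

    filesize = os.path.getsize(path)
    filekey = None
    with open(path, "rb") as f:
        offset = 0
        while offset < filesize:
            data = {"action": "upload", "format": "json", "stash": 1,
                    "filename": filename, "filesize": filesize,
                    "offset": offset, "token": token}
            if filekey:
                data["filekey"] = filekey
            chunk = f.read(CHUNK_SIZE)
            reply = session.post(API, data=data,
                                 files={"chunk": ("chunk.bin", chunk)}).json()
            filekey = reply["upload"]["filekey"]
            offset += len(chunk)
    # Final request publishes the stashed file out of the upload stash.
    return session.post(API, data={"action": "upload", "format": "json",
                                   "filename": filename, "filekey": filekey,
                                   "ignorewarnings": 1, "token": token}).json()
```

In production you would add error handling for warnings and aborted stashes, but the stash-then-publish shape is the part that sidesteps the request-size ceiling.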

Regards,

Yann
https://commons.wikimedia.org/wiki/User_talk:Yann

2016-02-26 20:38 GMT+01:00 Bodhisattwa Mandal :
> Hi,
>
> Is it possible to transfer large DjVu files (more than 100 MB) from the
> Internet Archive to Commons? I have tried the IA-upload and url2commons
> tools but did not succeed.
>
> Regards,
> --
> Bodhisattwa
>
>
>



Re: [Wikisource-l] BUB is back

2016-01-08 Thread Yann Forget
Thanks a lot to the people who fixed it!
That's great!

Yann

2016-01-07 22:13 GMT+01:00 Federico Leva (Nemo) :
> FYI Rohit is being as fantastic as usual again.
> * https://archive.org/search.php?query=uploader%3Atools.bub=-publicdate
> * https://github.com/rohit-dua/BUB/pulse
>
> Nemo
>



[Wikisource-l] Fwd: [Wikimediaindia-l] 20 volumes (8376 pages) of Tamil Encylopedia released under Creative Commons

2014-08-30 Thread Yann Forget
FYI. Yann

-- Forwarded message --
From: Ravishankar ravidre...@gmail.com
Date: 2014-08-30 20:10 GMT+05:30
Subject: [Wikimediaindia-l] 20 volumes (8376 pages) of Tamil
Encylopedia released under Creative Commons
To: Wikimedia India Community list wikimediaindi...@lists.wikimedia.org


Hi,

The Tamil Development Board (an autonomous institution under the Government
of Tamil Nadu) has released its Encyclopedia (10 volumes, 7407 pages) and
Children's Encyclopedia (10 volumes, 969 pages) under a Creative Commons
license. Tamil Wikipedians led by Prof. C. R. Selvakumar and Prof. P. R.
Nakkeeran (Director, Tamil Virtual Academy) spearheaded this initiative,
coinciding with Tamil Wikipedia's 10-year celebrations.

An official confirmation (in Tamil) can be seen at

https://upload.wikimedia.org/wikipedia/commons/4/46/Letter_from_Tamil_Development_Board_donating_20_volumes_of_encyclopedia_in_Tamil_under_Creative_Commons_license.jpeg

Scanned copies of these works are already available at

http://tamilvu.org/library/kulandaikal/lku00/html/lku00ind.htm

At Tamil Wikipedia, we are discussing how we can get this content typed and
transferred to Wikisource. Doing so could be a good model to encourage more
such works to be released into the public domain.

Following are two options I can think of:

1. Volunteers type all the content. Besides taking years to complete, this
won't do justice to the time of volunteers, who can do more valuable work
than typing mechanically.

A program like IT@School in Kerala, or a contest, could encourage more
people to join this effort, but not all communities can emulate this model
successfully.

2. Request WMF to give a grant to the owner of the content and let them hand
over the typed content to Wikisource volunteers, who will upload and wikify
it.

This will maintain the spirit of volunteerism while getting the work done in
a professional and time-bound manner.

Numerous works in Wikisource are such ready-made content, already published
on the web through other projects like Project Gutenberg.

If providing grants to non-Wikimedia organizations is an issue, a grant
towards this can be given to a community/chapter, which will then outsource
the typing work.

I welcome the community's input on any other model for this, as India has a
vast amount of literature, and works like this are waiting to be transferred
to Wikisource. This is one area where we can add a lot of content to Wiki
projects at once.

Ravi


___
Wikimediaindia-l mailing list
wikimediaindi...@lists.wikimedia.org
To unsubscribe from the list / change mailing preferences visit
https://lists.wikimedia.org/mailman/listinfo/wikimediaindia-l



Re: [Wikisource-l] OCR for Persian

2014-06-27 Thread Yann Forget
Hi,

I have ABBYY FineReader 11 Professional Edition, and Persian/Farsi is not
among the supported languages. :(

Yann

2014-06-24 19:07 GMT+05:30 Amir Ladsgroup ladsgr...@gmail.com:
 Hello,
 I have access to huge resources of old books in Persian (some of them are
 even typed), and almost all of them can be imported to Wikisource, but the
 problem is I don't have (or know of) any OCR software for Persian. Do you
 know which OCR software supports Persian texts (supporting Arabic is not
 enough; I checked several programs)?


 Best

 --
 Amir



Re: [Wikisource-l] A new book collection at Internet Archive

2014-03-28 Thread Yann Forget
Hi,

Very nice:
https://commons.wikimedia.org/wiki/Category:Le_Nouveau_Th%C3%A9%C3%A2tre_italien

Regards,

Yann


2014-03-28 19:40 GMT+05:30 Alex Brollo alex.bro...@gmail.com:

 Our itsource tool is quickly uploading many books to the Internet Archive;
 they are collected in the new collection opallibriantichi
 (https://archive.org/collection/opallibriantichi).
 Scans come from Opal Libri Antichi
 (http://www.opal.unito.it/psixsite/default.aspx),
 the library of the University of Turin. Most of the first uploads (more than
 600 by now) come from the Opal theater collection
 (http://www.opal.unito.it/psixsite/Teatro%20italiano%20del%20XVI%20e%20XVII%20secolo/default.aspx?xbox=1100allbooks=true)
 in Italian; some are in French (e.g. Le Nouveau Théâtre Italien, vol. I-VIII).

 New uploads will be added quickly; take a look sometimes if you are
 interested in ancient books and editions. Their metadata are customized to
 be grabbed effectively by Tpt's Internet Archive to Commons uploader.

 Alex brollo






Re: [Wikisource-l] temporary patch for too high nsPage textarea

2013-12-07 Thread Yann Forget
Hi Alex,

Where do you use that?

Regards,

Yann


2013/12/7 Alex Brollo alex.bro...@gmail.com

 This is the simple script that I'm using to reduce the edit textarea in
 nsPage to a comfortable size; it also sniffs layout toggling:

 function resizeBox() {
     if ((wgCanonicalNamespace == "Page")
             && (wgAction == "edit" || wgAction == "submit")
             && $(".wikiEditor-ui-left").css("width") == $("#wpTextbox1").css("width")) {
         $("#wpTextbox1").attr("rows", 10);
     } else {
         $("#wpTextbox1").attr("rows", 31);
     }
 }
 $(document).ready(function () {
     // re-run the resize whenever the layout is toggled
     $("img[rel='toggle-layout']").on("click", resizeBox);
     resizeBox();
 });

 Rough, but running. :-)

 Alex






Re: [Wikisource-l] Huge issues in namespace Page

2013-12-05 Thread Yann Forget
2013/12/5 Andrea Zanni zanni.andre...@gmail.com

 Thanks guys.
 Was this the reason why the French Wikisource lost over 6000 pages, from 75
 to 25, yesterday?
 (Today it seems corrected.)

I think it is due to a bug in the new version of the ProofreadPage
extension.

 Next time, it would be better to have better documentation and
 communication for the deployment: we can use this list for that purpose,
 and individual users can post about it in the local sitenotice.


Agreed.


 Aubrey


Regards,

Yann


[Wikisource-l] Universal Library

2009-09-05 Thread Yann Forget
It concerns mainly WS. Yann

 Original Message 
Subject: Re: [Foundation-l] Universal Library
Date: Sat, 5 Sep 2009 02:29:28 -0400
From: David Goodman
To: Yann Forget

More accurately, the number of items there with adequate and complete
information comes close to zero -- or perhaps is actually zero. One
example for now, the first I looked at:

http://fr.wikisource.org/wiki/Vie_d’Alexandre_le_Grand
Plutarch, translated by Ricard, 1840.
There were many editions of his translation. The 1840 is not the first
edition, which was 1798-1803, but presumably the 1837-1841 one published by
F.A. Dubois; volume and pages not specified.
It makes a difference whether a translation was done in the 18th century or
the 19th.

http://en.wikisource.org/wiki/Lives_(Dryden_translation)/Alexander
Lives by Plutarch , translated by John Dryden
-- but it doesn't specify the edition at all.

The French comes out ahead, but not by much.
Neither specified just what copy was used, or even what printing, a basic
necessity for checking the transcription.
IA and Google Book Search do. Not in the metadata, unfortunately, but
they do show it in the scan. They also scan multiple copies from multiple
libraries, a basic necessity for scholarly work. No responsible academic
would prepare a text from a single copy.

As for translation links, the enWS links to the frWS, and the frWS links
to the enWS, but incorrectly. They both link to the Greek, which gives no
indication at all of which edition is being followed. It is very unlikely
to be the one used by Ricard or the one used by Dryden.

If you want to know, I do not work for the enWS because the accepted
standards are so low I have no hope of fixing it; for the enWP I can
at least have some effect.

As for the frWP sourcing, I checked 20 articles. Half were entirely
unsourced, or sourced only to primary sources from the subject of the
article. The frWP does an excellent job of sourcing to primary documentary
sources -- much better than the enWP. Neither does all that well otherwise,
except for scattered articles worked on by good people. The deWP is the one
that comes closest to adequacy.

David Goodman, Ph.D, M.L.S.
http://en.wikipedia.org/wiki/User_talk:DGG

On Thu, Sep 3, 2009 at 5:19 PM, Yann Forgety...@forget-me.net wrote:
 David Goodman wrote:
 ...
 information.) The accuracy and adequacy -- let alone completeness -- of
 the bibliographic information in WS is close to zero,

 This alone shows that you know very little about this project, where I have
 never seen you. You claim to be an expert, but you talk about things
 which you don't know. So I won't pursue this discussion; it is quite
 useless in this context.

 ...
 David Goodman, Ph.D, M.L.S.
 http://en.wikipedia.org/wiki/User_talk:DGG

 Regards,

 Yann

-- 
http://www.non-violence.org/ | Collaborative site on non-violence
http://www.forget-me.net/ | Alternatives on the Net
http://fr.wikisource.org/ | Free library
http://wikilivres.info | Free documents



Re: [Wikisource-l] [Foundation-l] Universal Library

2009-09-03 Thread Yann Forget
David Goodman wrote:
 I have read your proposal. I continue to be of the opinion that we are
 not competent to do this. Since the proposal says that this project
 requires as much database-management knowledge as librarian
 knowledge, it confirms my opinion. You will never merge the data
 properly if you do not understand it.

That's exactly the point: it needs to be a joint project, database gurus
with librarians. What I see is that OpenLibrary lacks some basic
features that Wikimedia projects have had for a long time (at Internet
scale): easy redirects, interwikis, merging, a deletion process, etc.
Some of these are planned for the next version of their software, but I
still feel that sometimes they try to reinvent the wheel we already have.

OL claims to have 23 million book and author entries. However, many
entries are duplicates of the same edition, not to mention the same
book, so the real number of unique entries is much lower. I also see
that Wikisource has data which is not included in their database (and
certainly Wikipedia too, but I didn't really check).

 You suggest 3 practical steps
 1. an extension for finding a book in OL is certainly doable--and it
 has been done, see
 [http://en.wikipedia.org/wiki/Wikipedia:Book_sources].
 2. an OL  field,  link to WP -- as you say, this is already present.
 3. An OL field, link to Wikisource. A very good project. It will be
 they who need to do it.

Yes, but I think we should go further than that. OpenLibrary has an API
which would allow any relevant wiki article to be dynamically linked to
their data, or an entry to be created every time new relevant data is
added to a Wikimedia project. This is all about avoiding duplicate work
between Wikimedia and OpenLibrary. It could also increase accuracy by
double-checking facts (dates, name and title spellings, etc.) between our
projects.

 Agreed we need translation information -- I think this is a very
 important priority. It's not that hard to do a list or to add links
 that will be helpful, though not exact enough to be relied on in
 further work. That's probably a reasonable project, but it is very
 far from a database of all books ever published
 
 But some of this is being done--see the frWP page for Moby Dick:
 http://fr.wikipedia.org/wiki/Moby_Dick
 (though it omits a number of the translations listed in the French Union
 Catalog, 
 http://corail.sudoc.abes.fr/xslt/DB=2.1/CMD?ACT=SRCHAIKT=8063SRT=RLVTRM=Moby+Dick]
 I would however not warrant without seeing the items in hand, or
 reading an authoritative review, that they are all complete
 translations.
 The English page on the novel lists no translations; perhaps we could
 in practice assume that the interwiki links are sufficient. Perhaps
 that could be assumed in Wikisource also?

That's another possible benefit: automatic lists of
works/editions/translations in a Wikipedia article.

You could add {{OpenLibrary|author=Jules Verne|lang=English}} and get a
list of English translations of Jules Verne's works directly imported
from their database. The problem is that, right now, Wikimedia projects
often have more accurate and more detailed information than OpenLibrary.
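As a rough illustration of what such a template could pull from the Open Library search API: the `search.json` endpoint and its `author` parameter are real, but whether a `language` parameter filters this way, and whether the endpoint suits this template use at all, are my assumptions.

```python
from urllib.parse import urlencode

def ol_search_url(author, language=None):
    """Build an Open Library search query. `author` is a documented
    parameter; the `language` filter shown here is an assumption."""
    params = {"author": author}
    if language:
        params["language"] = language  # ISO 639 code, e.g. "eng"
    return "https://openlibrary.org/search.json?" + urlencode(params)

def titles(search_result):
    """Pull work titles out of a decoded search.json response."""
    return [doc["title"] for doc in search_result.get("docs", [])]

# A {{OpenLibrary|author=...}} expansion could then fetch
# ol_search_url("Jules Verne", "eng"), decode the JSON, and render
# titles(...) as a wiki list.
```

The interesting design question is the one raised above: whether the wiki pulls from OL live like this, or OL pulls from the wikis, given that the wiki side often has the better data.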

 David Goodman, Ph.D, M.L.S.
 http://en.wikipedia.org/wiki/User_talk:DGG

Regards,

Yann



Re: [Wikisource-l] [Foundation-l] Universal Library

2009-09-02 Thread Yann Forget
Hello, I have already answered some of these arguments earlier.

David Goodman wrote:
 Not only can OpenLibrary do it perfectly well without us;
 considering our rather inconsistent standards, they can probably do it
 better without us. We will just get in the way.

The issue is not whether OpenLibrary is doing it perfectly well without us,
even if that were true. Currently, what OpenLibrary does is not very
useful for Wikimedia, and partly duplicates what we do. Wikimedia also
has important assets which OL doesn't have, and therefore a
collaboration seems obviously beneficial for both.

 There is sufficient missing material in every Wikipedia, sufficient
 lack of coverage of areas outside the primary language zone and in
 earlier periods, sufficient unsourced material; sufficient need for
 updating articles, sufficient potentially free media to add,
 sufficient needed imagery to get; that we have more than enough work
 for all the volunteers we are likely to get.
 
 To duplicate an existing project is particularly unproductive when the
 other project is doing it better than we are ever going to be able to.
 Yes, there are people here who could do it or learn to do it -- but I
 think everyone here with that degree of bibliographic knowledge would
 be much better occupied in sourcing articles.

It is clear that you didn't even read my proposal.
Please do so before raising objections.
http://strategy.wikimedia.org/wiki/Proposal:Building_a_database_of_all_books_ever_published

I specifically wrote that my proposal does not necessarily mean starting
a new project. I agree that working with Open Library is necessary for
such a project, but I also say that if Wikimedia gets involved, it would
be much more successful.

What you say here is completely the opposite of how Wikimedia projects
work, i.e. with openness, and that's just what is missing in Open Library.

 David Goodman, Ph.D, M.L.S.

Regards,
Yann


[Wikisource-l] Universal Library

2009-09-01 Thread Yann Forget
Hello,

I started a proposal on the Strategy Wiki:
http://strategy.wikimedia.org/wiki/Proposal:Building_a_database_of_all_books_ever_published

IMO this should be a joint project between OpenLibrary and Wikimedia.
Both have an interest in and the capacity to work on this.

Regards,

Yann


Re: [Wikisource-l] Open Library, Wikisource, and cleaning and translating OCR of Classics

2009-08-21 Thread Yann Forget
Lars Aronsson wrote:
 Yann Forget wrote:
 
 As I already said, the first steps would be to import existing 
 databases, and Wikimedians are very good at this job.
 
 Do you have a bibliographic database (library catalog) of French 
 literature that you can upload?  How many records?  Convincing 
 libraries to donate copies of their catalogs has been a bottleneck 
 for OpenLibrary.

No, I don't have such a database. There is a copyright on databases in
Europe, which makes things complicated.

Probably we need to start with libraries which are already collaborating
with open content projects. There was a GLAM-wiki meeting in Australia
recently: there might be a possibility with an Australian library?

But even before that, if we could extract the data from Wikimedia
projects, we could create a basic working framework. I have been
collecting such data on Wikisource and Wikibooks, but the lack of a
structured system is a bottleneck.

Examples:
1. Comprehensive bibliography of Gandhi in French
http://fr.wikibooks.org/wiki/Bibliographie_de_Gandhi

2. French translations of Russian authors:
http://fr.wikisource.org/wiki/Discussion_Auteur:L%C3%A9on_Tolsto%C3%AF
http://fr.wikisource.org/wiki/Discussion_Auteur:F%C3%A9dor_Mikha%C3%AFlovitch_Dosto%C3%AFevski

Regards,

Yann


Re: [Wikisource-l] [Foundation-l] Open Library, Wikisource, and cleaning and translating OCR of Classics

2009-08-21 Thread Yann Forget
Joshua Gay wrote:
 David Strauss did a quick implementation (basically a demo) of an
 OpenLibrary extension for MediaWiki. In a very small amount of code, he was
 able to easily search the OL (via AJAX), and when the user selected a given
 result, it populated a Citation template. What was nice is that when no
 results came up for a given search, there was an add to open library
 button that brought you to the OL site to add your bibliographic
 information.
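
The mapping step described above can be sketched as follows. This is an illustrative reconstruction, not David Strauss's actual code: the Open Library field names and the Citation template parameters are assumptions.

```python
# Sketch: map a (simplified) Open Library record onto MediaWiki {{Citation}}
# wikitext, the way such an AJAX extension would populate the template.
def ol_record_to_citation(record: dict) -> str:
    """Turn a simplified Open Library record into {{Citation}} wikitext."""
    mapping = {                 # OL field -> Citation parameter (assumed names)
        "title": "title",
        "authors": "author",
        "publish_date": "date",
        "publishers": "publisher",
    }
    parts = []
    for ol_field, param in mapping.items():
        value = record.get(ol_field)
        if isinstance(value, list):        # OL stores some fields as lists
            value = "; ".join(value)
        if value:
            parts.append(f"{param}={value}")
    return "{{Citation |" + " |".join(parts) + "}}"

record = {
    "title": "Moby Dick",
    "authors": ["Herman Melville"],
    "publish_date": "1851",
    "publishers": ["Harper & Brothers"],
}
print(ol_record_to_citation(record))
# → {{Citation |title=Moby Dick |author=Herman Melville |date=1851 |publisher=Harper & Brothers}}
```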

Interesting, I didn't know that. Is this demo available somewhere?

Yann



Re: [Wikisource-l] Open Library, Wikisource, and cleaning and translating OCR of Classics

2009-08-18 Thread Yann Forget
Hello,

Lars Aronsson wrote:
 Yann Forget wrote:
 
 This discussion is very interesting. I would like to make a summary, so
 that we can go further.

 1. A database of all books ever published is one of the things
 still missing.
 
 No, no, no, this is *not* missing. This is exactly the scope of 
 OpenLibrary. Just as Wikipedia is not yet a complete encyclopedia, 
 or OpenStreetMap is not yet a complete map of the world, some 
 books are still missing from OpenLibrary's database, but it is a 
 project aiming to compile a database of every book ever published.

At least Wikipedia can say that it has the most complete encyclopedia,
and OpenStreetMap the most complete free maps that ever existed. AFAIK
OpenLibrary is very far from having anything comprehensive, though I am
curious to see the figures. As I already said, the first steps would be
to import existing databases, and Wikimedians are very good at this job.

 Personally I don't find OL very practical. Maybe I am too
 used to MediaWiki. ;oD
 
 And therefore, you would not try to improve OpenLibrary, but 
 rather start an entirely new project based on MediaWiki?  I'm 
 afraid that this (not invented here) is a common sentiment, and 
 a major reason that we will get nowhere.

You are wrong here. I was delighted to see a project like OL, and I
inserted a few books and authors, but I have not been convinced. On
books and authors, Wikimedia projects already have much more data than
OL, and a lot of basic functionality is not available: tagging two
entries as identical (redirects), multilingualism, links between related
entries (interwikis), etc.

I don't really care who would host this Universal Library, as long as
it is freely available with a powerful search engine and no restriction
on reuse. What I say is that MediaWiki is really much better than
anything else for any massive online cooperative work. The most
important point for such a project is building a community. OpenLibrary
has certainly done a good job, but I don't see _a community_. The tools
and the social environment available on Wikimedia projects are missing.
I believe the social environment is a consequence of both the software
and the leadership. Once the community exists, it may be self-sustaining
if other conditions are met. OL lacks software as good as MediaWiki and
a leader like Jimbo.

Yann


[Wikisource-l] Open Library, Wikisource, and cleaning and translating OCR of Classics

2009-08-12 Thread Yann Forget
Hello,

This discussion is very interesting. I would like to make a summary, so
that we can go further.

1. A database of all books ever published is one of the things still missing.
2. This needs massive collaboration by thousands of volunteers, so a
wiki might be appropriate; however...
3. The data needs a structured web site, not a plain wiki like MediaWiki.
4. A big part of this data is already available, but scattered across
various databases, in various languages, with various protocols, etc. So
a big part of the work needs as much database-management knowledge as
librarian knowledge.
5. What is most missing in these existing databases (IMO) is information
about translations: nowhere is there a general database of translated
works, at least not in English or French. It is very difficult to find
out whether a translation exists for a given work. Wikisource has some
of this information, with interwiki links between work and author pages,
but only for a (very) small number of works and authors.
6. It would be best not to duplicate work in several places.
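
A minimal sketch of the structure point 5 asks for: linking a work to its translations so that "does a French translation of X exist?" becomes a direct lookup. The schema is an illustrative assumption, not an existing Wikisource or OpenLibrary data model.

```python
# Sketch: a work record carrying its translations keyed by language code,
# so translation existence is a dictionary lookup instead of a search.
from dataclasses import dataclass, field

@dataclass
class Work:
    title: str                    # title in the original language
    language: str                 # original language code
    translations: dict = field(default_factory=dict)  # language -> translated title

    def add_translation(self, language: str, title: str) -> None:
        self.translations[language] = title

    def has_translation(self, language: str) -> bool:
        return language in self.translations

moby = Work("Moby-Dick", "en")
moby.add_translation("fr", "Moby Dick ou le Cachalot")
print(moby.has_translation("fr"), moby.has_translation("sv"))
# → True False
```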

Personally I don't find OL very practical. Maybe I am too used to
MediaWiki. ;oD

We still need to create something attractive to contributors and
readers alike.

Yann

Samuel Klein wrote:
 This thread started out with a discussion of why it is so hard to
 start new projects within the Wikimedia Foundation.  My stance is
 that projects like OpenStreetMap.org and OpenLibrary.org are doing
 fine as they are, and there is no need to duplicate their effort
 within the WMF.  The example you gave was this:
 
 I agree that there's no point in duplicating existing functionality.
 The best solution is probably for OL to include this explicitly in
 their scope and add the necessary functionality.   I suggested this on
 the OL mailing list in March.
http://mail.archive.org/pipermail/ol-discuss/2009-March/000391.html
 
 *A wiki for book metadata, with an entry for every published
 work, statistics about its use and siblings, and discussion
 about its usefulness as a citation (a collaboration with
 OpenLibrary, merging WikiCite ideas)
 To me, that sounds exactly like what OpenLibrary already does (or
 could be doing in the near future), so why even set up a new project
 that would collaborate with it?  Later you added:
 
 However, this is not what OL or its wiki do now.  And OL is not run by
 its community, the community helps support the work of a centrally
 directed group.  So there is only so much I feel I can contribute to
 the project by making suggestions.  The wiki built into the fiber of
 OL is intentionally not used for general discussion.
 
 I was talking about the metadata for all books ever published,
 including the Swedish translations of Mark Twain's works, which
 are part of Mark Twain's bibliography, of the translator's
 bibliography, of American literature, and of Swedish language
 literature.  In OpenLibrary all of these are contained in one
 project.  In Wikisource, they are split in one section for English
 and another section for Swedish.  That division makes sense for
 the contents of the book, but not for the book metadata.
 
 This is a problem that Wikisource needs to address, regardless of
 where the OpenLibrary metadata goes.  It is similar to the Wiktionary
 problem of wanting some content - the array of translations of a
 single definition - to exist in one place and be transcluded in each
 language.
 
 Now you write:

 However, the project I have in mind for OCR cleaning and
 translation needs to
 That is a change of subject. That sounds just like what Wikisource
 (or PGDP.net) is about.  OCR cleaning is one thing, but it is an
 entirely different thing to set up a wiki for book metadata, with
 an entry for every published work.  So which of these two project
 ideas are we talking about?
 
 They are closely related.
 
 There needs to be a global authority file for works -- a [set of]
 universal identifier[s] for a given work in order for wikisource (as
 it currently stands) to link the German translation of the English
 transcription of OCR of the 1998 photos of the 1572 Rotterdam Codex...
 to its metadata entry [or entries].
 
 I would prefer for this authority file to be wiki-like, as the
 Wikipedia authority file is, so that it supports renames, merges, and
 splits with version history and minimal overhead; hence I wish to see
 a wiki for this sort of metadata.
 
 Currently OL does not quite provide this authority file, but it could.
  I do not know how easily.
 
 Every book ever published means more than 10 million records.
 (It probably means more than 100 million records.) OCR cleaning
 attracts hundreds or a few thousand volunteers, which is
 sufficient to take on thousands of books, but not millions.
 
 Focusing efforts on notable works with verifiable OCR, and using the
 sorts of helper tools that Greg's paper describes, I do not doubt that
 we could effectively clean and publish OCR for all primary sources
 that are actively used and 

[Wikisource-l] Info/Law blog: Using Wikisource as an Alternative Open Access Repository for Legal Scholarship

2009-06-20 Thread Yann Forget
Hello, This concerns Wikisource. Regards, Yann

 Original Message 
Subject: [Foundation-l] Info/Law blog: Using Wikisource as an
Alternative Open Access Repository for Legal Scholarship
Date: Sat, 20 Jun 2009 16:41:45 +0100
From: David Gerard

http://blogs.law.harvard.edu/infolaw/2009/06/19/using-wikisource-as-an-alternative-open-access-repository-for-legal-scholarship/

Interesting. How well does this fit with what Wikisource does?


- d.



Re: [Wikisource-l] Fwd: [gutvol-d] need a book scanned?

2009-03-05 Thread Yann Forget

 this has been available for a few months now,
 but hasn't gotten the attention that it deserves...
 
 in conjunction with the boston public library,
 the open library -- or is it the internet archive?,
 or is it the open content alliance?, i can never tell
 -- is now offering a scan on demand service...
 
 if a public-domain book in the boston public library
 hasn't yet been scanned, you can bump its priority
 up to the top of the list by requesting it be scanned.
 
 within 3-5 days, it'll be scanned and placed online,
 with all the various formats, which includes the o.c.r.

This is a nice feature, but 3-5 days is a bit optimistic.
I ordered a few books and the delay was more on the order of 2 weeks.

 what a wonderful present, eh?
 
 check it out (so to speak):
http://openlibrary.org/bpl
 
 -bowerbird

Regards,

Yann


Re: [Wikisource-l] help needed searching for pagescans and front covers

2008-08-15 Thread Yann Forget
Lars Aronsson wrote:
 Birgitte SB wrote:
 
 Then there are things like creating translation and adding value 
 to text by wikilinks.
 
 In my opinion, translations (performed by wiki volunteers) should 
 belong in Wikibooks and not in Wikisource, exactly because they 
 are not (pre-existing, external) sources but creative efforts.
 
 Copyright legislation recognizes translators just like authors, so 
 the copyright to a wiki-translation belongs to the translators, 
 who can license their work.  (I'm assuming that the original 
 authors are long dead and no longer can make such claims.) Whereas 
 most books on Wikisource are in the public domain, where none of 
 the wiki volunteers can claim copyright and thus cannot add any 
 free license.

I think that translations belong in Wikisource much better than in any
other Wikimedia project, for the reasons also mentioned by others
(copyright, the DoubleWiki extension, etc.). We already have translations
by Wikisource contributors, both of old [1] and recent texts [2].

There is no such thing as the perfect translation. That's one of the
reasons why commercial publishers translate old texts again and again
instead of reprinting old translations. So if we accept the old ones, I
don't see why we could not accept new translations.

Regards,

Yann

[1] http://en.wikisource.org/wiki/J%27accuse (Zola)
[2] http://fr.wikisource.org/wiki/Libre_comme_Liberté (on Stallman)


[Wikisource-l] License of the translations

2008-07-02 Thread Yann Forget
For information. Yann

 Original Message 
Subject: RE: License of the translations
Date: Thu, 19 Jun 2008 17:04:23 -0400
From: Hawkins, Kevin
To: Yann Forget

Yann,

I've updated the terms of use to correspond to our current contributor 
agreements:

http://quod.lib.umich.edu/d/did/call.html

We have begun discussions about CC licensing; however, the project 
directors are not keen on licensing the content of this project since 
the collection of translations is growing, and corrections are 
constantly made to texts.  So they would prefer that people keep coming 
back to the website.

Because the texts are not static, it really seems to me not to be a good 
fit for Wikisource.

Kevin

 -Original Message-
 From: Hawkins, Kevin 
 Sent: Friday, May 30, 2008 3:38 PM
 To: 'Yann Forget'
 Subject: RE: License of the translations
 
 Hello Yann,
 
 Thank you for your interest in the Encyclopedia of Diderot and 
 d'Alembert: Collaborative Translation Project.  It's good to 
 hear that Wikisource has taken an interest in the resource, 
 which is published by the office where I work.
 
 Our publishing operation was founded at about the same time 
 as Creative Commons, so we have continued operating as in the 
 pre-CC era since then, allowing almost all of our content 
 creators to keep their copyright rather than forcing them to 
 give it to us.
 this requires us to go back to the content creators every 
 time someone wants to make a new use of their work.  We have 
 long meant to revisit all of our publishing agreements to 
 approach everyone about some sort of CC licensing; in fact, 
 we've begun inventorying these agreements to determine 
 whether we can license any of these already.
 
 The Encyclopedia of Diderot and d'Alembert: Collaborative 
 Translation Project is, I believe, our only publication that 
 presents terms of use on the website rather than simply 
 saying to contact [EMAIL PROTECTED] for more information. 
 However, Wikisource's user Eclecticology is 
 right to point out that the text is contradictory.  (It was 
 drafted before we had access to any copyright specialists.)  
 I will work with the project directors to revise this 
 language to make it clearer.
 
 As for providing CC-BY-SA licensing, we will need to revisit 
 agreements with past translators.  As above, we were already 
 considering doing something like this, but I can't yet say 
 how soon we could accomplish this.  Once it does happen, we 
 would of course make this clear on the website.
 
 Please let me know if you have any questions.
 
 Kevin Hawkins
 Scholarly Publishing Office
 University Library
 University of Michigan
 
  -Original Message-
  From: Yann Forget [mailto:[EMAIL PROTECTED] 
  Sent: Friday, May 30, 2008 6:39 AM
  To: diderot-info @ umich . edu
  Subject: License of the translations
  
  Hello,
  
  I would like some information about the license of the 
  translations of the Encyclopédie. At Wikisource [1], we are 
  interested in collaborating on this translation. However, we 
  would need the license to be compatible with our requirements. 
  Would it be possible to release the 
  translations under a Creative Commons license like CC-BY-SA? [2]
  
  [1] 
  http://en.wikisource.org/wiki/Wikisource:Scriptorium#Translati
  on_of_the_Encyclop.C3.A9die
  [2] http://creativecommons.org/licenses/by-sa/3.0/
  http://creativecommons.org/licenses/by-sa/3.0/legalcode
  
  Regards,
  
  Yann



[Wikisource-l] Wiki to LaTeX (and PDF) converter

2008-02-29 Thread Yann Forget
Hello,
Also interesting for WS.
Yann

 Original Message 
Subject: [Textbook-l] Wiki to LaTeX (and PDF) converter
Date: Fri, 29 Feb 2008 11:19:26 +0100
From: Derbeth derbeth @ wp.pl
Reply-To: Wikimedia textbook discussion [EMAIL PROTECTED]
To: [EMAIL PROTECTED]

I have created an improved version of Hagindaz's Wiki2LaTeX 
(http://en.wikibooks.org/wiki/User:Hagindaz/Wiki2LaTeX) tool, which 
helps create PDF versions of books. It can download a whole book 
with images and automatically generate LaTeX source code, which can be 
directly translated into a PDF file. Some knowledge of LaTeX is still 
required, but the tool can greatly reduce the time needed to prepare a 
PDF version of a textbook.

Currently it supports output in two languages (Polish and English), but 
adding support for new ones is quite easy.

The description of the program can be found at 
http://en.wikibooks.org/wiki/User:Derbeth/javaLatex. Example output is 
here: http://pl.wikibooks.org/wiki/Grafika:C.pdf.

-- 
Derbeth
Jabber id: [EMAIL PROTECTED]

Wikisłownik is more than a dictionary! Check it out: http://pl.wiktionary.org/



Re: [Wikisource-l] [Commons-l] DJVU

2007-10-02 Thread Yann Forget
Hello,

Rama Rama wrote:
 I have more or less translated the page.
 http://commons.wikimedia.org/wiki/Help:Creating_a_DjVu_file
 
 I am surprised that this page is focused on MS-Windows and proprietary
 software. It addresses trivial questions of opening and saving files in
 different formats to convert them, in a very typical Microsoft way:
 Tiff files from Gallica can be opened in FineReader (even after the
 evaluation period is over). By exporting the pages into tiff (same
 format), it is possible to crop the margins with XnView.
 
 We might want to write something more rational.

This page was written mostly by Marc for Windows users.
Actually, I am still a beginner at creating DjVu files on Linux and/or
with free software. Of course, that information needs to be added.
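
For reference, a minimal sketch of the free-software route using the DjVuLibre command-line tools: c44 encodes continuous-tone (grayscale/colour) pages, cjb2 encodes bitonal pages, and djvm bundles the single-page files into one book. The page file names and the 300 dpi setting are illustrative assumptions, not taken from the wiki page.

```python
# Sketch: assemble the DjVuLibre commands that turn scanned pages into one
# bundled DjVu book. The commands are only printed here; with DjVuLibre
# installed, each list could be passed to subprocess.run(cmd, check=True).
import shutil

def encode_page(src: str, dst: str, bitonal: bool = False) -> list:
    """Return the DjVuLibre command encoding one scanned page to DjVu."""
    tool = "cjb2" if bitonal else "c44"   # cjb2 expects PBM; c44 takes PPM/PGM/JPEG
    return [tool, "-dpi", "300", src, dst]

def bundle(book: str, pages: list) -> list:
    """Return the djvm command bundling single-page files into one book."""
    return ["djvm", "-c", book] + pages

cmds = [
    encode_page("page001.ppm", "page001.djvu"),                 # colour/grayscale page
    encode_page("page002.pbm", "page002.djvu", bitonal=True),   # black-and-white page
    bundle("book.djvu", ["page001.djvu", "page002.djvu"]),
]
for cmd in cmds:
    status = "found" if shutil.which(cmd[0]) else "not installed"
    print(f"{status}: {' '.join(cmd)}")
```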

Regards,

Yann


[Wikisource-l] Fwd: Digitization of the Codex Sinaiticus

2007-05-02 Thread Yann Forget
This is of interest to Wikisource.

Regards,

Yann

 Original Message 
Subject: [WikiFR-l] Digitization of the Codex Sinaiticus
Date: Wed, 02 May 2007 05:48:36 +0200
From: Xavier Teyssier
Reply-To: French-language Wikipedia mailing list

Hello,

Apparently, one of the oldest known Bibles will soon be digitized
thanks to the British Library. (Source:
http://www.sur-la-toile.com/mod_News_article_3392___.html )
Has anyone already heard of this project and, above all, do you know
what the British Library's policy is regarding the release of the
digitized works?
Would it be worthwhile for Wikimédia France to get in touch with them
to tip the balance in the right direction?

Regards,

Xavier Teyssier
