Re: [CODE4LIB] Open Source Discovery Portal Camp - November 6 - Philadelphia

2008-10-14 Thread Ogg,Dennis
I was wondering if anybody might be looking to share a hotel room on Wed. and 
Thurs. nights for the Open Source Discovery Portal Camp at PALINET. If so, 
please contact me off list.

Thanks,

Dennis

Dennis C. Ogg
Research and Development Services
Colorado State University Libraries
Fort Collins, CO  80523-1019
970.491.4071
[EMAIL PROTECTED]


[CODE4LIB] Entry-level DL programmer position

2008-10-14 Thread Hugh Cayless
We're hiring for a one-year position to work on Documenting the  
American South (http://docsouth.unc.edu) and related projects.


Please see the announcement here: 
https://s4.its.unc.edu/RAMS4/details.do?reqId=0809408&type=S

Best,
Hugh

/**
 * Hugh A. Cayless, Ph.D
 * Head, Research & Development Group
 * Carolina Digital Library and Archives
 * UNC Chapel Hill
 * [EMAIL PROTECTED]
 */


Re: [CODE4LIB] A metadata tool that scales

2008-10-14 Thread David Kennedy

Will,

We had very similar requirements to those in your email and on your blog,
and we developed a web-based administrative interface to our Fedora
repository to meet them.  It lets us manage the metadata and content in our
repository directly, and it scales in the sense that we can manage all of our
digital object types (image, TEI essay, video, EAD finding aid, book, and
collection) through one interface.

Some of the features of our admin interface:
* ability to create new digital objects
* ability to search existing digital objects
* ability to upload/replace/delete content
* ability to edit metadata
* ability to assign digital objects to collection(s)
* ability to feature digital objects in collection(s)
* data integrity checking based on digital object type and collection-based
metadata rules; digital object candidate statuses
* automated creation of derivatives, at least for images -- thumbnails and
Zoomify tilesets (see the sketch after these lists)
* web-based descriptive metadata editing screens
* ability to manipulate structural and relationship metadata of digital objects
* controlled vocabulary for metadata entry

Some features being worked on:
* dynamic lookup during metadata entry for semi-controlled vocabulary lists
* online help for metadata editing
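
To give a flavor of the derivatives step, the thumbnail part boils down to
something like the sketch below (Python with Pillow; this is illustrative
only, not our production code -- the paths, sizes and library choice are
placeholders, and Zoomify tiling is a separate, more involved process):

    from PIL import Image  # Pillow; an assumption, not necessarily what our tool uses

    def make_thumbnail(master_path, thumb_path, max_size=(200, 200)):
        """Derive a JPEG thumbnail from a master image file."""
        img = Image.open(master_path).convert("RGB")  # JPEG output needs an RGB image
        img.thumbnail(max_size)                       # resizes in place, keeping aspect ratio
        img.save(thumb_path, "JPEG", quality=85)

    make_thumbnail("masters/0001.tif", "derivatives/0001_thumb.jpg")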

In our admin interface, we have developed a tool that meets our needs, but
it is not flexible enough to meet the needs of the wider field.  It is
currently tied to our Fedora implementation, our data model, and our choice
of metadata schemes.

Dave


--
David Kennedy
Manager, Digital Collections and Research
University of Maryland
3199 McKeldin Library
College Park, MD 20742
[EMAIL PROTECTED]
(301) 405-9051
(301) 314-9408 FAX


Will Sexton wrote:

In January of 2007 I sent a post to the Web4lib list titled "Metadata
tools that scale."  At Duke we were seeking opinions about a software
platform to capture metadata for digital collections and finding
databases.  The responses to that inquiry suggested that what we were
seeking didn't exist.

About a year ago, an OCLC report on a survey of 18 member institutions,
"RLG Programs Descriptive Metadata Practices Survey Results," supported
that basic conclusion.  When asked about the tools they used to
"create, edit and store metadata descriptions" of digital and physical
resources, a sizable majority responded with a "customized" or "homegrown" tool.

Since my initial inquiry, we launched a new installation of our digital
collections at http://library.duke.edu/digitalcollections/.  Yet we still
lack a full-featured software platform for capturing descriptive metadata.

We did our own informal survey of peer institutions building digital
collections, which further reinforced that familiar conclusion -- there
are lots of Excel spreadsheets, Access and FileMaker databases, etc., out
there, but no available enterprise-level solution (and we're still happy
to be wrong on this point).

We also articulated a detailed set of specifications for a metadata
tool.  The library has committed to hiring two programmers, each on a
two-year appointment, to produce a tool that meets these specs.  I just
posted the job description, for which there are two openings, to this list.

I have a longer version of this post on our digital collections blog
(http://library.duke.edu/blogs/digital-collections/2008/10/10/a-metadata-tool-that-scales/),
listing our specifications in more detail.  But here are some of the
basics:

* Digitization:  integrates with, or provides a module for, management of
digitization workflow.

* Description:  supports a collections-based data model; flexible metadata
schema (for us, the "Duke Core", derived from qualified Dublin Core);
authority lists; cardinality and required-field constraints; metametadata
(i.e., flagging, notations and status indicators for individual items);
access control; simple and intuitive use.

* Publication:  exports METS documents as well as other common formats
(CSV, etc.); see the sketch after this list.

* Asset Management:  must be compatible with an asset management policy.
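
To make the Publication spec concrete, a minimal METS export can be as simple
as wrapping the descriptive record and a pointer to the master file in a METS
document.  The sketch below (Python, standard library only) is illustrative
rather than a proposed implementation; the record structure, URLs and file
names are placeholders:

    import xml.etree.ElementTree as ET

    METS = "http://www.loc.gov/METS/"
    DC = "http://purl.org/dc/elements/1.1/"
    XLINK = "http://www.w3.org/1999/xlink"
    for prefix, uri in (("mets", METS), ("dc", DC), ("xlink", XLINK)):
        ET.register_namespace(prefix, uri)

    def export_mets(record):
        """Wrap a dict of Dublin Core fields plus a file URL in a minimal METS document."""
        root = ET.Element("{%s}mets" % METS)

        # Descriptive metadata: Dublin Core elements wrapped in a dmdSec.
        dmd = ET.SubElement(root, "{%s}dmdSec" % METS, ID="dmd1")
        wrap = ET.SubElement(dmd, "{%s}mdWrap" % METS, MDTYPE="DC")
        xml_data = ET.SubElement(wrap, "{%s}xmlData" % METS)
        for field, value in record["dc"].items():
            ET.SubElement(xml_data, "{%s}%s" % (DC, field)).text = value

        # File section pointing at the master file.
        file_sec = ET.SubElement(root, "{%s}fileSec" % METS)
        grp = ET.SubElement(file_sec, "{%s}fileGrp" % METS, USE="master")
        f = ET.SubElement(grp, "{%s}file" % METS, ID="f1", MIMETYPE=record["mimetype"])
        ET.SubElement(f, "{%s}FLocat" % METS,
                      {"LOCTYPE": "URL", "{%s}href" % XLINK: record["url"]})

        # Structural map linking the description to the file.
        smap = ET.SubElement(root, "{%s}structMap" % METS)
        div = ET.SubElement(smap, "{%s}div" % METS, DMDID="dmd1")
        ET.SubElement(div, "{%s}fptr" % METS, FILEID="f1")

        return ET.tostring(root, encoding="unicode")

    print(export_mets({
        "dc": {"title": "Example item", "creator": "Duke University Libraries"},
        "mimetype": "image/tiff",
        "url": "http://example.org/masters/0001.tif",
    }))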

While the Duke specifications are particular to our internal needs, I
think we captured a lot of what makes the need for a full-featured
metadata tool so widely felt across the field.  I have some ideas about how to go
about implementing this set of specifications, but thought I'd see if the
concept might spur discussion on CODE4LIB.  How would you approach this
project?  Any thoughts on architecture, platform, data models,
methodologies?

Will
--
Will Sexton
Metadata Analyst / Programmer
Duke University Libraries



Re: [CODE4LIB] another distribution question: got it!

2008-10-14 Thread Ken Irwin

Quoth Erik:

There is no difference between tar files created on 32 bit or 64 bit
machines. There can be differences between GNU tar & (for instance)
solaris tar. What problem are you having specifically?

This is the error I was getting:

tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors

Our sysadmin thought the problem was that server #1 has a 64-bit
architecture and server #2 is a 32-bit machine.

But Erik's hint about the GNU/Solaris divide led me in the right
direction: I found someone describing a tar+GNU+Solaris problem that shows up
when files are FTP'd as text rather than as binary. I transferred the files in
binary mode and everything was fine. I had mistakenly believed that was
the default mode...
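
For anyone who hits the same thing: the fix is just to force binary mode on
the transfer -- at the command-line ftp client, type "binary" before "put".
Scripted, it looks roughly like this (a quick sketch with Python's ftplib;
the host, login and filenames are made up):

    import ftplib

    ftp = ftplib.FTP("server2.example.edu")
    ftp.login("username", "password")
    with open("myapp.tar.gz", "rb") as f:
        # storbinary issues TYPE I (binary) before STOR, so the bytes of the
        # archive are transferred untouched -- no line-ending translation.
        ftp.storbinary("STOR myapp.tar.gz", f)
    ftp.quit()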


Thanks!
Ken


Re: [CODE4LIB] another distribution question: 32- or 64-bit tarballs

2008-10-14 Thread Erik Hetzner
At Tue, 14 Oct 2008 10:21:59 -0400,
Ken Irwin <[EMAIL PROTECTED]> wrote:
>
> Hi all,
>
> Thanks to everyone who responded last week about creating an
> installation workflow. I've got that mostly sorted out, and am on to the
> next stage.
>
> I thought it would be a simple matter to TAR up my pile of files. But as
> soon as I tried installing the package on a new server, I ran into
> trouble: my library server is a 64-bit machine, and the second server is
> 32-bits. I normally consider that information to be unimportant to daily
> life (read: I really don't know or usually care...). But to my surprise
> that seems to mean that the tarball from one doesn't work on the other.
>
> Is this a common problem? Is there a way around it? Are tarballs really
> mutually unintelligible? I don't generally recall seeing two versions of
> software being distributed. Is there a standard approach to dealing with
> this? It's just a pile of text (php, sql, html) and image files --
> there's no compiled code of any sort. I would have thought it was sort
> of architecture-neutral. Except it seems that the packaging mechanism
> itself is a problem.
>
> I did check out the book Erik recommended: http://producingoss.com/ to
> see what it has to say about this matter; all it says is "Use Tar!" with
> no ambiguity about architecture. Is everyone in the world on 64-bit
> architecture except this one test server that I have access to?
>
> Any advice?

There is no difference between tar files created on 32 bit or 64 bit
machines. There can be differences between GNU tar & (for instance)
solaris tar. What problem are you having specifically?
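
If portability across tar implementations is a worry, one option (a
suggestion only, not something you necessarily need here) is to write the
archive in the plain POSIX ustar format, which avoids GNU-specific
extensions that vendor tars sometimes reject. For example, with Python's
tarfile module (filenames are placeholders):

    import tarfile

    # ustar is the lowest-common-denominator format; note it limits member
    # path names to roughly 100 characters (255 with the prefix field).
    with tarfile.open("myapp.tar.gz", "w:gz", format=tarfile.USTAR_FORMAT) as tar:
        tar.add("myapp", arcname="myapp")  # adds the directory recursively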

best,
Erik Hetzner
;; Erik Hetzner, California Digital Library
;; gnupg key id: 1024D/01DB07E3




Re: [CODE4LIB] another distribution question: 32- or 64-bit tarballs

2008-10-14 Thread Doran, Michael D
Hi Ken,

> ...the tarball from one doesn't work on the other.

Can you be a little more specific as to what error/problem you encountered?

-- Michael

# Michael Doran, Systems Librarian
# University of Texas at Arlington
# 817-272-5326 office
# 817-688-1926 mobile
# [EMAIL PROTECTED]
# http://rocky.uta.edu/doran/
  

> -Original Message-
> From: Code for Libraries [mailto:[EMAIL PROTECTED] On 
> Behalf Of Ken Irwin
> Sent: Tuesday, October 14, 2008 9:22 AM
> To: CODE4LIB@LISTSERV.ND.EDU
> Subject: [CODE4LIB] another distribution question: 32- or 
> 64-bit tarballs
> 
> Hi all,
> 
> Thanks to everyone who responded last week about creating an 
> installation workflow. I've got that mostly sorted out, and 
> am on to the 
> next stage.
> 
> I thought it would be a simple matter to TAR up my pile of 
> files. But as 
> soon as I tried installing the package on a new server, I ran into 
> trouble: my library server is a 64-bit machine, and the 
> second server is 
> 32-bits. I normally consider that information to be 
> unimportant to daily 
> life (read: I really don't know or usually care...). But to 
> my surprise 
> that seems to mean that the tarball from one doesn't work on 
> the other.
> 
> Is this a common problem? Is there a way around it? Are 
> tarballs really 
> mutually unintelligible? I don't generally recall seeing two 
> versions of 
> software being distributed. Is there a standard approach to 
> dealing with 
> this? It's just a pile of text (php, sql, html) and image files -- 
> there's no compiled code of any sort. I would have thought it 
> was sort 
> of architecture-neutral. Except it seems that the packaging mechanism 
> itself is a problem.
> 
> I did check out the book Erik recommended: 
> http://producingoss.com/ to 
> see what it has to say about this matter; all it says is "Use 
> Tar!" with 
> no ambiguity about architecture. Is everyone in the world on 64-bit 
> architecture except this one test server that I have access to?
> 
> Any advice?
> 
> Thanks!
> Ken
> 
> -- 
> Ken Irwin
> Reference Librarian
> Thomas Library, Wittenberg University
> 


[CODE4LIB] another distribution question: 32- or 64-bit tarballs

2008-10-14 Thread Ken Irwin

Hi all,

Thanks to everyone who responded last week about creating an 
installation workflow. I've got that mostly sorted out, and am on to the 
next stage.


I thought it would be a simple matter to TAR up my pile of files. But as 
soon as I tried installing the package on a new server, I ran into 
trouble: my library server is a 64-bit machine, and the second server is 
a 32-bit machine. I normally consider that information to be unimportant to daily 
life (read: I really don't know or usually care...). But to my surprise 
that seems to mean that the tarball from one doesn't work on the other.


Is this a common problem? Is there a way around it? Are tarballs really 
mutually unintelligible? I don't generally recall seeing two versions of 
software being distributed. Is there a standard approach to dealing with 
this? It's just a pile of text (php, sql, html) and image files -- 
there's no compiled code of any sort. I would have thought it was sort 
of architecture-neutral. Except it seems that the packaging mechanism 
itself is a problem.


I did check out the book Erik recommended: http://producingoss.com/ to 
see what it has to say about this matter; all it says is "Use Tar!" with 
no ambiguity about architecture. Is everyone in the world on 64-bit 
architecture except this one test server that I have access to?


Any advice?

Thanks!
Ken

--
Ken Irwin
Reference Librarian
Thomas Library, Wittenberg University


Re: [CODE4LIB] A metadata tool that scales

2008-10-14 Thread Jonathan Rochkind
I believe that the University of Rochester's XC (eXtensible Catalog) project
includes a component focused on building such a tool; you may want to
inquire/coordinate with them.

Will Sexton wrote:
> In January of 2007 I sent a post to the Web4lib list titled "Metadata
> tools that scale."  At Duke we were seeking opinions about a software
> platform to capture metadata for digital collections and finding
> databases.  The responses to that inquiry suggested that what we were
> seeking didn't exist.
>
> About a year ago, an OCLC report on a survey of 18 member institutions,
> "RLG Programs Descriptive Metadata Practices Survey Results," supported
> that basic conclusion.  When asked about the tools that they used to
> "create, edit and store metadata descrptions" of digital and physical
> resources, a sizable majority responded "customized" or "homegrown" tool.
>
> Since my initial inquiry, we launched a new installation of our digital
> collections at http://library.duke.edu/digitalcollections/.  Yet we still
> lack a full-featured software platform for capturing descriptive metadata.
>
> We did our own informal survey of peer institutions building digital
> collections, which further reinforced that familiar conclusion -- there
> are lots of Excel spreadsheets, Access and FileMaker databases, etc., out
> there, but no available enterprise-level solution (and we're still happy
> to be wrong on this point).
>
> We also articulated a detailed series of specifications for a metadata
> tool.  The library has committed to hiring two programmers each to a
> two-year appointment for producing a tool that meets these specs.  I just
> posted on this list the job description, for which there are two openings.
>
> I have a longer version of this post on our digital collections blog
> (http://library.duke.edu/blogs/digital-collections/2008/10/10/a-metadata-tool-that-scales/),
> listing our specifications in more detail.  But here are some of the
> basics:
>
> * Digitization:  integrates with, or provides a module for, management of
> digitization workflow.
>
> * Description:  supports a collections-based data model; flexible metadata
> schema (for us, the "Duke Core", derived from qualified Dublin Core);
> authority lists; cardinality and required-field constraints; metametadata
> (i.e., flagging, notations and status indicators for individual items);
> access control; simple and intuitive use.
>
> * Publication:  exports METS documents as well as other common formats
> (CSV, etc.).
>
> * Asset Management:  must be compatible with an asset management policy.
>
> While the Duke specifications are particular to our internal needs, I
> think we captured a lot of what makes the need for a full-featured
> metadata tool felt around the field.  I have some ideas about how to go
> about implementing this set of specifications, but thought I'd see if the
> concept might spur discussion on CODE4LIB.  How would you approach this
> project?  Any thoughts on architecture, platform, data models,
> methodologies?
>
> Will
> --
> Will Sexton
> Metadata Analyst / Programmer
> Duke University Libraries
>

-- 
Jonathan Rochkind
Digital Services Software Engineer
The Sheridan Libraries
Johns Hopkins University
410.516.8886 
rochkind (at) jhu.edu






Re: [CODE4LIB] Vote for NE code4lib meetup location

2008-10-14 Thread Jay Luker
New England code4libbers,

Today's your last chance to vote your preference for where we gather. The
location voting thingy will close at 11:55pm. Our current strategy in the
event of a tie is to hope there isn't one. Maybe folks (like me) who are OK
with any of the choices could check the results later tonight and move some
of their ballot points around if it looks like a tie is in the cards.

--jay


On Fri, Oct 10, 2008 at 9:24 AM, Jay Luker <[EMAIL PROTECTED]> wrote:

> It's time to do a quick vote on where we'd like to hold our first New
> England gathering. If you are interested in attending please cast your
> ballot at http://dilettantes.code4lib.org/voting_booth/election/index/5.
>
> We'll keep voting open for a few days (at least through Tuesday). You
> can give from 0 to 3 points to each location, so more points to your
> 1st choice, less to your 2nd fave, etc. You can also go back and
> change your votes any time. Use the same login as you would at the main
> code4lib.org site. Yes, this means you have to be registered at
> code4lib.org already. Hopefully you got all that straightened out when
> you cast your ballot for the 2009 conference keynotes, right?
>
> Thanks to Ross Singer and his supertastic Diebold-O-Tron for setting this
> up.
>
> --jay
>


Re: [CODE4LIB] Position available, Project Analyst, Oxford University (kw's: semantic web, linked data)

2008-10-14 Thread John Fereira

Benjamin O'Steen wrote:

In a nutshell, we are building a system to capture the research
information infrastructure: linking people, departments, grants,
funders, articles, theses, books and data together. 


(Technical information: using RDF and a mix of published and homegrown
ontologies, and using an objectstore to act as a serialised, archival
base for the information. Evidence, context and provenance will be a
strong focus.)
  
You might want to take a look at the Vivo project developed here at 
Cornell.  It does exactly what you are describing above.  Check out:


http://vivo.cornell.edu/
http://vitro.mannlib.cornell.edu/

You might want to contact Jon Corson-Rikert, the project manager,
directly at: [EMAIL PROTECTED]


[CODE4LIB] Position available, Project Analyst, Oxford University (kw's: semantic web, linked data)

2008-10-14 Thread Benjamin O'Steen
In a nutshell, we are building a system to capture the research
information infrastructure: linking people, departments, grants,
funders, articles, theses, books and data together. 

(Technical information: using RDF and a mix of published and homegrown
ontologies, and using an objectstore to act as a serialised, archival
base for the information. Evidence, context and provenance will be a
strong focus.)
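
To give a rough picture of the kind of linking involved, here is a toy
sketch using rdflib -- the namespace, URIs and property names below are
invented for illustration and are not our actual ontologies:

    from rdflib import Graph, Namespace, URIRef, Literal
    from rdflib.namespace import RDF

    BRII = Namespace("http://example.ox.ac.uk/brii/")   # placeholder namespace
    FOAF = Namespace("http://xmlns.com/foaf/0.1/")

    g = Graph()
    researcher = URIRef("http://example.ox.ac.uk/person/1234")
    grant = URIRef("http://example.ox.ac.uk/grant/ABC-001")
    article = URIRef("http://example.ox.ac.uk/article/5678")

    # Link a person, a grant and an article together.
    g.add((researcher, RDF.type, FOAF.Person))
    g.add((researcher, FOAF.name, Literal("A. Researcher")))
    g.add((researcher, BRII.holdsGrant, grant))
    g.add((article, BRII.fundedBy, grant))
    g.add((article, BRII.author, researcher))

    print(g.serialize(format="turtle"))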

 Forwarded Message 
> From: Sally Rumsey <[EMAIL PROTECTED]>
> Subject: BRII job ad

> OXFORD UNIVERSITY LIBRARY SERVICES, Systems and E-Research Service
> (SERS)
> 
> Building the Research Information Infrastructure (BRII) Project
> 
> BRII Project Analyst
> 
> Oxford
> 
> Grade 7: Salary £27,466 - £33,780 p.a.
> 
> Full time, fixed term to March 2010
> 
>  
> 
> ORA (Oxford University Research Archive), the repository for Oxford
> research outputs, and the Oxford Medical Sciences Division Research
> Database Service have joined forces to create an innovative solution
> for research information management. The JISC-funded project to create
> this new system, BRII, will forge connections between researchers,
> grants, projects and publications. It will provide web-based services
> to disseminate and reuse this information in new contexts and for new
> purposes.
> 
>  
> 
> We are seeking a project analyst who has excellent demonstrable
> communication skills, both written and oral, who will liaise between
> BRII project staff including software developers, and members of staff
> in academic and administrative departments across the University. You
> will be able to communicate the purpose and design of the project,
> technical developments and plans to non-technical end users.
> 
>  
> 
> Working as part of a small team, your duties will include running
> stakeholder and user analyses, synthesising the findings and
> translating them into requirements to be used by technical developers.
> You will consult with end users, be involved with running testing and
> with dissemination of the project.
> 
>  
> 
> Further details and an application form are available from OULS
> Personnel: tel 01865 277622, or email [EMAIL PROTECTED]. From
> 17th October, further details and an application form will be available
> from www.ox.ac.uk/about_the_university/jobs/index.html. The revised closing
> date for applications is 5pm on Friday 7th November 2008. 
> 
>  
> 
> Please quote reference BL8088
> 
>  
> 
> The University of Oxford is an equal opportunities employer
> 
> _
> 
> Sally Rumsey
> 
> ORA Service & Development Manager
> 
> Oxford University Library Services
> 
> University of Oxford
> 
> [EMAIL PROTECTED]
> 
> 01865 283860
> 
>  
> 
>