[CODE4LIB] OAI9: Call for Posters

2015-03-19 Thread Thomas Krichel
  
  If you wish to bring a poster to the workshop, you are invited to
  submit a short abstract giving details of your project. The poster
  should be of interest to OAI9 participants and directly related to
  the general themes of the workshop
  (http://indico.cern.ch/event/332370/page/6).
  
  Posters will be displayed in Campus Biotech and an extended coffee
  break will take place on Thursday 18 June 2015. This will give
  attendees the chance to view the posters and discuss them with their
  authors. Attendees will also have the opportunity to vote for the
  poster which delivers the most impact. Posters should be A0 in size
  (841 x 1189 mm) for portrait or A1 (594 x 841 mm) for landscape. Any
  special equipment requests should be addressed to the workshop
  organisers when a poster has been accepted.
  
  If your poster is accepted you should still register for the workshop
  as normal and you will be expected to pay your own expenses. Owing to
  the high demand for accommodation, we advise you to register early -
  you may cancel your registration later if your submission is not
  successful.
  
  Poster abstracts can be submitted from 16 March 2015 to 17 April
  2015 after a quick Lightweight Accounts registration process
  (different from the conference registration). Decisions will be made
  on an ongoing basis (and no later than the end of April) and communicated
  to the submitters (http://indico.cern.ch/event/332370/call-for-abstracts/).
  
  Key dates
  ==
  Abstract submission opening day:  16 March 2015
  Abstract submission deadline: 17 April 2015
  
  Print service 
  == 
  If you wish, your poster can be printed by the University of Geneva
  print service and delivered to Campus Biotech on Thursday 18
  June. If you are interested, send your work in PDF format to
  dimitri.do...@unige.ch before 17 May 2015. (Please note that posters
  created with Microsoft PowerPoint should be sent in both PDF and PPT
  formats.) The cost of this service is CHF 33.-, to be paid on delivery
  at the main desk.
  
  We look forward to receiving your abstracts – and seeing your posters.


  Cheers,

  Thomas Krichel  http://openlib.org/home/krichel
  skype:thomaskrichel


Re: [CODE4LIB] talking about digital collections vs electronic resources

2015-03-19 Thread Jenn C
Thank you so much for all the replies; these are all very helpful! When
building the prototype for this particular page listing digitized
collections, I had put Digital Collections as the header out of habit
essentially because I know that's what we call them. The group working on
the page is going to do some more thinking about the labeling.

(To give some more info on what we were trying to do: this is a list of
collection-level records for collections Cornell has digitized. Cataloged
digitized collections can definitely be found along with everything else in
the catalog. The purpose of the list is to highlight these collections
and to perhaps make them easier to find. We don't have a digital
collection facet in our Blacklight catalog yet, though we like how
Stanford has set theirs up. These collections can be cataloged as a variety
of different formats as well - databases, websites, books, etc. so there
really isn't an obvious way to look at or narrow your search to them in the
catalog. It might be that the page won't get a lot of use because the
collections can be discovered in the catalog, but it will be available if
someone would like to see a list of such things.)

On Thu, Mar 19, 2015 at 7:57 AM, McDonald, Stephen steve.mcdon...@tufts.edu
 wrote:

 My question would be, why are you trying to keep them separate?  Why not
 group them all together?  People don't want to have to look all over the
 place to find what they want.  They want it all in one place.



Re: [CODE4LIB] talking about digital collections vs electronic resources

2015-03-19 Thread McDonald, Stephen
My question would be, why are you trying to keep them separate?  Why not group 
them all together?  People don't want to have to look all over the place to 
find what they want.  They want it all in one place.


Re: [CODE4LIB] talking about digital collections vs electronic resources

2015-03-19 Thread Kyle Banerjee
On Wed, Mar 18, 2015 at 9:51 AM, Laura Krier laura.kr...@gmail.com wrote:

 I think too often we present our collections to students through the
 framework of our own workflows and functional handling of materials


This.

We also try too hard to convey distinctions that aren't important to users
for the sake of technical accuracy. As a result, we sometimes introduce
problems that are worse than what we were trying to solve in the first place.

There is also the issue that many people find library materials through
mechanisms other than library-provided silos -- particularly networked
resources. In reality, a significant percentage of these users don't even
realize they're using the library.

kyle


Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Riley Childs
We had a similar issue on our lab machines; there was a GPO we created to
fix it, though I'd need to look it up in Group Policy.
And what is so wrong with Windows 8.1? Part of our speed issues were resolved 
by reimaging onto 8.1 Enterprise.
//Riley

Sent from my Windows Phone

--
Riley Childs
Senior
Charlotte United Christian Academy
Library Services Administrator
IT Services Administrator
(704) 537-0331x101
(704) 497-2086
rileychilds.net
@rowdychildren
I use Lync (select External Contact on any XMPP chat client)

CONFIDENTIALITY NOTICE:  This email and any files transmitted with it are the 
property of Charlotte United Christian Academy.  This e-mail, and any 
attachments thereto, is intended only for use by the addressee(s) named herein 
and may contain confidential information that is privileged and/or exempt from 
disclosure under applicable law.  If you are not one of the named original 
recipients or have received this e-mail in error, please permanently delete the 
original and any copy of any e-mail and any printout thereof. Thank you for 
your compliance.  This email is also subject to copyright. No part of it nor 
any attachments may be reproduced, adapted, forwarded or transmitted without 
the written consent of the copyright ow...@cucawarriors.com


From: Dan Alexander dalexan...@nekls.org
Sent: 3/19/2015 3:36 PM
To: CODE4LIB@LISTSERV.ND.EDU
Subject: Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

Any chance of using a thaw space for that part of the profile?

On Thu, Mar 19, 2015 at 2:31 PM, Will Martin w...@will-martin.net wrote:

 In our computer labs, we currently use Deep Freeze.[1] It lets us grant
 our users full administrative rights, without worrying about malware,
 viruses, and such, because any changes the user makes are wiped out when
 they log off.

 A couple of years ago, the campus as a whole switched to PaperCut for
 managing print jobs.[2] This maintains separate print queues for each
 student, so that when they swipe their student card at the print release
 station, they see only their own print jobs.  Convenient!  At least
 compared to Pharos, the old system.

 Unfortunately, there's a nasty side-effect, which is that it takes a
 loong time to log into the lab computers.  Generally 5-6 minutes,
 sometimes as much as 10.  What's happening is:

 1) A student logs in with their Active Directory credentials
 2) The computer checks for a user profile and doesn't find one
 3) The computer creates a new windows profile for the student (slooow!)
 4) When they log off, Deep Freeze wipes out the profile.

 The fact that the computer has to download, install, and configure the
 PaperCut print drivers makes Step 3 even slower.  They're per-user.
 They're baked into the user profile, so they get created fresh every time
 and wiped out again afterwards.

 As a recent comment on Yik-Yak put it: "Patience is waiting for the
 library computers to log you on."

 We're currently on Windows 8 (yuck), but the problem occurred with 7 as
 well.

 We've talked about removing Deep Freeze and simply placing the computers
 on restricted accounts with no permissions to install software, etc.  That
 would *partially* address it, because profiles would no longer be wiped
 out.  As long as students went to the same computer over and over, they'd
 only be faced with a long logon the first time.  But, of course, it's a lab
 and there's no guarantee you can get the same computer all the time, so
 that's a poor solution at best.



 [1] http://www.faronics.com/products/deep-freeze/enterprise/
 [2] http://www.papercut.com/




--
Dan Alexander
Technology Coordinator
Northeast Kansas Library System

785-838-4090
4317 W. 6th St.
Lawrence, KS 66049

*WANT ME TO REMOTE INTO YOUR COMPUTER?*
Download the NEKLS hosted ScreenConnect software to your computer from this
link:

*goo.gl/Lwg33y http://goo.gl/Lwg33y*

Once you have run the software, NEKLS staff will be able to access your
computer.


[CODE4LIB] Job: Digital Humanities Intern (NYC or Ann Arbor) at JSTOR

2015-03-19 Thread jobs
Digital Humanities Intern (NYC or Ann Arbor)
JSTOR
New York City

At ITHAKA, we think nothing is better than knowing we are having a positive
impact on the world. We impact the lives of millions of people and thousands
of institutions every day both in ground-breaking ways and in small ways that
mean a lot.

  
Our Organization

  
We have a bold mission. We work with leaders in the global higher education
community to advance and preserve knowledge and to improve teaching and
learning through the use of digital technologies. We are passionate about the
value of education and are driven to deploy technologies to make our
universities, colleges, and high schools better, more affordable, and more
effective, and to reach beyond these traditional walls to support learners
everywhere.

  
In two decades, we have launched some of the most transformative and widely
used services in higher education: JSTOR, Portico, and Ithaka S+R. Our 300+
employees in the United States, Europe, and Asia work closely with our user
communities, day in and day out, to build and continuously improve upon these
services, and to identify new opportunities to expand access to knowledge and
learning. As a successful organization in a demanding and dynamic environment,
we challenge ourselves to retain an entrepreneurial spirit that pursues and
embraces change.

  
Mission and Funding

  
We are proud to be a not-for-profit organization, but our status is a
reflection of our mission, not our funding model. Our work across these
services is highly valued in the global higher education community, and we
cover our costs by collecting fees in exchange for the access, preservation,
and research and consulting services we provide. That exerts a real discipline
on our operations in that we must continually adapt to the needs of our
audiences to be worthy of their support. Thousands of higher education and
related institutions around the world are ITHAKA's primary financial
supporters; their JSTOR participation fees provide 88% of ITHAKA's revenue.
Our 99% renewal rate for JSTOR generates stable recurring income, alleviating
the dependency on fundraising that many not-for-profits experience. Because
the organizations share our commitment to our mission, we have the financial
resources necessary to maintain a great work environment that encourages
innovation and excellence.

  
The Role

  
The Digital Humanities Intern (DH) will work with the JSTOR Labs team to extend
Understanding Shakespeare, its partnership project with the Folger Shakespeare
Library. Understanding Shakespeare has shown a new way of connecting primary
texts with the literature about them, and the DH Intern will play a pivotal
role in making this resource even more transformative. To do so, the DH Intern
will work with the JSTOR Labs team to create a public API to the data within
Understanding Shakespeare. He or she will then create a series of public
demonstrator visualizations and applications on top of the API, answering
questions such as: which plays have shown steady academic interest over time,
and which have been the most trendy? How have quotation rates of male vs.
female characters in Shakespeare's plays changed over time? What are the
differences between disciplines in most-cited play, character, and line?

  
You can view an exciting and informative video of the job description by
clicking on this link.

  
https://vimeo.com/121995805

  
  
Our organization and this role will provide you with an opportunity few other
companies can offer, including:

  
• The content: JSTOR has an unparalleled breadth and depth of content that,
paired with the Folger's scrupulously tagged digital editions of Shakespeare's
plays, will give the DH Intern a significant scholarly sandbox to play in.

• The exposure: JSTOR's traffic and ability to publicize the DH Intern's
Demonstrator project provide a unique chance for your project to get the
visibility that will help your career.

• The team: Launched less than a year ago, the JSTOR Labs team has established
itself as deeply innovative and truly impactful both within ITHAKA and the
wider scholarly community.

• The opportunity: This is no make-copies-and-tag-along-while-we-work
internship. You will have the chance to have a true impact, and, in doing so,
will be expected to have the entrepreneurial skills needed to drive this
project through to completion.

  
The duration of this paid internship is approximately 8-10 weeks with the
possibility of an extension. While the DH Intern will need to be self-directed
and poly-skilled in order to accomplish this task, they will have ample
support in the JSTOR Labs team, who will assist technically in the creation of
the API and then in marketing and distributing the demonstrator
applications. In addition, the DH Intern will be able to take advantage of
JSTOR's cutting edge and cloud-enabled technical infrastructure. The Labs
infrastructure runs on Linux-based servers hosted by Amazon Web 

[CODE4LIB] Job: Linder Digital Archive Summer Fellowship at The HistoryMakers

2015-03-19 Thread jobs
Linder Digital Archive Summer Fellowship
The HistoryMakers
Chicago

The HistoryMakers is pleased to announce The James A. Lindner Digital Archive
Summer Fellowship, in honor of James A. Lindner, for his leadership role in
the moving image archival profession, as well as his role in having the
Library of Congress serve as the permanent repository for The HistoryMakers
Collection. The James A. Lindner Digital Archive Summer Fellow should exhibit
a passion and commitment to working with digital moving image archives.

  
The purpose of The James A. Lindner Digital Archive Summer Fellowship is to
provide hands-on experience working with a one-of-a-kind digital video oral
history archive, and a professional and focused experience in archival work,
structured around processing and preservation of moving image archival
collections, migration and digitization, cataloging and archival descriptive
practices and standards. The James A. Lindner Digital Archive Summer
Fellowship is open to any individual who is interested in working with
collections of African American and/or video oral history materials and is a
recent graduate of a master's program in archival science, archival
management, digital archives, special collections, library science,
information management, computer science, or a related program prior to the
start date of the fellowship. Further application eligibility and guidelines
are outlined below.

  
The HistoryMakers is a growing and dynamic 501(c)(3) not-for-profit
organization dedicated to creating an unprecedented national video oral
history archival institution recording the stories of both well-known and
unsung African American HistoryMakers. The goal is to record at least 5,000
oral history interviews and to expose this material to the public through
strategic media, technology, academic and community partnerships. In June
2014, the nation's foremost repository--the Library of Congress--announced
that it will serve as the permanent repository of The HistoryMakers collection.

  
Stipend: $5,000

Fellowship Duration: 10 weeks (Monday, June 1, 2015 - Friday, August 7, 2015)

  
Position Description: The James A. Lindner Digital Archive Summer Fellow's
primary tasks will include the arrangement, description and preservation of
The HistoryMakers Collection. The Fellow will help migrate digital footage,
enter metadata into The HistoryMakers FileMaker Pro database, and process The
HistoryMakers video oral history interviews, both analog and born-digital, as
well as captioning photographs and multimedia submissions. The Fellow will
gain valuable experience working with The HistoryMakers unique Digital Archive
and creating finding aids. The Fellow's duties will also include assisting in
digital curation and preparing descriptive, technical, and other metadata. The
Fellow will learn about employing best practices to ensure the long-term
availability and discoverability of the digital content in The
HistoryMakers Collection. The Fellow will work with FileMaker Pro as an
electronic resource for tracking and indexing collection materials online or
through other media. This includes managing the care and handling of born-
digital and analog collection materials yet to be digitized.

  
Eligibility

The requirements for consideration are:

  * Citizen or permanent resident of the United States.
  * GPA of 3.50 or higher.
  * Recent graduate (within six months) of a master's program in archival 
science, archival management, digital archives, special collections, library 
science, information management, computer science, or a related program.
  * Demonstrated interest in oral history interviews, archive administration 
and management. This interest can be demonstrated through academic coursework, 
volunteer or work experience, or through a personal statement in the
application essay.
  * Demonstrated interest in African American history. This interest can be 
demonstrated through academic coursework, volunteer or work experience, or
through a personal statement in the application essay.
Lodging: Lodging arrangements are the responsibility of the fellow. Applicants
will be provided with information on local housing options upon acceptance as
The James A. Lindner Digital Archive Summer Fellow.

  
Application Procedures:

Submit the following for consideration:

  * Cover letter stating your interest in the fellowship and your future career 
goals (please include an email address and a daytime telephone number).
  * Essay or written statement (750 - 1000 words) addressing one or all of the 
following:
  * What attracts you to The HistoryMakers archives (especially the Digital 
Archive) and/or the moving image profession;
  * Your interest in African American history and/or oral history interviews; 
and/or,
  * The importance of this fellowship to your future career.
  * Resume or CV indicating your academic background, work experience, and 
volunteer service.
  * Undergraduate and graduate transcript. Also include 

[CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Will Martin
In our computer labs, we currently use Deep Freeze.[1] It lets us grant 
our users full administrative rights, without worrying about malware, 
viruses, and such, because any changes the user makes are wiped out when 
they log off.


A couple of years ago, the campus as a whole switched to PaperCut for 
managing print jobs.[2] This maintains separate print queues for each 
student, so that when they swipe their student card at the print release 
station, they see only their own print jobs.  Convenient!  At least 
compared to Pharos, the old system.


Unfortunately, there's a nasty side-effect, which is that it takes a 
loong time to log into the lab computers.  Generally 5-6 minutes, 
sometimes as much as 10.  What's happening is:


1) A student logs in with their Active Directory credentials
2) The computer checks for a user profile and doesn't find one
3) The computer creates a new windows profile for the student (slooow!)
4) When they log off, Deep Freeze wipes out the profile.

The fact that the computer has to download, install, and configure the 
PaperCut print drivers makes Step 3 even slower.  They're per-user.  
They're baked into the user profile, so they get created fresh every 
time and wiped out again afterwards.


As a recent comment on Yik-Yak put it: "Patience is waiting for the
library computers to log you on."


We're currently on Windows 8 (yuck), but the problem occurred with 7 as 
well.


We've talked about removing Deep Freeze and simply placing the computers 
on restricted accounts with no permissions to install software, etc.  
That would *partially* address it, because profiles would no longer be 
wiped out.  As long as students went to the same computer over and over, 
they'd only be faced with a long logon the first time.  But, of course, 
it's a lab and there's no guarantee you can get the same computer all 
the time, so that's a poor solution at best.




[1] http://www.faronics.com/products/deep-freeze/enterprise/
[2] http://www.papercut.com/


Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Dan Alexander
Any chance of using a thaw space for that part of the profile?

On Thu, Mar 19, 2015 at 2:31 PM, Will Martin w...@will-martin.net wrote:

 In our computer labs, we currently use Deep Freeze.[1] It lets us grant
 our users full administrative rights, without worrying about malware,
 viruses, and such, because any changes the user makes are wiped out when
 they log off.

 A couple of years ago, the campus as a whole switched to PaperCut for
 managing print jobs.[2] This maintains separate print queues for each
 student, so that when they swipe their student card at the print release
 station, they see only their own print jobs.  Convenient!  At least
 compared to Pharos, the old system.

 Unfortunately, there's a nasty side-effect, which is that it takes a
 loong time to log into the lab computers.  Generally 5-6 minutes,
 sometimes as much as 10.  What's happening is:

 1) A student logs in with their Active Directory credentials
 2) The computer checks for a user profile and doesn't find one
 3) The computer creates a new windows profile for the student (slooow!)
 4) When they log off, Deep Freeze wipes out the profile.

 The fact that the computer has to download, install, and configure the
 PaperCut print drivers makes Step 3 even slower.  They're per-user.
 They're baked into the user profile, so they get created fresh every time
 and wiped out again afterwards.

 As a recent comment on Yik-Yak put it: "Patience is waiting for the
 library computers to log you on."

 We're currently on Windows 8 (yuck), but the problem occurred with 7 as
 well.

 We've talked about removing Deep Freeze and simply placing the computers
 on restricted accounts with no permissions to install software, etc.  That
 would *partially* address it, because profiles would no longer be wiped
 out.  As long as students went to the same computer over and over, they'd
 only be faced with a long logon the first time.  But, of course, it's a lab
 and there's no guarantee you can get the same computer all the time, so
 that's a poor solution at best.



 [1] http://www.faronics.com/products/deep-freeze/enterprise/
 [2] http://www.papercut.com/




-- 
Dan Alexander
Technology Coordinator
Northeast Kansas Library System

785-838-4090
4317 W. 6th St.
Lawrence, KS 66049

*WANT ME TO REMOTE INTO YOUR COMPUTER?*
Download the NEKLS hosted ScreenConnect software to your computer from this
link:

*goo.gl/Lwg33y http://goo.gl/Lwg33y*

Once you have run the software, NEKLS staff will be able to access your
computer.


Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Dan Alexander
Faronics Data Igloo might actually be what you want...

"retain vital data across restarts on a Frozen workstation in a Thawed
partition. The operating system is still on a Frozen partition and remains
fully protected. With Data Igloo user created files, documents, settings,
favorites, AV Updates or even entire user profiles are retained across
reboots"

On Thu, Mar 19, 2015 at 2:36 PM, Dan Alexander dalexan...@nekls.org wrote:

 Any chance of using a thaw space for that part of the profile?

 On Thu, Mar 19, 2015 at 2:31 PM, Will Martin w...@will-martin.net wrote:

 In our computer labs, we currently use Deep Freeze.[1] It lets us grant
 our users full administrative rights, without worrying about malware,
 viruses, and such, because any changes the user makes are wiped out when
 they log off.

 A couple of years ago, the campus as a whole switched to PaperCut for
 managing print jobs.[2] This maintains separate print queues for each
 student, so that when they swipe their student card at the print release
 station, they see only their own print jobs.  Convenient!  At least
 compared to Pharos, the old system.

 Unfortunately, there's a nasty side-effect, which is that it takes a
 loong time to log into the lab computers.  Generally 5-6 minutes,
 sometimes as much as 10.  What's happening is:

 1) A student logs in with their Active Directory credentials
 2) The computer checks for a user profile and doesn't find one
 3) The computer creates a new windows profile for the student (slooow!)
 4) When they log off, Deep Freeze wipes out the profile.

 The fact that the computer has to download, install, and configure the
 PaperCut print drivers makes Step 3 even slower.  They're per-user.
 They're baked into the user profile, so they get created fresh every time
 and wiped out again afterwards.

 As a recent comment on Yik-Yak put it: "Patience is waiting for the
 library computers to log you on."

 We're currently on Windows 8 (yuck), but the problem occurred with 7 as
 well.

 We've talked about removing Deep Freeze and simply placing the computers
 on restricted accounts with no permissions to install software, etc.  That
 would *partially* address it, because profiles would no longer be wiped
 out.  As long as students went to the same computer over and over, they'd
 only be faced with a long logon the first time.  But, of course, it's a lab
 and there's no guarantee you can get the same computer all the time, so
 that's a poor solution at best.



 [1] http://www.faronics.com/products/deep-freeze/enterprise/
 [2] http://www.papercut.com/




 --
 Dan Alexander
 Technology Coordinator
 Northeast Kansas Library System

 785-838-4090
 4317 W. 6th St.
 Lawrence, KS 66049

 *WANT ME TO REMOTE INTO YOUR COMPUTER?*
 Download the NEKLS hosted ScreenConnect software to your computer from
 this link:

 *goo.gl/Lwg33y http://goo.gl/Lwg33y*

 Once you have run the software, NEKLS staff will be able to access your
 computer.




-- 
Dan Alexander
Technology Coordinator
Northeast Kansas Library System

785-838-4090
4317 W. 6th St.
Lawrence, KS 66049

*WANT ME TO REMOTE INTO YOUR COMPUTER?*
Download the NEKLS hosted ScreenConnect software to your computer from this
link:

*goo.gl/Lwg33y http://goo.gl/Lwg33y*

Once you have run the software, NEKLS staff will be able to access your
computer.


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Andrew Nisbet
Elasticsearch is a NoSQL database 
(http://www.slideshare.net/DmitriBabaev1/elastic-search-moscow-bigdata-cassandra-sept-2013-meetup)
 and much easier to install and manage than Mongo or CouchDB. 

Why 'boggle'? It's a 'hello world' sketch: no exception guarding, hard-coded
URLs, and other embarrassing no-nos...

... ok, fine https://github.com/anisbet/hist
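
For anyone curious, here is a rough sketch of that kind of loader (this is not the actual hist code; the caret-delimited field layout, two-character data codes, and index name are assumptions to illustrate the approach):

```python
import json

def parse_history_line(line):
    """Split a Symphony-style history log line into a command code plus
    data codes. Assumes caret-delimited fields whose first two characters
    name the data code -- adjust for your actual log layout."""
    fields = [f for f in line.strip().split("^") if f]
    entry = {"command": fields[0]}
    for field in fields[1:]:
        entry[field[:2]] = field[2:]  # two-char data code -> value
    return entry

def to_bulk_body(lines, index="symphony-hist"):
    """Build a newline-delimited JSON body for Elasticsearch's _bulk API,
    pairing an action line with each parsed document."""
    parts = []
    for line in lines:
        parts.append(json.dumps({"index": {"_index": index}}))
        parts.append(json.dumps(parse_history_line(line)))
    return "\n".join(parts) + "\n"  # a bulk body must end with a newline
```

Posting batches of the result to the _bulk endpoint replaces one HTTP round trip per record with one per batch, which is where the speedup over record-at-a-time curl loading comes from.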

Edmonton Public Library
Andrew Nisbet
ILS Administrator

T: 780.496.4058   F: 780.496.8317

-Original Message-
From: Code for Libraries [mailto:CODE4LIB@LISTSERV.ND.EDU] On Behalf Of Cary 
Gordon
Sent: March-19-15 1:15 PM
To: CODE4LIB@LISTSERV.ND.EDU
Subject: Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

Has anyone considered using a NoSQL database to store their logs? With enough 
memory, Redis might be interesting, and it would be fast.

The concept of "too experimental to post to Github" boggles the mind.

Cary


 On Mar 19, 2015, at 9:38 AM, Andrew Nisbet anis...@epl.ca wrote:
 
 Hi Bill,
 
 I have been doing some work with Symphony logs using Elasticsearch. It is 
 simple to install and use, though I recommend Elasticsearch: The Definitive 
 Guide (http://shop.oreilly.com/product/0636920028505.do). The main problem is 
 the size of the history logs, ours being on the order of 5,000,000 lines per 
 month. 
 
 Originally I used a simple python script to load each record. The script 
 broke down each line into the command code, then all the data codes, then 
 loaded them using curl. This failed initially because Symphony writes 
 extended characters to title fields. I then ported the script to python 3.3 
 which was not difficult, and everything loaded fine -- but took more than a 
 to finish a month's worth of data. I am now experimenting with Bulk 
 (http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html)
  to improve performance.
 
 I would certainly be willing to share what I have written if you would like. 
 The code is too experimental to post to Github, however.
 
 Edmonton Public Library
 Andrew Nisbet
 ILS Administrator
 
 T: 780.496.4058   F: 780.496.8317
 
 -Original Message-
 From: Code for Libraries [mailto:CODE4LIB@LISTSERV.ND.EDU] On Behalf Of 
 William Denton
 Sent: March-18-15 3:55 PM
 To: CODE4LIB@LISTSERV.ND.EDU
 Subject: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?
 
 I'm going to analyze a whack of transaction logs from our Symphony ILS so 
 that we can dig into collection usage.  Any of you out there done this?  
 Because the system is so closed and proprietary I understand it's not easy 
 (perhaps
 impossible?) to share code (publicly?), but if you've dug into it I'd be 
 curious to know, not just about how you parsed the logs but then what you did 
 with it, whether you loaded bits of data into a database, etc.
 
 Looking around, I see a few examples of people using the system's API, but 
 that's it.
 
 Bill
 --
 William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Jason Stirnaman
I've been using the ELK stack (Elasticsearch + Logstash(1) + Kibana)(2) for EZProxy 
log analysis.
Yes, the index can grow really fast with log data, so I have to be selective 
about what I store. I'm not familiar with the Symphony log format, but Logstash 
has filters to handle just about any data that you want to parse, including 
multiline. Maybe for some log entries, you don't need to store the full entry 
at all but only a few bits or a single tag?
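
One way to store only a few bits per entry is to trim each line down to the fields you actually query before it ever reaches the index. A minimal Python sketch (the NCSA combined-style pattern is an assumption; match it to your own EZProxy LogFormat directive):

```python
import re

# Assumed NCSA combined-style EZProxy line; adjust to your LogFormat.
LINE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] "(?P<request>[^"]*)"'
)

def slim(entry):
    """Keep only the few fields worth indexing; return None on junk lines."""
    m = LINE.match(entry)
    if not m:
        return None
    return {"ts": m.group("ts"), "user": m.group("user"),
            "host": m.group("host")}
```

Dropping unmatched lines up front plays the same role as a grok filter plus a prune stage in Logstash, and keeps the index from ballooning.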

And because it's Ruby underneath, you can filter using custom Ruby. I use that 
to do LDAP lookups on user names so we can get department and user-type stats.

1. http://logstash.net/
2. https://www.elastic.co/downloads


Jason

Jason Stirnaman, MLS
Application Development, Library and Information Services, IR
University of Kansas Medical Center
jstirna...@kumc.edu
913-588-7319

On Mar 19, 2015, at 2:15 PM, Cary Gordon listu...@chillco.com wrote:

 Has anyone considered using a NoSQL database to store their logs? With enough 
 memory, Redis might be interesting, and it would be fast.
 
 The concept of "too experimental to post to Github" boggles the mind.
 
 Cary
 
 
 On Mar 19, 2015, at 9:38 AM, Andrew Nisbet anis...@epl.ca wrote:
 
 Hi Bill,
 
 I have been doing some work with Symphony logs using Elasticsearch. It is 
 simple to install and use, though I recommend Elasticsearch: The Definitive 
 Guide (http://shop.oreilly.com/product/0636920028505.do). The main problem 
 is the size of the history logs, ours being on the order of 5,000,000 lines 
 per month. 
 
 Originally I used a simple python script to load each record. The script 
 broke down each line into the command code, then all the data codes, then 
 loaded them using curl. This failed initially because Symphony writes 
 extended characters to title fields. I then ported the script to python 3.3 
  which was not difficult, and everything loaded fine -- but took more than a day 
  to finish a month's worth of data. I am now experimenting with Bulk 
 (http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html)
  to improve performance.
 
 I would certainly be willing to share what I have written if you would like. 
 The code is too experimental to post to Github however.
 
 Edmonton Public Library
 Andrew Nisbet
 ILS Administrator
 
 T: 780.496.4058   F: 780.496.8317
 
 -Original Message-
 From: Code for Libraries [mailto:CODE4LIB@LISTSERV.ND.EDU] On Behalf Of 
 William Denton
 Sent: March-18-15 3:55 PM
 To: CODE4LIB@LISTSERV.ND.EDU
 Subject: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?
 
 I'm going to analyze a whack of transaction logs from our Symphony ILS so 
 that we can dig into collection usage.  Any of you out there done this?  
 Because the system is so closed and proprietary I understand it's not easy 
 (perhaps
 impossible?) to share code (publicly?), but if you've dug into it I'd be 
 curious to know, not just about how you parsed the logs but then what you 
 did with it, whether you loaded bits of data into a database, etc.
 
 Looking around, I see a few examples of people using the system's API, but 
 that's it.
 
 Bill
 --
 William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/


Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Will Martin

Crud, I sent that last without finishing it.

We've been chasing our tails in a circle over the issue for the last 
year and a half.  Any suggestions?


Will Martin


Re: [CODE4LIB] talking about digital collections vs electronic resources

2015-03-19 Thread Dave Caroline
And what percentage try the web before they come to your search, knowing
from experience that you separated all the data into silos with obscure
names? I settled on one overall search with facets in the result.

Dave Caroline


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Michelle Suranofsky
Hi Bill,

I have been working on parsing our logs so we can migrate all of our
historical circ transactions into OLE.  I was recently able to use the data
pulled out of the logs to provide circ counts to our acq department for a
vendor provided spreadsheet of items/isbns (that we had purchased).

After using the Sirsi api to pull all of the charges and renewals out of
the logs I’ve been using java to parse through these text files and insert
the information into a sqlite database (as a ‘staging’ database).  From
there the transactions can be queried (and for me...prepped to migrate).
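(My pipeline is Java, but the staging step is small enough to sketch in Python. The column names and the charge-record fields below are made up for illustration, not Symphony's actual format.)

```python
import sqlite3

# Hypothetical charge records already parsed out of the logs
# (field names are assumptions, not Symphony's).
charges = [
    {"item_id": "31221001", "user_id": "u100", "date": "2015-03-01", "type": "CV"},
    {"item_id": "31221002", "user_id": "u101", "date": "2015-03-02", "type": "RV"},
]

conn = sqlite3.connect(":memory:")  # a file path in real use
conn.execute("""CREATE TABLE staging (
    item_id TEXT, user_id TEXT, txn_date TEXT, txn_type TEXT)""")
conn.executemany(
    "INSERT INTO staging VALUES (:item_id, :user_id, :date, :type)", charges)

# Circ counts per item -- the kind of query the acq spreadsheet needed
counts = dict(conn.execute(
    "SELECT item_id, COUNT(*) FROM staging GROUP BY item_id"))
```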

I would be happy to share my code/process with you.

Michelle
mis...@lehigh.edu

On Wed, Mar 18, 2015 at 5:55 PM, William Denton w...@pobox.com wrote:

 I'm going to analyze a whack of transaction logs from our Symphony ILS so
 that we can dig into collection usage.  Any of you out there done this?
 Because the system is so closed and proprietary I understand it's not easy
 (perhaps impossible?) to share code (publicly?), but if you've dug into it
 I'd be curious to know, not just about how you parsed the logs but then
 what you did with it, whether you loaded bits of data into a database, etc.

 Looking around, I see a few examples of people using the system's API, but
 that's it.

 Bill
 --
 William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Adam Constabaris
Bill,

If you are talking about parsing Sirsi transaction logs specifically, it's
fairly straightforward to do so with regular expressions and a small amount
of code.  We warehouse data extracted from our logs every night.
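As a rough illustration only -- the example line here is an invented stand-in, since the actual history-log layout varies by site and version:

```python
import re

# Invented example line: timestamp, command code, then caret-delimited
# two-letter data codes (the layout is an assumption for illustration).
line = "E201503191402 CV ^NQ31221001234567^UOu1000042^"

m = re.match(r"E(?P<ts>\d+) (?P<cmd>\w+) (?P<data>.*)", line)
record = {"ts": m.group("ts"), "cmd": m.group("cmd")}
# Each ^XXvalue chunk becomes a field keyed by its two-letter data code.
record.update(dict(re.findall(r"\^(\w{2})([^^]*)", m.group("data"))))
```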

If you're talking about working with data retrieved from Sirsi's APIs  more
generally, quite a bit of that can also be done without too much effort.

cheers,

AC

On Thu, Mar 19, 2015 at 9:39 AM, Michelle Suranofsky mis...@lehigh.edu
wrote:

 Hi Bill,

 I have been working on parsing our logs so we can migrate all of our
 historical circ transactions into OLE.  I was recently able to use the data
 pulled out of the logs to provide circ counts to our acq department for a
 vendor provided spreadsheet of items/isbns (that we had purchased).

 After using the Sirsi api to pull all of the charges and renewals out of
 the logs I’ve been using java to parse through these text files and insert
 the information into a sqlite database (as a ‘staging’ database).  From
 there the transactions can be queried (and for me...prepped to migrate).

 I would be happy to share my code/process with you.

 Michelle
 mis...@lehigh.edu

 On Wed, Mar 18, 2015 at 5:55 PM, William Denton w...@pobox.com wrote:

  I'm going to analyze a whack of transaction logs from our Symphony ILS so
  that we can dig into collection usage.  Any of you out there done this?
  Because the system is so closed and proprietary I understand it's not
 easy
  (perhaps impossible?) to share code (publicly?), but if you've dug into
 it
  I'd be curious to know, not just about how you parsed the logs but then
  what you did with it, whether you loaded bits of data into a database,
 etc.
 
  Looking around, I see a few examples of people using the system's API,
 but
  that's it.
 
  Bill
  --
  William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/



[CODE4LIB] ALA Annual 2015 Call for Proposals - ALCTS Technical Services Workflow Efficiency Interest Group

2015-03-19 Thread Glerum, Margaret
This message has been sent out to multiple lists. Please excuse any duplication.


The Technical Services Workflow Efficiency Interest Group (TSWEIG) invites 
proposals for presentations and/or discussion points for ALA's 2015 Annual 
Meeting in San Francisco. The group will be meeting Monday, June 29, 2015 from 
1:00-2:30 PM.



TSWEIG's charge is to provide a forum to exchange information and discuss 
techniques, new developments, problems, technological advances, and emerging 
trends in the workflows associated with the evaluation, selection, 
acquisition, and discovery of library materials and resources.



If you or any of your colleagues are interested in discussing creative ways 
that Technical Services departments have made efficiency changes and/or 
implemented new services, submit your discussion topics and/or a proposal!



Please email your proposal and ideas directly to Michael Winecoff and Annie 
Glerum (not the listserv) by Tuesday, April 15, 2015.

Thanks,

Annie and Michael


Annie Glerum
Head of Complex Cataloging
Florida State University
agle...@fsu.edu

Michael Winecoff
Associate University Librarian for Technical Services
University of North Carolina at Charlotte
mkwin...@uncc.edu


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Cary Gordon
Has anyone considered using a NoSQL database to store their logs? With enough 
memory, Redis might be interesting, and it would be fast.

The concept of "too experimental to post to Github" boggles the mind.

Cary


 On Mar 19, 2015, at 9:38 AM, Andrew Nisbet anis...@epl.ca wrote:
 
 Hi Bill,
 
 I have been doing some work with Symphony logs using Elasticsearch. It is 
 simple to install and use, though I recommend Elasticsearch: The Definitive 
 Guide (http://shop.oreilly.com/product/0636920028505.do). The main problem is 
 the size of the history logs, ours being on the order of 5,000,000 lines per 
 month. 
 
 Originally I used a simple python script to load each record. The script 
 broke down each line into the command code, then all the data codes, then 
 loaded them using curl. This failed initially because Symphony writes 
 extended characters to title fields. I then ported the script to python 3.3 
  which was not difficult, and everything loaded fine -- but took more than a day 
  to finish a month's worth of data. I am now experimenting with Bulk 
 (http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html)
  to improve performance.
 
 I would certainly be willing to share what I have written if you would like. 
 The code is too experimental to post to Github however.
 
 Edmonton Public Library
 Andrew Nisbet
 ILS Administrator
 
 T: 780.496.4058   F: 780.496.8317
 
 -Original Message-
 From: Code for Libraries [mailto:CODE4LIB@LISTSERV.ND.EDU] On Behalf Of 
 William Denton
 Sent: March-18-15 3:55 PM
 To: CODE4LIB@LISTSERV.ND.EDU
 Subject: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?
 
 I'm going to analyze a whack of transaction logs from our Symphony ILS so 
 that we can dig into collection usage.  Any of you out there done this?  
 Because the system is so closed and proprietary I understand it's not easy 
 (perhaps
 impossible?) to share code (publicly?), but if you've dug into it I'd be 
 curious to know, not just about how you parsed the logs but then what you did 
 with it, whether you loaded bits of data into a database, etc.
 
 Looking around, I see a few examples of people using the system's API, but 
 that's it.
 
 Bill
 --
 William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/


Re: [CODE4LIB] Deep Freeze + PaperCut + Windows profiles

2015-03-19 Thread Will Martin
Building profiles in a thawspace would be a partial solution; it'd allow 
for shorter login times if people go back to the same computer.


It'd be nice if we could pre-generate profiles for everybody, but the 
numbers don't work.


Each profile runs to about 100 MB;
We have 208 GB free on each lab machine;
and about 15,000 potential users.

So generating profiles for all of them -- assuming five minutes per 
profile -- would take 52 days of computing time at the beginning of each 
term, and require about 1.5 TB of space on each computer.
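The back-of-the-envelope arithmetic checks out:

```python
users, mb_per_profile, minutes_each = 15_000, 100, 5

storage_tb = users * mb_per_profile / 1_000_000    # MB -> TB
compute_days = users * minutes_each / (60 * 24)    # minutes -> days

# roughly 1.5 TB of profiles and ~52 days of generation time per machine
```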


I'm hoping somebody will know a nifty trick for slimming down what needs 
to be created, or making PaperCut load faster, or something.


Will


[CODE4LIB] Job: Library Discovery and Integrated System Analyst/Coordinator at Princeton University

2015-03-19 Thread jobs
Library Discovery and Integrated System Analyst/Coordinator
Princeton University
Princeton

Responsible, along with colleagues in the Library Systems Office, for managing
the configuration and back office settings for the Library's discovery layer
products (currently Voyager, Primo, and Summon) as well as the integrated
resources management systems used by Library staff (currently Voyager,
Meridian, and SFX).

  
Also responsible for helping to create and regularly maintain the various
online and batch interfaces between the discovery layers and the ILS, as well
as from the ILS to third party systems, among them PeopleSoft, GFA, Borrow
Direct, Aeon, and others. Responsible for significant data extracting and
reporting using Access, SQL and other reporting tools. Complex reports and
queries are created for Technical and Public Services staffs, for ARL
statistics and other national organizations that collect library data. Manages
various data loads and feeds, including financial, patron, and some
bibliographic.

  
Manages the Library's cloud-based Stack Map system, which provides online maps
of our branch libraries, and also the Library's instance of OnBase, a business
document management system used by several Library units. Plays an active,
important role in managing system data integrity, keeping up with maintenance
requirements and new release installation oversight, and, if need be, recovery
and restoration. Provides documentation for performing these various tasks,
especially those that lack documentation.

  
Analyzes new library system products with the aim of making strategic
recommendations, choices, and decisions about next-generation migration.
Manages such migrations, including comprehensive data migration as well as
configuration choices and policy decisions. Routinely interacts and collaborates
with many staff in the Library and in various University departments, as well
as with software vendors.

  
The position reports to the Deputy University Librarian.

  
Applications will be accepted only from the Jobs at Princeton website:
http://www.princeton.edu/jobs and must include a resume, cover letter, and a
list of 3 references with full contact information.

  
**Essential Qualifications:**  

  * Bachelor's degree from an accredited university.
  * In-depth knowledge of Library Integrated System data formats and 
structures, both past and future.
  * Demonstrated experience with SQL/RDBMS systems.
  * Familiarity with using Linux/Unix.
  * Experience using at least two scripting tools, such as Visual Basic, Bash, 
Perl, PHP, Ruby, and/or Python, as well as experience with HTML/CSS.
  * Knowledge of library catalog data.
  * Knowledge of XML markup for library data.
  * Experience with Unicode, and library related non-Roman character encoding.
  * Ability to troubleshoot issues with library systems or library data and to 
manage solutions.
  * Excellent communication skills.
  
**Preferred Qualifications:**  

  * MLS from an ALA accredited Library Information School.
  * Demonstrated knowledge of HTTP techniques and RESTful protocols.
  * Familiarity with Linked Data conventions.
  * Reading knowledge of at least one foreign language.
  * The final candidate will be required to complete a background check 
successfully.



Brought to you by code4lib jobs: http://jobs.code4lib.org/job/20003/
To post a new job please visit http://jobs.code4lib.org/


Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

2015-03-19 Thread Andrew Nisbet
Hi Bill,

I have been doing some work with Symphony logs using Elasticsearch. It is 
simple to install and use, though I recommend Elasticsearch: The Definitive 
Guide (http://shop.oreilly.com/product/0636920028505.do). The main problem is 
the size of the history logs, ours being on the order of 5,000,000 lines per 
month. 

Originally I used a simple python script to load each record. The script broke 
down each line into the command code, then all the data codes, then loaded them 
using curl. This failed initially because Symphony writes extended characters 
to title fields. I then ported the script to python 3.3 which was not 
difficult, and everything loaded fine -- but took more than a day to finish a 
month's worth of data. I am now experimenting with Bulk 
(http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html) 
to improve performance.
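For what it's worth, the Bulk API just wants newline-delimited JSON -- an action line followed by a document line per record -- so the payload can be assembled without any client library. A minimal sketch (the index/type names and the parsed fields are assumptions for illustration):

```python
import json

# Hypothetical documents already parsed out of a history log
docs = [
    {"cmd": "CV", "item": "31221001234567", "user": "u1000042"},
    {"cmd": "RV", "item": "31221009876543", "user": "u1000099"},
]

def bulk_payload(docs, index="hist", doc_type="txn"):
    """Build the NDJSON body for POST /_bulk: one action line + one doc line each."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the Bulk API requires a trailing newline

payload = bulk_payload(docs)
# payload is then POSTed to http://localhost:9200/_bulk (via curl or a client)
```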

I would certainly be willing to share what I have written if you would like. 
The code is too experimental to post to Github however.

Edmonton Public Library
Andrew Nisbet
ILS Administrator

T: 780.496.4058   F: 780.496.8317

-Original Message-
From: Code for Libraries [mailto:CODE4LIB@LISTSERV.ND.EDU] On Behalf Of William 
Denton
Sent: March-18-15 3:55 PM
To: CODE4LIB@LISTSERV.ND.EDU
Subject: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

I'm going to analyze a whack of transaction logs from our Symphony ILS so that 
we can dig into collection usage.  Any of you out there done this?  Because the 
system is so closed and proprietary I understand it's not easy (perhaps
impossible?) to share code (publicly?), but if you've dug into it I'd be 
curious to know, not just about how you parsed the logs but then what you did 
with it, whether you loaded bits of data into a database, etc.

Looking around, I see a few examples of people using the system's API, but 
that's it.

Bill
--
William Denton ↔  Toronto, Canada ↔  https://www.miskatonic.org/