[CODE4LIB] Project Coordinator position @ UCLA
Hi all, The Southern Regional Library Facility @ UCLA is seeking a project coordinator to help with some exciting metadata- and systems-related projects - really interesting work in a great environment. I'm glad to chat with anyone about the position - more info & application details are available below. Full disclosure - I direct the sister regional library facility (the NRLF) at UC Berkeley. Best, Erik

The complete postings, which include the position descriptions, complete qualifications, and application procedures, are available on the UCLA Career Opportunities Website at https://hr.mycareer.ucla.edu and on the UCLA Library Employment and HR Website at http://www.library.ucla.edu/about/employment-human-resources

Reporting to the Director, Southern Regional Library Facility (SRLF), the Collections Project Coordinator (CPC) provides management and leadership for the Regional Library Facility Integrated Library System Design and Implementation Team (RLFILSDIT) project, which addresses the integration of Northern Regional Library Facility (NRLF) integrated library system (ILS) data into the UCLA integrated library system, Voyager, and for the internal SRLF space reclamation (SR) project, whereby 100,000+ duplicate volumes currently housed in the SRLF will be pulled, evaluated, and reshelved or deselected. The CPC works with stakeholders to articulate goals, define scope, and prioritize deliverables; collaborates with project teams to develop project plans and timelines; and monitors progress during project implementation, communicating status to stakeholders, SRLF project teams, and Library management. The CPC identifies and investigates new and/or improved workflows, develops record-keeping methodologies, and provides statistical reports.

The Southern Regional Library Facility (SRLF) houses low-use library materials from the five southern UC campuses and also houses the University of California Shared Print Archive.
Located on the northwest corner of campus, the SRLF provides environmentally controlled high-density shelving for books, archives, and other library materials. Phase 1 and 2 Stacks have a combined capacity of approximately 7 million volume equivalents, and the SRLF currently holds approximately 6.5 million volume equivalents. -- Erik Mitchell http://erikmitchell.info
[CODE4LIB] Data Management position at UC Berkeley
Please forgive duplication. The University Library and Research IT at the University of California, Berkeley are currently recruiting for a Research Data Management Service Design Analyst to help define and develop a UC Berkeley Research Data Management service. This is an exciting opportunity to be part of a growing collaboration at UCB. Search for job 19194 at http://jobs.berkeley.edu

This position will work with a team of data management and library professionals on the following activities and goals:
- Build community with researchers, service providers, and other stakeholders, on campus and with our external partners
- Raise awareness regarding research data management needs on campus
- Identify existing services, service gaps, and the constraints and barriers that researchers experience as they manage research data
- Provide outreach on and support for data management services on campus
- Make it easier for researchers to find, procure, and use existing services
- Participate in experimental projects that demonstrate how UC Berkeley can begin to help researchers with their data management needs
- Identify components for a multi-year program and budget request

http://research-it.berkeley.edu/blog/15/01/20/join-our-team-research-data-management-service-design-analyst -- Erik Mitchell http://erikmitchell.info
[CODE4LIB] Director, Southern Regional Library Facility (UCLA)
Code4Lib colleagues - UCLA is looking for a Director of the Southern Regional Library Facility, a high-density storage facility that serves the 10 campuses of the University of California. In addition to managing the facility and staff, this position is responsible for strategic initiatives in collection management and shared print, as well as a robust digitization operation. Speaking as a technology-focused librarian who serves as the Director of the Northern Regional Library Facility, I can say that there are a lot of opportunities in this position to explore automation, digital preservation, shared collections, and other exciting topics in LIS. If you would like to talk more about the position please feel free to get in touch. Erik -- Erik Mitchell Associate University Librarian Director of Digital Initiatives and Collaborative Services Director, Northern Regional Library Facility University of California, Berkeley emitch...@berkeley.edu http://erikmitchell.info

http://www.library.ucla.edu/about/employment-human-resources/staff-positions

Under the general direction of the Associate University Librarian (AUL) for Collection Management and Scholarly Communication, the Director of the Southern Regional Library Facility (SRLF) and Collaborative Shared Print Programs is responsible for the leadership, management, and operations of the SRLF and for Collaborative Shared Print Programs. The Director manages the UC Southern Regional Library Facility (SRLF), a university-wide academic support program stewarding library materials - including special collections, manuscripts, archives, audio-visual collections, and content for the five southern campuses - and stewarding the materials of the UC Shared Print Archives Program.
Responsibilities include planning for the growth of collaborative shared print activities; positioning the SRLF to play a leadership role in a network of shared print repositories; implementing innovative technical and other service enhancements to improve cross-institutional sharing and management of collections; and coordinating and overseeing preservation imaging services, including large-scale digitization and reformatting.

The SRLF is a large-scale, high-density, environmentally controlled collection management facility located on the UCLA campus, with capacity for seven million volume equivalents. It serves the five southern campuses of the University of California - Irvine, Los Angeles, Riverside, San Diego, and Santa Barbara - as well as the northern UC campuses. The SRLF Preservation Imaging Service enables libraries to preserve fragile print materials through microfilm or digital reformatting, and to share the resulting images with other libraries and the general public through Internet/Web access to the UCLA Digital Library and/or the California Digital Library, or through the less vulnerable medium of microfilm. The SRLF participates in the UC Shared Print Archive Program, providing storage for the print copy of select journal titles. The print archive programs held at the SRLF have grown to include the JSTOR Archive, UC Shared Print for Licensed Content (with content fully accessible online), and the Western Regional Storage Trust (WEST Archive), which includes 100+ member libraries and more than 400K journal volumes archived across the WEST membership.

Applicants will be able to view and apply for this job until the posting expiration date of 06-15-2015. You may view the posting and the applicants that have applied for this position by accessing UCLA (https://hr.jobs.ucla.edu).
[CODE4LIB] Call for Submissions: Jesse H. Shera Award for Distinguished Published Research
Please forgive duplication.

Jesse H. Shera Award for Distinguished Published Research
Call for Submissions

The Library Research Round Table of the American Library Association announces the 2016 Jesse H. Shera Award for Distinguished Published Research. The deadline for submitting entries is January 31, 2016. The LRRT Shera Award Committee will judge the entries for the competition. The decision of the Committee will be announced by the LRRT Steering Committee Chair prior to the Annual Conference.

Guidelines
1. All entries must be research articles published in English during the 2015 calendar year.
2. Articles may be nominated by any member of LRRT or by the editors of research journals in the field of library and information studies. No one may nominate more than two articles.
3. All nominated articles must relate in at least a general way to library and information studies. Any research method is acceptable.
4. Authors of nominated articles need not be LRRT members.
5. Articles by joint investigators are eligible, as are articles generated as a result of a research grant or other source of funding.
6. Research articles will be judged on the following points:
   - Definition of the research problem
   - Application of research methods
   - Clarity of the reporting of the research
   - Significance of the conclusions, as judged by the Committee
7. The author(s) of the winning article will receive a certificate.

To nominate or submit an article (or articles) for the 2016 competition, e-mail an electronic copy of each article along with a cover letter, both in PDF format, to o...@ala.org with the subject line: Shera Award, Published Research

Susan Rathbun-Grubb, MSLS, PhD - Chair, Shera Award Committee, LRRT Assistant Professor School of Library and Information Science University of South Carolina 1501 Greene St. Columbia, SC 29208 803.777.0485 srath...@mailbox.sc.edu -- Erik Mitchell http://erikmitchell.info
[CODE4LIB] Data/GIS technology lead at UC Berkeley Library
Hi all, The UC Berkeley University Libraries are seeking a Data Services and GIS Technology Lead. This position will be part of a dynamic group focused on building a new suite of GIS services for the Library and will be positioned to have an impact in a large and exciting research environment. More information is at: http://www.lib.berkeley.edu/LHRD/currentjobs.html#21087 If you would like to talk with me about the position please get in touch. Erik Mitchell -- Associate University Librarian Director of Digital Initiatives and Collaborative Services University of California, Berkeley emitch...@berkeley.edu http://erikmitchell.info
[CODE4LIB] IT Service developer @ UC Berkeley library
Hi all, The UC Berkeley library has an IT position open focused on helping us rethink our approach to service development and management. We are looking for a dedicated Library IT professional who has a passion for service and systems management, expertise in version control, continuous integration and test-driven development and an interest in working in a team-oriented environment. The full position description is at http://www.lib.berkeley.edu/LHRD/currentjobs.html#21197. I will be at Code4Lib next week if you would like to talk more about the position. Best, Erik Mitchell -- Erik Mitchell Associate University Librarian Director of Digital Initiatives and Collaborative Services University of California, Berkeley emitch...@berkeley.edu
[CODE4LIB] Survey of cloud computing adoption in libraries
Please forgive cross posting. This is a second call for participation in this study. Your participation is appreciated! This research study is about cloud computing and virtualization adoption in libraries. It asks questions about the level of adoption and factors that enable or inhibit the use of these technologies in library environments. The survey is open to anyone who works with IT related to libraries (e.g., systems departments, desktop support, campus IT department supporting the library, etc.). Even if your library does not use cloud computing or virtualization technologies your input is still valuable for understanding the landscape of this technology adoption in libraries. To take the survey please follow this link https://uncodum.qualtrics.com/SE/?SID=SV_6mmaLbFa2El3trK Erik Mitchell Assistant Professor College of Information Studies University of Maryland College Park
[CODE4LIB] Free your Metadata workshop - University of Maryland, College Park - Feb 13th
Hi all, Of possible interest to anyone in the Washington DC area . . . On February 13th at noon in Room 2119 of the Hornbake Library on the University of Maryland College Park campus, Seth van Hooland, Max De Wilde and Ruben Verborgh will coordinate a workshop on creating Linked Open Data using common tools. The workshop will feature the work that the presenters have published at http://freeyourmetadata.org and is free and open to the public. To participate in the workshop please bring a laptop with Google Refine (http://code.google.com/p/google-refine/) installed. For more information, please visit http://ipac.umd.edu/news-and-events/special-event-free-your-metadata-why-does-it-matter . For information on getting to the UMD College Park campus and parking please see http://www.transportation.umd.edu/. -- Erik Mitchell, PhD Assistant Professor College of Information Studies University of Maryland, College Park http://erikmitchell.info
Re: [CODE4LIB] Any libraries have their sites hosted on Amazon EC2?
Hi Nate When I was at Wake Forest University we moved a large chunk of our web services to Amazon and it worked out well. We chose Amazon because at the time they were the clear leader in IaaS stuff but since then a number of providers (Linode and Rackspace are two) have emerged as alternatives. As for why we moved that is a long story :) Erik On Feb 21, 2012, at 10:40 PM, Nate Hill wrote: > Apologies for cross-posting. > If yes, I'd love to hear why you chose to and how that is working out for > you. > Thanks! > > -- > Nate Hill > nathanielh...@gmail.com > http://www.natehill.net
Re: [CODE4LIB] Any libraries have their sites hosted on Amazon EC2?
Great thread! At WFU we used reserved AWS instances, which lowered our overall costs but committed us to the Amazon platform for a year. We also wound up grouping most of our services on a large server (~$87 per month after the reservation fee) so that we could take advantage of all of that capacity.

Our infrastructure included 3 servers and about 500 GB of storage (a large production server with 90% of library services, 1 small server for the high-density storage system, and 1 small server for puppet/monitoring/documentation). The reservation fees for these servers were around $1380 per year and we paid approximately $275 total per month for computing costs and disk space. Data transfer and other costs were minimal and are included in the $275. A rough yearly cost for these services comes to $4680. That is a bit more than we were looking at for physical server costs (3 servers at $4000 each with 3 years of paid support), but it meant that we had a lot more flexibility than we would have had with three servers sitting in our campus IT datacenter (without root access). As a side note, we found that storage space was more expensive than CPU time, and we wound up keeping our multi-TB storage array on site instead of in the cloud.

If I were re-building this today I would explore some other options - Rackspace (cheaper CPU time), RightScale (automated server configuration/deployment), Heroku/Google App Engine (free PaaS tiers) - and focus on getting a more robust infrastructure at the same cost (if not with some savings). Rackspace, for example, offers their smallest server at $0.015 per hour without reservation fees, and RightScale offers a free support level that could work well for small/medium-sized libraries. FWIW, when I was pulling together numbers for this email I noticed that Amazon has changed their reservation fees and pricing model.
Depending on which reservation fee I selected, I either saved about $600 per year or spent $300 more per year using the same infrastructure discussed above (http://aws.amazon.com/ec2/pricing/). If you are really interested in cost calculations and ROI, some helpful resources include Yan Han's recent work in ITAL comparing real-world cloud computing costs - http://ejournals.bc.edu/ojs/index.php/ital/article/view/1871/1709 - and Chapter 3 of George Reese's book "Cloud Application Architectures", in which he explores some approaches to calculating ROI for cloud services. Erik Erik Mitchell Assistant Professor College of Information Studies University of Maryland, College Park http://erikmitchell.info, http://ischool.umd.edu On Thu, Feb 23, 2012 at 12:38 AM, Tim Spalding wrote: > We did some tests on it, but found it a very poor fit for a site > dependent on huge amount of data which much be "present" to the > basically the whole system all the time and up-to-date. In other > words, we found it didn't match a site based on MySQL slaves > replicating here and there, and with memcached needing to be spot-on. > Under some circumstances we'd consider shuffling some image rendering > and delivery tasks to it, but that's about it. > > Tim > LibraryThing
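The back-of-the-envelope comparison in the thread above can be sketched in a few lines. The figures (reservation fees, monthly compute charges, physical server quotes) are the approximate numbers from this discussion, so treat this as an illustration of the arithmetic rather than a pricing tool:

```python
# Rough cloud-vs-physical yearly cost comparison, using the
# approximate figures discussed in the thread (all illustrative).

def yearly_cloud_cost(reservation_fee_per_year, monthly_compute):
    """Reservation fees plus 12 months of compute/storage/transfer."""
    return reservation_fee_per_year + 12 * monthly_compute

def yearly_physical_cost(server_price, n_servers, support_years):
    """Amortize purchase price (with bundled support) over the support period."""
    return (server_price * n_servers) / support_years

cloud = yearly_cloud_cost(reservation_fee_per_year=1380, monthly_compute=275)
physical = yearly_physical_cost(server_price=4000, n_servers=3, support_years=3)

print(f"Cloud:    ${cloud}/year")         # $4680/year
print(f"Physical: ${physical:.0f}/year")  # $4000/year
```

This reproduces the ~$680/year premium the email describes paying for root access and flexibility; in practice you would also fold in data transfer, storage tiers, and staff time.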
[CODE4LIB] Anyone implementing common LIS applications on PaaS providers?
Hi all, I have been toying with implementing common LIS applications (e.g., VuFind, DSpace, Blacklight . . .) on PaaS providers like Heroku and Amazon Elastic Beanstalk. So far I have just tried out-of-the-box distributions and have not made much progress, but I was wondering if someone else had tried this or had ideas about what issues I might run into. Thanks, Erik Erik Mitchell Assistant Professor College of Information Studies University of Maryland, College Park http://ischool.umd.edu
Re: [CODE4LIB] Anyone implementing common LIS applications on PaaS providers?
Chris - where did you deploy your SOLR instance and did that create any issues for deployment (other than ignoring files)? Erik On Thu, Mar 29, 2012 at 12:37 PM, Chris Fitzpatrick wrote: > Hey Sean, > > Jah, I did that...my .slugignore is: > tmp/* > log/* > coverage/* > spec/* > koha/* > jetty/* > > That dropped it down to 30 from ~50mb, so that's good . > (koha has some scripts wrote to pull from our ILS). > > I think the slug size is a really minor issue. Heroku says under 25mb > is good, but over 50mb is not so good. Not "Good", but not "Chaotic > Evil" . "Neutral Good". > > > > On Thu, Mar 29, 2012 at 6:26 PM, Sean Hannan wrote: >> If you already have everything indexed in Solr elsewhere, a way to cut down >> the BL slug size is to remove/ignore the SolrMarc.jar. It's pretty sizable. >> >> -Sean >> >> >> On 3/29/12 12:16 PM, "Chris Fitzpatrick" wrote: >> >>> Hi, >>> >>> I've deployed Blacklight on both Heroku and Elastic BeanStalk. >>> >>> Heroku is still a much better choice. The only issue I had was I >>> needed to make sure the sass-rails gem in installed in the :production >>> gem group and not just development. >>> >>> I still have an issue of getting heroku to compile all my >>> sass/coffeescript/etc assets on update, but it actually doesn't seem >>> to make much of an impact on performance. The minor issue is that it >>> would be nice to figure out a way to slim down BL's slug size. The >>> lowest I've been able to get it is about 30mb and Heroku recommends >>> having it be below 25mb. >>> >>> I have not used Heroku's solr service (I still use EC2 for my solr >>> deployments). >>> EngineYard would also be another option. >>> >>> There is also an AMI for DSpace, so deploying that to EC2 should be >>> pretty easy >>> >>> b,chris. >>> >>> >>> >>> On Thu, Mar 29, 2012 at 3:55 PM, Rosalyn Metz wrote: >>>> Erik, >>>> >>>> I haven't tried it (recently) on PaaS providers, but I have on IaaS. 
The >>>> AMIs I've created in association with start up scripts (if you're >>>> interested in seeing those let me know, I'd have to look for them somewhere >>>> or other) mean that the application automagically starts up on its own, all >>>> you need to do is go to the URL. I've used this as a back up method in the >>>> past and I think would be a great way for people to be able to play with >>>> the different apps before committing. >>>> >>>> To this end, I created an AMI for Blacklight a while back: >>>> http://www.rosalynmetz.com/ami-3c10f255/ I guarantee you it is grossly out >>>> of date. I also have instructions on creating an EBS backed AMI: >>>> http://rosalynmetz.com/ideas/2011/04/14/creating-an-ebs-backed-ami/ which >>>> is the method I used for creating the Blacklight AMI. These instructions >>>> are also fairly old, but I still get comments on my blog now and then that >>>> the method works. >>>> >>>> I also played around with it on Heroku, but that was so long ago I don't >>>> think any of the things I learned still apply (this was when Heroku was >>>> fairly new to the scene). Hope some of this helps. >>>> >>>> Rosalyn >>>> >>>> >>>> >>>> On Thu, Mar 29, 2012 at 8:34 AM, Seth van Hooland >>>> wrote: >>>> >>>>> Dear Erik, >>>>> >>>>> Bram Wiercx and myself have given a talk on how to put together a package >>>>> to install CollectiveAccess on Red Hat's OpenShift: >>>>> http://www.dish2011.nl/sessions/open-source-software-platform-collectiveacce >>>>> s-as-a-service-solution >>>>> . >>>>> >>>>> My students are currently happily playing around with CollectiveAccess, >>>>> which they have installed on OpenShift. My teaching assistant Max De Wilde >>>>> has developed clear guidelines on how to run the installation procedure: >>>>> http://homepages.ulb.ac.be/~svhoolan/redhat_ca_install.pdf. >>>>> >>>>> It would be wonderful to aggregate these kind of installation procedure's >>>>> for other types of LIS applications... >>>>> >>>&g
Re: [CODE4LIB] Anyone implementing common LIS applications on PaaS providers?
Neat! Thanks Mark, Erik On Thu, Mar 29, 2012 at 2:19 PM, Mark A. Matienzo wrote: > Like Chris, I've deployed Blacklight on Heroku, and this thread > (particularly Rosalyn's message) has gotten me to write up a quick > HOWTO on the Blacklight wiki [0]. > > For Solr hosting I've used both a VM that I run (on Slicehost) and EC2. > > Mark > > [0] https://github.com/projectblacklight/blacklight/wiki/Blacklight-on-Heroku
Re: [CODE4LIB] Anyone implementing common LIS applications on PaaS providers?
Thank you everyone for giving me some ideas to pursue. I'm going to explore this area a bit more and will be sure to report back if I manage to do something interesting. Erik On Thu, Mar 29, 2012 at 6:57 PM, Jonathan Rochkind wrote: > On 3/29/2012 5:05 PM, Chris Fitzpatrick wrote: >> >> locally and push them rather than rely on Heroku to precompile them >> (currently when I push, Heroku's precompile fails, so it reverts to >> "compile at runtime" mode) if anyone has insight into this, please >> lemme know...I believe having them compile at runtime does slow down >> the application... > > > Have no idea why it's not working in heroku, no experience with heroku > (although I'm familiar with the concept). > > But compile at runtime _will_ slow down your app, yeah. Here's a > stackoverflow I asked on it myself: > > http://stackoverflow.com/questions/8821864/config-assets-compile-true-in-rails-production-why-not > > Compiling locally and then pushing should work, and is arguably better in > some ways (why waste cycles on the production machine compiling assets?) > But, if you choose to compile and check into your source control repo, > here's a trick that will keep it from driving you crazy in development > using your on-disk compiled assets... eh, I can't find the blog post on > google now, but it's something like changing config.assets.path = > "/dev-assets" in environments/development.rb, so in development it will > ignore your on disk compiled assets.
[CODE4LIB] ALA Annual Session: Trends in Cloud Computing, Sunday June 24, 10:30am - 12:00pm
Please forgive duplicate postings: ALA Annual Session sponsored by LITA: Trends in Cloud Computing, Sunday June 24, 10:30am. Please come out for a panel session featuring current research and projects on cloud computing at ALA Annual 2012. Presenters include Yan Han (Arizona State University), David Minor (San Diego Supercomputer Center of UC San Diego), Chris Tonjes (District of Columbia Public Library) and Erik Mitchell (University of Maryland). The 90-minute session will be in Room 206A in the Anaheim Convention Center and will start at 10:30am. More information is available at http://ala12.scheduler.ala.org/node/856 Erik -- Erik Mitchell Assistant Professor College of Information Studies University of Maryland, College Park http://ischool.umd.edu
[CODE4LIB] ALA Annual Session: Current Research on and Use of FRBR in Libraries, Sunday June 24, 8:00am - 10:00am
Please forgive duplicate postings: ALA Annual Session sponsored by ALCTS: Current Research on and Use of FRBR in Libraries, Sunday June 24, 8:00am. Please come out for a panel session featuring current research on and use of FRBR at ALA Annual 2012. Presenters include Jennifer Bowen (University of Rochester), Thomas Hickey (OCLC), Carolyn McCallum (Wake Forest University), Erik Mitchell (University of Maryland, College Park), Athena Salaba (Kent State University) and Yin Zhang (Kent State University). The 120-minute session will be in Room 213AB in the Anaheim Convention Center and will start at 8:00am. More information is available at http://ala12.scheduler.ala.org/m/node/255 Erik Mitchell -- Erik Mitchell Assistant Professor College of Information Studies University of Maryland
Re: [CODE4LIB] haititrust
Hi Eric, I used an OCLC number match to get a sense of overlap at WFU - http://www.erikmitchell.info/2011/05/06/how-much-overlap-do-we-have-with-the-hathitrust/, http://www.erikmitchell.info/2011/05/07/more-on-hathitrust-overlap/. As I recall, I simply pulled the OCLC numbers from the MARC files (perhaps even just their spreadsheets) and did some simple database querying. More recently I have been working with the HT files using text similarity measures (e.g., pylevenshtein) to compare holdings across libraries. This takes a lot of CPU time but has proven to be a pretty good way to compare holdings at a title level, and I suppose with a detailed enough text string (title, pub date, publisher...) you could focus the comparison on expressions/manifestations rather than just titles. Erik On Fri, Aug 3, 2012 at 11:15 AM, Jon Stroop wrote: > You can do an empty query in their catalog, and use the "Original Location" > facet to filter to a holding library. Programatically, I'm not sure, but > you'd probably need to use the Hathi files: > http://www.hathitrust.org/hathifiles. > > -Jon > > > On 08/03/2012 11:07 AM, Eric Lease Morgan wrote: >> >> If I needed/wanted to know what materials held by my library were also in >> the HaitTrust, then programmatically how could I figure this out? In other >> words, do you know of a way to query the HaitTrust and limit the results to >> items my library owns? --Eric Lease Morgan
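The two comparison strategies mentioned in this reply - exact matching on OCLC numbers and fuzzy title-level comparison - can be sketched as below. This is a simplified illustration, not the actual scripts: real MARC extraction needs a library like pymarc, and the stdlib `difflib` ratio stands in here for pylevenshtein:

```python
# Sketch of two holdings-overlap strategies: exact OCLC-number match
# and fuzzy title comparison. Data and thresholds are illustrative.
from difflib import SequenceMatcher

def oclc_overlap(local_oclc_numbers, hathi_oclc_numbers):
    """Exact-match overlap: OCLC numbers held both locally and in HathiTrust."""
    return set(local_oclc_numbers) & set(hathi_oclc_numbers)

def title_matches(local_titles, hathi_titles, threshold=0.9):
    """Fuzzy title-level comparison. This is O(n*m) pairwise, which is
    why this kind of comparison takes a lot of CPU time at scale."""
    matches = []
    for lt in local_titles:
        for ht in hathi_titles:
            if SequenceMatcher(None, lt.lower(), ht.lower()).ratio() >= threshold:
                matches.append((lt, ht))
    return matches

local = {"12345": "Cloud computing for libraries",
         "67890": "Metadata fundamentals"}
hathi = {"12345": "Cloud computing for libraries",
         "11111": "Library systems analysis"}

print(oclc_overlap(local, hathi))  # {'12345'}
print(title_matches(local.values(), hathi.values()))
```

Concatenating more fields into the compared string (title + pub date + publisher, as suggested above) tightens the match toward the expression/manifestation level; blocking on a cheap key first (e.g., publication year) is the usual way to tame the pairwise cost.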
Re: [CODE4LIB] Linux OPAC kiosks
Hi Joshua - Interesting work! I took on a tangential project to implement thin-client OPACs using Linux/GNOME sessions a few years ago with pretty good success, so it is nice to see some new work here. Other than an internal report that says that the project was mostly successful I do not have much that came out of that work, but it was interesting to see that the OPAC users (largely undergraduate students) had no issues with simple tasks (web browsing, document printing) and readily adapted to the Linux/GNOME environment. I had less success with some Linux-based thin clients in more robust word-processing environments, though (it seemed to be an issue with lack of OpenOffice familiarity). We actually tried to conduct a user satisfaction/perception study but found that our students did not even recognize that the environment was different, and as such they had no positive or negative opinions about the platform. Have you gathered any data from users that would show how people react to these types of platforms? Erik On Thu, Nov 29, 2012 at 3:13 PM, Joshua Cowles wrote: > Hi Code4Lib, > > First post here but I've been following the mailing list for a while and > the Journal and planet.code4lib longer. I just posted a write-up (updating > one previously posted to libraryhacker.org) about using WebConverger to > create OPAC kiosks. I'm hoping to 1) share it with anyone who might find > it useful and 2) hear feedback from others who are interested in Linux OPAC > kiosk solutions. I suspect that some of the people/projects I reference > may be on this list as well, so feel free to chime in. There is a disqus > comment area beneath the write-up: > > > http://blog.jcowles.com/post/36823752885/opac-kiosk-stations-dumping-windows-for-linux > > Thanks & I hope to attend the Code4Lib conference for the first time this > year, so I hope to meet some of you in person soon. > > -- > Josh Cowles > Fond du Lac Public Library >
[CODE4LIB] Survey on Uses of Cloud Computing and Virtualization in Libraries
Please forgive cross posting This research study is about cloud computing and virtualization adoption in libraries. It asks questions about the level of adoption and factors that enable or inhibit the use of these technologies in library environments. The survey is open to anyone who works with IT related to libraries (e.g., systems departments, desktop support, campus IT department supporting the library, etc.). Even if your library does not use cloud computing or virtualization technologies your input is still valuable for understanding the landscape of this technology adoption in libraries. To take the survey please follow this link https://uncodum.qualtrics.com/SE/?SID=SV_6mmaLbFa2El3trK Erik Mitchell Assistant Professor College of Information Studies University of Maryland College Park
Re: [CODE4LIB] Survey on Uses of Cloud Computing and Virtualization in Libraries
Hi Jason - Thanks for your feedback. I agree, it is difficult to boil down complex IT systems into a simple matrix but I tried to design the question in a way that would be accessible to the widest population. Please do complete the survey as much as you can and I would love to connect with you to follow up if you are willing. Thanks, Erik On Fri, Aug 26, 2011 at 10:25 AM, Jason Stirnaman wrote: > Hey, Erik. I'd be to happy to complete the survey but I feel I should let you > know that it doesn't jibe with our environment and we're probably not the > only ones. We have a mix of support staffing scenarios. Nearly everything is > virtualized now. In some cases, campus IT spins up a virtual server > specifically for our use and we manage it from there. In other cases, the web > site for example, they manage it completely. In other cases, we have things > hosted in the cloud with varying levels of management responsibility. As I > began the survey I got stuck immediately and question what value and how > accurate my response would be. > It's just not as clear-cut as the survey assumes. > > Regards, > Jason > > > Jason Stirnaman > Biomedical Librarian, Digital Projects > A.R. Dykes Library, University of Kansas Medical Center > jstirna...@kumc.edu > 913-588-7319 > > >>>> On 8/25/2011 at 02:04 PM, in message >>>> , Erik >>>> Mitchell wrote: > > > Please forgive cross posting > > This research study is about cloud computing and virtualization > adoption in libraries. It asks questions about the level of adoption > and factors that enable or inhibit the use of these technologies in > library environments. > > The survey is open to anyone who works with IT related to libraries > (e.g., systems departments, desktop support, campus IT department > supporting the library, etc.). > > Even if your library does not use cloud computing or virtualization > technologies your input is still valuable for understanding the > landscape of this technology adoption in libraries. 
> > To take the survey please follow this link > https://uncodum.qualtrics.com/SE/?SID=SV_6mmaLbFa2El3trK > > Erik Mitchell > Assistant Professor > College of Information Studies > University of Maryland College Park >