I've been so busy of late (more on that later), but I have had the pleasure of liveblogging guest speakers Karen Schneider and Lizanne Payne, visiting us for the 30th anniversary of VALA and CAVAL, with the theme of 30 Years of Looking Ahead.
Karen's talk was entitled Open – looking at how open source has long been a part of the library profession. The late 1800s saw the creation of the ALA. Many more examples followed, which I lost when I mistakenly deleted what I had already typed into ScribeFire and couldn't get it back. Here's where I picked it up again. 1935 – talking book collections established. 1939 – the introduction of book covering enabled libraries to share and market books to the public. 1976 – copyright law in the US. 1977 – library departments began writing their own automation systems, after 100 years of innovation in libraries. 1978 – AACR2.
In the 1980s, libraries slipped into learned helplessness, handing their automated systems over to vendors. In the direction of open, however, GNU was developed in the mid-80s, and Linux development began in the early 1990s.
This set the stage for Evergreen. The Georgia Public Library Service, prompted by the Y2K issue, set out to serve the PINES consortium, which comprised all but a few public libraries in the state of Georgia – 258 libraries altogether. Initially they looked to purchase a system to support the large demands of their consortium, and were happy with their choice of ILS, but after a few years found it was not keeping up with their changing libraries and changing users, in terms of both the size of the consortium and the capabilities of the system.
They decided in 2004 that they would write their own package. They were criticised for it, the critics forgetting that librarians had been innovators for the previous century. It took two years to develop. It's free to use, download and share, and libraries are doing so, some without the support of organisations like Equinox.
It's software written by librarians, for librarians. In 2008-09 it is live at over 300 sites, including some international, covering consortia, single library services, hosted sites, academic, public, special and more.
OSS in real life? The perception is that it is only a last-resort choice, that it is not mature enough, and that the cost is deceptive. OSS lends itself to rapid application development, which is generally true because there are multiple developers out there working towards solutions. OSS is easy to customise, although it's interesting that the customisation requests from libraries are often for things that other libraries would also want.
Partnerships have been developed with third parties in an open environment – so the focus is on the service, not on proprietary code. OSS has interoperability, allowing other modules and software to be added, because the package has been developed on open standards. OSS is generally not great on documentation – it takes a back seat because the developers know the software and forget that other people need it. This has been a problem for Evergreen, but one that is being resolved now, with a dedicated team of people writing the documentation as we speak.
Gift economy – the development group has been small, with a very limited pool of library software developers, but as more libraries come on board, this group is growing.
When libraries handed over the reins of automation to vendors, we removed ourselves from the design of such systems. We bought the packages and then grumbled about it. Librarians have great ideas for their ILSs, but those ideas rarely come to fruition in those same ILSs.
The best way for librarians to find ways to improve their ILSs is to use them prolifically as users, not as staff members. That way you can truly have the library experience, whilst keeping an eye on how it can improve.
I was intrigued by the biometric lending option utilised by one small public library in Georgia – for those people who consistently forget their library cards.
After afternoon tea, Lizanne Payne spoke on the Future of Library Collections: access and stewardship in a networked world. Lizanne is the Executive Director of the Washington Research Library Consortium.
Until about 40 years ago, libraries were local centres of learning, where the aim was to gather as many resources together in one place as possible. We still attribute higher value to libraries with greater numbers of volumes, even though our value now goes beyond this.
In the 60s, our resources were accessible through the joys of the oak-drawer-encased card catalogue. In the 80s, the online catalogue meant that you could at least find out what resources were available in your library before physically entering it. Now our resources are electronic, available anywhere, anytime, but we also remain custodians of our physical collections. Lizanne believes that libraries are becoming more global and that we are within 10 years of being system-wide repositories.
Trend: libraries as place – they are for people. They are moving from places to house books to places to host people. New spaces are for users, not for books. Print holdings are moved to less accessible parts of the buildings and the focus is on the user and the electronic.
Trend: electronic journals and books are viable alternatives. The vast majority are available in electronic formats.
Trend: campus attitudes towards libraries are changing. Only 10% of users start their research at the library building. Only 25% start with the library catalogue. Of faculty, 50% viewed the library gateway function as very important – for librarians it's 90%. 🙂
With campus space increasingly valuable, the justification for unused print resources taking up that space is being questioned. There are 35 million volumes in Australian academic libraries at present (OCLC statistics). The numbers are not declining, as titles are still being bought and need to be stored. Space is being reclaimed in the main library by using high-density facilities, usually offsite, to manage the library's less-used resources.
Harvard model storage facility – volumes stored by size for maximum density, holding up to 2 million volumes per 4,000 sq m; cherry picker for retrieval; usually offsite; scheduled delivery; construction cost per volume of approx. USD 3. Typical retrieval is 1-3% of the collection per year.
Automated storage and retrieval system – volumes stored in metal bins, retrieved by a robotic mechanism; can hold over 1 million volumes per building module; built on campus; delivery in minutes; construction cost per volume approx. USD 10. Some libraries are putting a huge proportion of their collection in such a facility, as retrieval aims to be as quick as if the user had gone to the shelves themselves.
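The trade-off between the two models comes down to simple arithmetic on the figures above. A minimal sketch (the collection size is a hypothetical example; only the per-volume costs and the Harvard-model density come from the talk):

```python
def construction_cost(volumes: int, cost_per_volume_usd: float) -> float:
    """Total construction cost for storing a given number of volumes."""
    return volumes * cost_per_volume_usd

# Hypothetical collection of one million volumes to be stored offsite.
collection = 1_000_000

harvard = construction_cost(collection, 3.0)   # Harvard model: ~USD 3/volume
asrs = construction_cost(collection, 10.0)     # ASRS: ~USD 10/volume

print(f"Harvard model: USD {harvard:,.0f}")    # USD 3,000,000
print(f"ASRS:          USD {asrs:,.0f}")       # USD 10,000,000

# Density of the Harvard model: 2 million volumes per 4,000 sq m.
volumes_per_sq_m = 2_000_000 / 4_000
print(f"Harvard density: {volumes_per_sq_m:.0f} volumes per sq m")  # 500
```

At roughly three times the construction cost per volume, the ASRS buys delivery in minutes rather than on a schedule, which is why it suits high-use campus collections rather than deep storage.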
Shared storage models: shared secondary storage for multiple library services with no collection sharing – separate spaces within the same building. Shared or last-copy storage – where ownership transfers to the consortium when an item is put into storage.
Print journal archiving: prospective archiving is where the print edition of an electronic subscription is sent to storage for archiving. Digitised dark archives – print editions are kept available for rescanning in case something happens to the digital archive.
Bright Archive is a consortium of Australian university libraries working on an agreement to share and archive resources.
UK Research Reserve – aims to safeguard the long-term future of printed research journals. Users can access a copy at the British Library, with two other libraries holding backup copies – further copies can be withdrawn. It received central funding for deduplication and has continuing funding for deduplication and to develop systems. Project goal: 100 km of freed shelf space across academic libraries in the UK.
Mass digitisation will have a profound effect on how we retain print copies locally. Google Book Search is mass scanning from major libraries without selection: in-copyright works are shown as snippets, full images for out-of-copyright works; there is a new feature of library subscriptions to full text (which may be free for public libraries); MARC records are going into an OCLC API for one-off search; millions of books have been scanned, although the exact number is not known. There is also the Open Content Alliance, scanning out-of-copyright titles, and the Hathi Trust, aiming to develop a long-term digitisation program to protect these materials in case the other projects break down and disappear.
Local scanning to build e-book collections is being used by Emory University. A machine scans the book, and the titles are then available for purchase from Amazon. CAVAL's Carm Centre is doing something similar.
Networked print-on-demand printers have become small enough and financially viable for some libraries to take on. The University of Michigan prints from Google Books, the Open Content Alliance and other digital books.
Evolving library ecosystem – electronic content will become even more ubiquitous, resulting in print repositories serving the greater network – holding non-common, unique titles, bound journals and the like.
Our focus over the next decade will be finding the balance between retaining items centrally and locally, and how we manage our collections within and between library services.
Moving to a planned redundancy model – we need to plan for a certain number of copies to be kept by libraries, so that with the pressure on libraries, we don't reach a point where there are no copies left. The Yano study determined the minimum number should be 13.
The access and stewardship model for the future flips the old arrangement: "just in time" becomes the norm and "just in case" becomes the backup.
However, dealing with the politics, systemic needs, local needs, administration and more will be the biggest challenges into the future. Lizanne doesn't have the answers to these issues, but putting them out there for discussion is a good first step.