Back to the moon - eLib and the future of the library.
19th January 1996, and founding editor John Kirriemuir is about to hit “publish” on the first edition of Ariadne magazine. In a bunker somewhere in the East Midlands, Jon Knight waits with trepidation to see what the Ariadne editorial process will make of the first in what would prove to be a long-running series of From the Trenches articles, little realising that twenty years later he would himself be a member of the editorial cabal...
Follett FIGITs
Wind the clock back a little further, to 1993, and the Follett Report. Follett recommends what amounts to a “moonshot” for UK university libraries. It was already clear that technological change would have a profound effect on teaching, learning and research, and that libraries were poised to play a pivotal role in this - or would simply fade away into obsolescence. Follett states that:
The exploitation of IT is essential to create the effective library service of the future.
In 1995 the wonderfully acronymed Follett Implementation Group on IT (FIGIT) was established, and it created the UK electronic libraries programme, or eLib for short. eLib was chartered to pursue a programme of radical innovation that included:
...development of standards, pilot projects to demonstrate the potential of on-demand publishing and electronic document and article delivery, a feasibility project to promote the development of electronic journals in conjunction with relevant publishing interests, the development of a database and dataset strategy, investment in navigational tools, retrospective conversion of certain catalogues, and investment in the further development of library automation and management systems.
eLib ran from 1995 to 2001, and chartered around 70 digital library R&D projects - truly letting “a thousand flowers bloom”. I was lucky enough to play my own small part in the eLib programme, co-authoring the software underpinning the eLib subject gateway services. The subject gateways provided a curated list of quality resources in the Wild West that was the (pre-Google) early web. I described this period and my work on and around eLib in a 2011 blog post: Back to the future - resource discovery, revisited.
Life after eLib
Many of the themes that eLib supported have taken on a life of their own in the intervening years. For example, take a look at the Open Journal and CogPrints projects from Wendy Hall, Les Carr and Stevan Harnad at the University of Southampton, which neatly map out much of the open access, institutional repository and open journal space that is rapidly becoming the norm for researchers worldwide. Like CogPrints, Thomas Krichel’s Working Papers in Economics (WoPEc) project at the University of Surrey lives on as part of Research Papers in Economics (RePEc). I am particularly fond of RePEc, which is built on the metadata format that we used in the long-defunct eLib ROADS project.
And at Jisc we took forward a number of the eLib projects as national services - like Netskills and DIGIMAP, and the subject gateways were ultimately brought together as the Resource Discovery Network, which became part of the Intute portal. But in many ways the most significant part of eLib for librarians might actually have been the Ariadne magazine, which rapidly became the place to look for the latest ideas and advice on harnessing the potential of technology. This issue of Ariadne, number 75, comes out on the magazine’s 20th anniversary.
Back to the moon
Looking back at the eLib initiative now, it’s clear that in many cases we were years if not decades ahead of our time. For example, it is only very recently that open access journals and open research data have come to be regarded as standard operating practice for publicly funded research - with the RCUK Policy on Open Access and equivalent mandates from funders such as the Wellcome Trust and the Bill and Melinda Gates Foundation.
However, we also failed to anticipate the extent to which digital systems and services would come to be consolidated under the auspices of a few large players - from the Stacks, as Bruce Sterling calls the major Internet firms, to a handful of conglomerates that dominate academic publishing, much like the zaibatsus of William Gibson’s Neuromancer. Perhaps this tension between the Stacks (e.g. cloud providers and publishers) and the individual or institution will come to be the defining characteristic of the next twenty years.
A new moonshot?
We don’t have a “new Follett report” - perhaps the Nurse Report is the closest we get to one, and its recommendations are about efficiency and consolidation more than they are about radical innovation. But that shouldn’t stop us from thinking about the future. What ideas would seem radical now, but could easily be standard operating practice by 2036?
Google’s Moonshot Summit in the summer of 2016 brought educators and technology innovators together from around the world to consider the future of technology-enhanced learning. Let’s now ask ourselves: what would a Moonshot for Libraries look like? A “new Follett” for the twenty-tens and beyond.
Not only are public libraries closing at a rate of knots, but it’s also becoming increasingly difficult to train as a librarian due to library school closures. This seems paradoxical given that we are also told that 90% of the world’s data was generated in the last couple of years. I believe the answer is that we actually need librarians more than ever, to help us find and make sense of all that information. And in academia there is a particular need for librarians to help open up research outputs and teaching and learning materials to ensure they are reused, revised, remixed and redistributed. The Content Mine, from Peter Murray-Rust and his collaborators, offers us a glimpse of the transformational potential of open science.
Plan B - the offsite backup
It sometimes takes an outrageous or plainly ridiculous idea to spark a conversation that leads to real insights. In that spirit I will share an idea with you today in this article - right now the sum total of human knowledge is only to be found on the Earth, but the Earth is fragile and our tenancy upon it far from assured. Do we not owe it to future human generations, or future interplanetary explorers who discover Earth’s smouldering remains, to have an offsite backup?
Just a few years ago this would have been completely preposterous, but consider that an offline copy of Wikipedia (I know, but we have to start somewhere!) is a mere 12GB in size - easily copied onto a thumbnail-sized microSD card. I’d like to think that Tim Peake took a copy up to the International Space Station as part of the Astro Pi project, but sadly I think that opportunity was missed. Now if that microSD card was a nice big 128GB one, there would still be room left for 50,000 ebooks from Project Gutenberg. Whilst an off-planet outpost holding the archive is quite appealing (if only because I am picturing “Moonbase Jisc” at this point), the truth is that we could easily send all that information out with every space probe that Earth dispatches - like a latter-day version of the Voyager Golden Record. We could even bury it deep below the surface of the Moon as part of the Lunar Mission One time capsule.
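The arithmetic behind that claim is easy to sanity-check. A quick back-of-envelope sketch (the 12GB Wikipedia figure is from above; the average ebook size is my own assumption, since Project Gutenberg texts vary):

```python
# Back-of-envelope check: how many ebooks fit alongside an
# offline Wikipedia on a 128GB microSD card?
# AVG_EBOOK_MB is an illustrative assumption, not a measured value.
CARD_GB = 128
WIKIPEDIA_GB = 12     # offline copy of Wikipedia, as cited above
AVG_EBOOK_MB = 2      # assumed average size of a Project Gutenberg ebook

remaining_gb = CARD_GB - WIKIPEDIA_GB
ebooks_that_fit = (remaining_gb * 1024) // AVG_EBOOK_MB

print(f"Space left after Wikipedia: {remaining_gb} GB")
print(f"Ebooks that fit at {AVG_EBOOK_MB} MB each: {ebooks_that_fit:,}")
```

At roughly 2MB per book that leaves room for well over 50,000 ebooks, so the figure above is if anything conservative.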
Of course the sum total of human knowledge is rather more than this, and some curation effort might be required to decide which of the 459 billion web pages indexed by archive.org are worth keeping a backup copy of. And that’s before we’ve started to look at cat videos and human DNA. The serious point here is that we tend to think of the scientific record and educational materials as big data, but they are a drop in the ocean compared with the ordinary person’s “digital exhaust” of photos and videos in an era of smartphones and tablets. That offsite (off-planet) backup of humanity’s key insights might not be such an impossible or implausible idea after all, and making it might teach us a lot about what is truly important.
So that’s my moonshot - what’s yours?