Editorial Introduction to Issue 56: More Light Than Heat
I am greatly indebted to Gráinne Conole for a number of reasons. It has been my intention for some time to commission something from the OU in respect of learning technologies given the wealth of expertise that resides there. For a variety of reasons it has taken me a while, but the wait has been more than worthwhile in the light of Gráinne's contribution. In my view her article New Schemas for Mapping Pedagogies and Technologies does much to exchange light for all the ambient heat that surrounds this topic, while refusing, as some are tempted, to reject the whole Web 2.0 development as 'blether'. Instead Gráinne provides a reasoned and structured path through this topic which will give considerable food for thought to her fellow practitioners.

In this balanced article she freely acknowledges the tensions that exist 'between the rhetoric of Web 2.0 and current educational practices.' She also describes in this context some of the chief characteristics of Web 2.0 technologies and how they can place new users at variance with current educational thinking. She goes on to point out that the traditional educational and knowledge model has changed, while Web 2.0 in effect helps resolve the difficulty: '... It is no longer possible for any individual to be an expert in their field, .... Web 2.0 by its nature copes seamlessly with a complex and changing knowledge domain; fundamental to Web 2.0 practice is that no one individual is expert, rather they are part of a social network of others; the power of the Web enables knowledge to be co-constructed and hence continually change as needed.' Yet despite these tensions and changes Gráinne points to the fact that, 'there has never been a closer alignment between the current practices of Web 2.0 technologies and what is put forward as good pedagogy ...'.
Thus, to my own considerable relief, she turns the all too frequent approach on its head and shows how the primary educational need can shape, and make use of, the benefits these new technologies confer.
Before anyone 'detects' a themed issue, I should point out that not even two swallows make a summer; consequently the contributions from Marieke Guy and the LibrarianInBlack, Sarah Houghton-Jan in this Summer Issue can only form a 'tendency', namely looking at the information professional's 'lot', happy or otherwise. In this regard, if Marieke addresses the notion of 'where' in her contribution then Sarah most decidedly delves into an increasingly important aspect of 'what'.
Sarah Houghton-Jan's article Being Wired or Being Tired: 10 Ways to Cope with Information Overload addresses a major concern these days, wired or otherwise. Indeed, she informs us that anxiety about an excess of information dates back centuries. Nonetheless, of the old tyrannical triumvirate of the telephone, photocopier and computer, it is the latter which has now subsumed the roles of its co-conspirators and bombards us with email, IMs, VoIP calls et al., ad nauseam (to invoke the Latin, appropriately enough). While Sarah orientates this contribution towards the role of librarians and other information professionals, I suspect many other office-based workers will benefit somewhere from her ten techniques to manage the overload. Not all are either revolutionary or essential to all. Indeed some points are arguably centred on 'mere' good administrative practice. Others might complain that her suggested 'traffic survey' of all 'incoming' only adds to their workload. The latter complaint would appear a little unfair: in order to solve any problem one has to gauge the degree to which it exists - and few would claim they suffer no information overload. Indeed SHJ offers some very pragmatic advice on dealing with floods of email and RSS feeds as well as more recent 'carriers'. Her axe extends to print and multimedia; she advises us to be ruthless for our own sakes. If information overload is the disease, Sarah may very well have a cure. Moreover, if such ministrations to the hard-pressed professional fail to help, there is always Lina Coelho's review of What's the Alternative?.
What particularly commends itself in Marieke Guy's article is the approach she adopts throughout. This is neither a pitch on behalf of the workforce to stay at home and compute from the duvet, nor is it an opportunity for employers to air the suspicions they may have about remote working. A Desk Too Far?: The Case for Remote Working takes an even-handed approach to the subject and is far from starry-eyed about the prospect of remote working. Indeed Marieke's research makes it clear that remote working is neither feasible nor appropriate for everyone. Such are the personal and organisational issues involved that we have agreed to defer discussion of the technical aspects of remote working to a further article. Given the widely accepted view that synchronous and asynchronous communications are all very well, but that what some organisations need is more 'face-to-face' contact, it might seem perverse to advocate remote working. However, recent developments have not all been technical; the way in which practitioners now work has also altered considerably. Indeed, in one respect, but an important one, it can be argued that this style of working has come to the rescue of a system of employment that makes its own life ever more difficult with the advent of shorter and shorter contracts and decreasing tenure. It is reasonable to contend that such a trend is inevitable in the continuous attempt to make the funding cake go further. The counter-argument goes that as this trend strengthens, an increasing proportion of, say, a researcher's contract is devoted to looking for the next post. Remote working can in these circumstances sustain both recruitment and retention, since it can provide a 'silver bullet' solution to the problems of filling a post - and keeping it filled.
In the last of three contributions emanating from the JISC-funded Version Identification Framework (VIF) this year, Jenny Brace points out in Versioning in Repositories: Implementing Best Practice that, while a recognised problem, versioning has not had a great deal of time or effort devoted to it as yet. She produces statistics from a VIF survey conducted among researchers in autumn 2007. Yet developments in institutional repositories have only served to exacerbate the problem and Jenny seeks to provide some clarification amidst the confusion, as well as some pointers to good practice for all concerned. In the framework she proposes, she emphasises the importance of answering the needs of individual repository managers. The individual approach nonetheless comes underpinned by some globally applicable key principles which will serve to increase end-users' confidence in their repository.
In The Networked Library Service Layer: Sharing Data for More Effective Management and Cooperation Janifer Gatenby offers criteria for determining which data in various library systems could be more beneficially shared and managed at a network level. Janifer provides us with the background story of the development of integrated library systems over the years, including the times when they no longer merited the term 'integrated'. This occurred during the most significant changes in the period after 1998 with the Internet, the World Wide Web and the growth of digital publishing eliciting new systems which took us outside the single library location in terms of resources and ultimately services. With the advent of many different types of data, Janifer contends that we need to review 'the architecture of the ILS for smoother integration into the current environment' and also proffers examples of how the new varieties of data can be of service to libraries and users alike. While she maintains that it is important for libraries to own and control their data resources and manage their access beneficially, she is less convinced that libraries need to be similarly in control of the management software.
In his Get Tooled Up article about Integrating Journal Back Files into an Existing Electronic Environment, Jason Cooper provides a well-structured and step-by-step analysis of the approach adopted by Systems Staff at Loughborough University Library. In his exposition, Jason provides helpful background on the existing electronic environment, what the team wanted to achieve in terms of users' experience and the method elected. His description then takes us through the various elements of the implementation and how the team saved time and effort by integrating the back files into its existing electronic set-up.
In his column on search engines, Phil Bradley freely admits he sees Google Still Growing and even ignores his own restraining order on coverage of Google, since he is quite impressed with some of its latest developments. But only some. He warns us not to expect that everything mooted in Google's experimentation space will necessarily come to fruition, though it is possible to see what is progressing in Google Labs, for example. Where Google most decidedly draws Phil's fire is in respect of Google Librarian which, as he points out, had been enjoying a very lengthy summer break only to give way, in the end, to a newsletter for librarians. Phil finds it hard to mask his scepticism. He is far more impressed by Google's move to index what is variously known as the dark, hidden or invisible Web: basically, inaccessible databases. If this works, it will have a significant effect on usage of the Web, though, he warns, it will not come without its problems. Phil covers a variety of other developments, of which the new Google Maps features interested me most.
In Lost in the JISC Information Environment Tony Ross offers his personal view of the contribution the JISC Information Environment architecture has made to development, i.e. from the perspective of someone working on tools for resource discovery. In his high-level view of the design of the JISC IE, he seeks to highlight to what degree the IE, as he terms it, 'has existence'. In his overview he points to what he perceives as a gap between what the conceptual model covers and what might be termed as feasible, and between the prescriptive and the descriptive.
I am also indebted to Emma Tonkin for her contribution on Persistent Identifiers: Considering the Options in which she looks at the current landscape of persistent identifiers, describes several current services, and examines the theoretical background behind their structure and use. For readers coming to persistent identifiers for the first time she offers a working notion of what they do and why they are so useful, including the factors that have driven their design. Emma warmly acknowledges John A. Kunze's feedback on these standards; I well recall John's passionate and engaging address on Supporting Persistent Citation at a UKOLN seminar in December 2006. Of particular interest to me in Emma's article is the technical vulnerability provoked by a human tendency to impose non-technical considerations, such as marketing, that can defeat persistence - and for what seem very trivial gains.
As usual, we offer our At the Event section, as well as reviews on the visualisation of data, the effects of Web 2.0 on delivering information literacy, blended learning and online tutoring, practical advice on Web Accessibility and, of undoubted interest to many worried practitioners, a review of Keeping Within the Law - the book and the online service. In addition of course we provide our usual news and events.
I hope you will all find much of interest in Issue 56.