Infopolecon
Jeremy Bentham coined the term 'panopticon' in his proposal for a circular prison whose cells were exposed to a central well in which the warders were located, allowing the warders to observe every prisoner at any moment while the prisoners never knew when they were being watched. Bentham also wrote on political economy and promoted the utilitarian principle of the greatest good for the greatest number. Drawing on his terminology as the basis for this article, I therefore propose the new term 'infopolecon' to describe the political economy of information.
I start with a problem. What do I mean by the term 'information'? The word is used to describe the objects on which data is coded and structured, the codes and structures themselves, the resources built from them, and the process by which users draw on those resources. In itself it is now so general as to be of no use to us at all. I want to limit discussion to a specific class of problem - information systems design. By looking at the design, we can formulate an information practice which reveals a political economy.
Let us take an undergraduate. Any undergraduate will do. There are about a million of them in Britain at the moment. Each undergraduate undertakes a module in which there is an assignment. They do about ten of these in a year. An assignment starts with a problem definition which defines a search space, an information deficiency, which has to be satisfied by the writing of a document. This is called learning. They perform this process within an infrastructure called the world of learning. Their teachers populate the search space with documents which they themselves have written, their careers patterned by the relative influence or frequency of these documents.
All of these documents now are electronic, virtual, and digitised at beginning and at end, though they might go through various transformations, including appearing as ink squeezed into dead trees. Some of the documents are understood to have a beginning and an end, some to be in collections assembled according to rules, known or unknown. Some are called books, some journals. Some are called libraries. Others are files, and others, URLs.
Part of what the undergraduate has to learn is how to navigate this search space. But where are the maps to this world of learning? Where the blueprints? At a more primitive layer still, what is the physics, the chemistry and the biology of this world? What are its atoms and its particles and its organisms? How does our undergraduate understand it?
Then, let us take a university. Any university will do. There are about a hundred of them in Britain. They cost about ten billion pounds a year to run, and they hold all together about a million undergraduates, each producing around ten documents a year. This is the organisational form which the world of learning takes. In each university there are teachers, who structure courses, set and mark assignments, write teaching materials, produce papers, referee one another's papers and edit journals in which these papers are published.
In any one university, or in the world of learning as a whole, how are these papers organised? And for the production of one paper, as a process, how is it systematised? By re-engineering the process of systematisation, can the unit cost per activity be reduced?
But the product is not the physical form, the paper - it is the activity, the learning, of which the written assignment is only a representation. The organisational form these documents coagulate around either hinders or facilitates learning. The question is which? And if we can tell which, can we then improve the process?
Three elements are missing from the world of learning modelled so far. The first is the publishers. Some were established by the universities and are owned by them; some were speculative enterprises intended to generate a profit for their owners. They risk capital in order to produce a document, a print run, and they recover that outlay through sales and revenue. When hot metal presses, compositors and galleys were involved, the capital and the risk were high. Now there is none.
The second is the secondary publisher, the publisher of 'information' about what publishers have published.
The third element is the library. In the university this was from the beginning a collection of documents shared by the university as a whole, rather than the collection which belonged to an individual teacher or student. It produced its own type of professionals who knew how to organise it as it grew. And in some sense they could organise a community of libraries in which the system as a whole was greater than the sum of its parts. The libraries produced catalogues, so that the reader could know not only that a document existed, but where it was.
Yet libraries, which provide the publications of primary and secondary publishers, still do not give learners all they need. What is missing is aboutness, for both reader and document. What is the reader about? What do they need to know? And what is the document about? How can it satisfy the information deficiency? And where to look, since the virtualised, digitised electronic collection of all documents which now constitutes the world of learning has not the concrete form or appearance of a library, a catalogue, a shelf and a book? So it is not surprising that our undergraduate might be disoriented.
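By way of illustration only, and not as a proposal: the crudest mechanical reading of 'aboutness' would treat the reader's stated need and each document as nothing more than bags of terms, and rank documents by how much those bags overlap. The sketch below (in Python; the function names and sample records are entirely hypothetical) does exactly that, and its very thinness suggests how much more than term overlap the learner actually needs.

    # A deliberately crude, hypothetical sketch of 'aboutness' as term overlap.
    import math
    from collections import Counter

    def profile(text):
        """Reduce a piece of text to a crude 'aboutness' profile: term counts."""
        return Counter(text.lower().split())

    def similarity(need, doc):
        """Cosine similarity between two term profiles; 0.0 means nothing shared."""
        shared = set(need) & set(doc)
        dot = sum(need[t] * doc[t] for t in shared)
        norm = math.sqrt(sum(v * v for v in need.values())) * \
               math.sqrt(sum(v * v for v in doc.values()))
        return dot / norm if norm else 0.0

    # Hypothetical assignment brief and two hypothetical catalogue descriptions.
    brief = profile("the political economy of information systems in higher education")
    records = {
        "record A": "information systems design for university libraries",
        "record B": "hot metal presses, compositors and galleys in printing history",
    }
    for name, description in records.items():
        print(name, round(similarity(brief, profile(description)), 2))

Such a calculation can say which description shares more words with the brief, but it says nothing about what either is about in any sense a learner could use, which is the gap at issue here.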
How have the denizens of this world of learning responded to this virtualisation? How are we constructing this new virtual reality so that learning might occur?
Well we, the teachers, are carrying on much as before, writing papers, producing courses and marking assignments. Precious little has changed on that front, which is surprising. What has changed, though, is the number of students per teacher, which means that the proportion of the teacher's time given to each student has fallen. In fact, such is the massification that individually students hardly exist at all.
How has the library changed? Surprisingly little too. It still spends most of its budget on books and journals which are stored on shelves.
Nor have publishers changed in any significant way. They are still printing books, journals and catalogues. The prices of student textbooks have fallen, the quantity of print increased. Colour has been added with falling prices. New media have arrived, though there are few truly multimedia publications.
The change is in the network.
When computers were first built they were big and expensive - though in their time so too were books. But where books were owned by scholars and universities, paid for out of their own funds almost from the beginning in this country, computers were paid for by a central organisation of the state. As computers were linked together, networking was provided by top-slicing the budget of the universities of Britain as a whole. A most uncharacteristic British thing. And the computers had to intercommunicate, which meant standardisation. This was driven by the Computer Board, which allocated funding it had bid for from the Treasury.
The internetworking was enabled by a suite of protocols known as the Joint Academic Network Coloured Books, onto which was layered a transition to the Open Systems Interconnection strategy required by European Commission directive. Meanwhile the US had ARPANET and TCP/IP. The Computer Board subsequently became the Joint Information Systems Committee (JISC), the club of Directors of Computer Centres called itself the Universities and Colleges Information Systems Association (UCISA), responsibility for running the Joint Academic Network passed to the United Kingdom Education and Research Networking Association (UKERNA), and X.25 gave way to TCP/IP. But the teachers have not been involved in any of this at all.
Since academic networking began, there has been a tension between the running of the network itself, and what it was for. My experience of trying to raise issues in regional and national JANET user groups is that there is an endless fracture between conduit and content.
The Follett report, which started from libraries and the impact of information technology, rather than from questions of what higher education is for and how it should be delivered, was disappointing in its narrowness and lack of imagination (though I must declare an interest as I had been commissioned to write a paper on the information economy). The development of BIDS, NISS and then the deal with ISI were all precursors of what was to come. At every stage expediency was the excuse for short-term fixes followed by institutionalisation of the result.
And from this has grown the JISC datasets policy, the eLib programme and the whole horror show with which we are now confronted. For, whatever the infelicities of the actual deals with the dataset providers, the added complexities of both nationally-provided services and local organisation (where power struggles are now common between the managers of computing and library services) mean that today's undergraduate has very little grasp of how to perform an information search or answer a question.
The problem is not confined to higher education. The British Library recently produced a gimmicky interface to its Catalogue of Printed Books, while it still has no plan for the digitisation of its vast collection.
Meanwhile TCP/IP with HTML begat Mosaic, Mosaic begat Netscape, and Netscape with Z39.50 produced Common Gateway Interfaces (CGIs). But the nationally-provided services were slow to take up the opportunities offered by the Internet and so each university implemented library-based solutions in different ways, resulting in the worst of a centrally-organised top-sliced resource, and a locally-provided idiosyncratic one.
Our predicament is one which I have called elsewhere 'Mrs Thatcher's handbag modem'. We are the victims of the state's taxing and top-slicing, and our local institutions' multiplicities of management layers. The state brought us 'uk.ac'. Local organisation brought in a proliferation of CD-ROMs, databases, licence agreements, passwords and log-on procedures.
In large part the fault must lie with the Librarians of the universities who have had for many years an organisation, SCONUL, which would have been capable of exercising considerable muscle against the publishers.
But in larger part the fault must lie with the funding agency which has the power of the control of the finances of the system as a whole. The absurdity of the Research Assessment Exercise (RAE) is clear to most of us. But imagine if the RAE input had been URLed, as I suggested in a paper to the JNUG in 1994. There would then be a major research database as an output of what otherwise has been a huge waste of resource.
The fault must also lie with the teachers and professional associations which have played little part in the process at all. There has been a failure of analysis, of engineering and of theory.
Returning to our world of learning, where do we go from here with these ten million design exercises? I think it is still possible for higher education to form a power block to force through the implications of the new information and communication technologies. It is possible for the one hundred universities to create better learning environments. In the meantime information systems designers have been given a rich laboratory of examples of what not to do and how not to do it.
What we have created, then, is an infopolecon in which in every cell the learner can see everything and everywhere, but has no structure to understand what any of it means. As virtualisation and digitisation accelerate, the need increases for concrete codes, structures and notations. These have to be taught. This is where we must start.
Response from Jon Knight to this article
Let's play spot the inaccuracies and confused bits, shall we? Here goes:
> All of these documents now are electronic, virtual, and digitised at
> beginning and at end, though they might go through various
> transformations, including appearing as ink squeezed into dead trees.
Not all documents are "electronic, virtual and digitised at beginning and at end". There's still an awful lot of stuff that only exists on dead trees (my publishing chums make no bones about still preferring cut'n'paste of real photos to manipulating image files). So chalk that one up as factual inaccuracy #1.
> Part of what the undergraduate has to learn is how to navigate this
> search space. But where are the maps to this world of learning? Where
> the blueprints? At a more primitive layer still, what is the physics,
> the chemistry and the biology of this world? What are its atoms and its
> particles and its organisms? How does our undergraduate understand it?
What's this bit about chemistry, physics and biology all about? And are there any answers to these in the paper? Or even attempted answers? Because if there are I missed them. Towards the end the article bad-mouths JISC, Elib, Librarians, teachers, professional societies and, erm, most everybody else in HE (except the poor students thankfully) but it then just says (paraphrased) "oooh, we could do good stuff using IT in HE". Maybe I missed the thread somewhere along the way?
> Yet libraries, which provide the publications of primary and secondary
> publishers, still do not give learners all they need. What is missing is
> aboutness, for both reader and document. What is the reader about? What
> do they need to know?
Often the readers don't really know this themselves, so knowledge elicitation for the information system could be a tad tricky.
> And what is the document about? How can it satisfy
> the information deficiency? And where to look, since the virtualised,
> digitised electronic collection of all documents which now constitutes
> the world of learning has not the concrete form or appearance of a
> library, a catalogue, a shelf and a book? So it is not surprising that
> our undergraduate might be disoriented.
Erm, isn't that what OPACs, robot-generated indexes, quality-assessing SBIGs and all the other fun stuff we're all working on is about? And are undergrads really that disoriented? The undergrads I come into contact with don't seem too disoriented and indeed many are keen to use IT (which is more than can be said for library services as a whole; many undergrads seem scared of libraries and librarians, much of which I put down to bad/non-existent experiences in school/public libraries before entering HE).
> Well we, the teachers, are carrying on much as before, writing papers,
> producing courses and marking assignments. Precious little has changed
> on that front, which is surprising.
Hmmm, what about the staff that are making their lecture notes and reading lists available over the net? What about teaching staff that are generating a whole new set of course material and teaching aids based on web technologies in order to support distance learning (as well as enriching the learning experience for students at the institution)? What about staff and students who use personal email, mailing lists and USENET groups for discussing coursework and related topics? Factual inaccuracy #2.
> How has the library changed? Surprisingly little too. It still spends
> most of its budget on books and journals which are stored on shelves.
Which is usually what the departments have asked for. Lots of stuff is still only available on bits of dead tree unfortunately.
> Nor have publishers changed in any significant way. They are still
> printing books, journals and catalogues.
This is probably because the old publishers are still running a bit scared; don't forget that the web is still relatively new technology and they're still prodding it to work out how (or if) they can make money from it. At the same time millions of new publishers have appeared, putting their papers, books and programs on the net for anyone to pick up freely. So publishing has changed.
> The prices of student textbooks
> have fallen, the quantity of print increased.
They have? I must have missed the bargain bucket in Waterstones or Dillons then...
> When computers were first built they were big and expensive - though in
> their time so too were books. But where books were owned by scholars and
> universities, paid for out of their own funds almost from the beginning
> in this country, computers were paid for by a central organisation of
> the state.
Hmm, this should carry a "broad generalisation" warning. For one thing, lots of early machines were paid for by the State for military reasons and still others were funded out of departmental or personal coffers. However, by the late 50s/early 60s, when many of the old Universities were contemplating building or buying their first machines, lots of things were being centrally controlled and purchased by the State. Of course the Universities were taking more and more students that would previously not have been able to afford to go into HE, so it's swings and roundabouts.
> Since academic networking began, there has been a tension between the
> running of the network itself, and what it was for. My experience of
> trying to raise issues in regional and national JANET user groups is
> that there is an endless fracture between conduit and content.
That's true, but that's because running the network was (and in some cases still is) in and of itself a research issue. From my point of view, the tension is between network research and running a production network. The production network is there to support other, non-network research and, recently, teaching.
> The Follett report, which started from libraries and the impact of
> information technology, rather than from questions of what higher
> education is for and how it should be delivered, was disappointing in
> its narrowness and lack of imagination
Really? I would have said that the report was something that was desperately needed; until it was written there was no national programme specially for electronic library development projects. Sure, BLRDD did the best they could but they just didn't have the financial clout that the Follett Report generated.
> And from this has grown the JISC datasets policy, the eLib programme and
> the whole horror show with which we are now confronted.
Do I take it that Mr Lindsay isn't too chuffed with centrally funded datasets and a national programme looking at how we can best use IT in libraries? Without BIDS and EDINA every institution would still be negotiating its own contracts with the dataset providers. Without Elib we'd be stuck with a few local projects, a few BLRDD/Research Council funded projects, and with making do with products designed for commercial desktops rather than academic libraries (we still have lots of the latter, but at least Elib is providing a few rays of community-developed hope). There may well have been problems with both, but at least they're there.
> The problem is not confined to higher education. The British Library
> recently produced a gimmicky interface to its Catalogue of Printed
> Books, while it still has no plan for the digitisation of its vast
> collection.
Hmm, if "gimmicky interface" is a webopac (or similar) then the development effort is likely to be _considerably_ less than "digitisation of it's vast collection". I'd rather have some decent services now rather than wait for the Perfect Digital Library(tm). Short term think _is_ network thinking because on the web real time is compressed into ever shorter "web years". What's the cool, in, happening technology today is commonplace next month, old hat in six months and obsolete in a year. I hope that the BL are carefully weighing the best approach to digitisation whilst treading carefully through this technological minefield (and dealing with copyright issues, etc). Good job that Elib is undertaking some small scale digitisation projects (such as the ACORN project here at Loughborough) which will test the water and highlight some of the problems that the BL can expect.
> Meanwhile TCP/IP with HTML begat Mosaic, Mosaic begat Netscape, and
> Netscape with Z39.50 produced Common Gateway Interfaces (CGIs).
Excuse me? "Netscape with Z39.50 produced Common Gateway Interfaces (CGIs)"? I don't think so. CGI was developed by a group of different HTTP server authors led by Rob McCool at NCSA. This was way before Netscape was a gleam in Jim's eye. CGI has little to do with Z39.50 (aside from Isite's Z39.50 CGI gateway) and Netscape has even less to do with Z39.50 (try typing a z39.50r URL into your Netscape browser. Does it work? I doubt it). Major fact deficiency here, methinks; factual inaccuracy #3.
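For anyone not familiar with the mechanism under discussion: CGI is just a convention by which a web server hands the details of a request to an external program (largely through environment variables) and relays whatever that program writes to standard output back to the browser. A minimal, purely illustrative sketch in Python (the script name and the 'q' parameter are hypothetical, and this is not code from any system mentioned above):

    #!/usr/bin/env python3
    # search.cgi -- hypothetical minimal CGI script. The server puts the query
    # string in an environment variable; whatever we print goes back to the browser.
    import html
    import os
    from urllib.parse import parse_qs

    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    term = html.escape(query.get("q", ["(nothing)"])[0])

    # A CGI response is HTTP headers, then a blank line, then the body.
    print("Content-Type: text/html")
    print()
    print("<html><body><p>You searched for: %s</p></body></html>" % term)

Whether such a script then talks Z39.50 to a catalogue behind the scenes, or does something else entirely, is up to the script itself, which is why CGI as such has little to do with either Z39.50 or Netscape.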
> But the nationally-provided services were slow to take up the
> opportunities offered by the Internet and so each university implemented
> library-based solutions in different ways, resulting in the worst of a
> centrally-organised top-sliced resource, and a locally-provided
> idiosyncratic one.
Many of the library systems were originally deployed well before the DoDAG and the Shoestring project looked at the provision of TCP/IP on JANET. For example BLCMP was set up in the early 70s, when even the original ARPAnet was a tad on the underdeveloped side and the Coloured Book based JANET wasn't around. Different sites had different systems because different sites had different needs that were met by different products. These were the days before Microsoft and the One True Gatesian way. We didn't know where we wanted to go tomorrow (and in some cases still don't. Such is life).
> Our predicament is one which I have called elsewhere 'Mrs Thatcher's
> handbag modem'.
I always knew that Mrs T was a hip and wired funster at heart. Now I discover that she had a pocket modem in her handbag and my gut feelings are confirmed. :-)
> We are the victims of the state's taxing and top-slicing,
> and our local institutions' multiplicities of management layers. The
> state brought us 'uk.ac'. Local organisation brought in a proliferation
> of CD-ROMs, databases, licence agreements, passwords and log-on
> procedures.
Local organisation has to deal with the demands of the teachers that ask for odd products and of publishers that each demand different licensing arrangements (which in turn lead to different user interfaces, etc). I agree that CD-ROM user interfaces often stink but the publishers are often deaf to the problems, either because they have poor communication with the libraries using their products or because they realise that the libraries have to put up with what they've already produced because there are no alternatives (or the academics demand a particular product).
I must admit to being a tad confused by now. Mr Lindsay doesn't seem to like the idea of top-slicing for national developments like SuperJANET or the JISC datasets. At the same time he doesn't appear to like locally developed idiosyncratic systems. I wonder what he would like? Surely not completely decentralised information provision by the teachers themselves as that really would end in chaos.
> In large part the fault must lie with the Librarians of the universities
> who have had for many years an organisation, SCONUL, which would have
> been capable of exercising considerable muscle against the publishers.
What form does this "considerable muscle against the publishers" take exactly? Would they have threatened to withhold purchasing from publishers that didn't capitulate and provide digital copies (which I assume from the subtext is what Mr Lindsay is after)? I bet lots of the publishers would have been quaking in their shoes at losing 100 purchases from libraries when students will purchase thousands of copies. Not to mention the hordes of ****ed off teachers that would be pounding on the Librarians' office doors demanding to know why the cash that their departments paid to the library wasn't being used to buy the books that they wanted their students to read.
> The fault must also lie with the teachers and professional associations
> which have played little part in the process at all. There has been a
> failure of analysis, of engineering and of theory.
Teachers are interested in teaching. It's only in the last couple of years that the Net has been friendly enough to present to non-technical users for teaching purposes. It's also only recently that having IR and IT skills has become something worth training everyone in. Some teachers _are_ getting involved now (and have been for some time) and we're already seeing systems based on this (both funded by programmes like TLTP and also developed locally by interested individuals and departments). And as Mr Lindsay points out, teachers now have more students to deal with and so they surely must have less time available for contemplating information systems.
> Returning to our world of learning, where do we go from here with these
> ten million design exercises? I think it is still possible for higher
> education to form a power block to force through the implications of the
> new information and communication technologies. It is possible for the
> one hundred universities to create better learning environments. In the
> meantime information systems designers have been given a rich laboratory
> of examples of what not to do and how not to do it.
Isn't creating "better learning environments" what we're all doing? Isn't that what Elib and SuperJANET and all the other neat technology we get is there for? Isn't learning how to build the information systems that HE needs itself an exercise in continuing learning that we all take part in?
> What we have created, then, is an infopolecon in which in every cell the
> learner can see everything and everywhere, but has no structure to
> understand what any of it means. As virtualisation and digitisation
> accelerate, the need increases for concrete codes, structures and
> notations. These have to be taught. This is where we must start.
I have absolutely no idea at all what this paragraph means.
Response from Graham Jefcoat to this article
Far from having no plans, the British Library has recently announced its Digital Library Development Programme, to be led by the Research and Innovation Centre. This Programme entails establishing digital information services based on the content of the British Library's unparalleled collections and developing the capabilities to work with these collections in new and exciting ways. The aim is to improve access for all its users. More information is available on the Centre's web pages at: http://portico.bl.uk/ric/digilib.html