Sunday, 23 July 2023

Out of Order


My desk, March 2010

During my span of working life I participated in pretty much the full four-decade story of library computerisation (no, please, don't stop reading yet...). From a standing start in the mid-1970s (i.e. none at all), it all began with the introduction of a few green-screen "dumb" terminals linked to an in-house mainframe computer, dedicated pretty much to the automated reproduction of what had gone on before: the purchase, cataloguing, issue and return of stock. After the arrival of the first stand-alone personal computers and a period of experimentation and innovation we stabilised for quite a few years with a succession of systems made up of dedicated, locally-networked PCs – the "online public-access catalogue" (OPAC) was our Big Thing [1] – until finally (in the last chapter of my story, at least) we were broadcasting everything and anything we had to pocket-sized sci-fi wonders which most people no longer even thought of as computers at all, and on which our wares had become just one competing component in an entire universe of freely-available information, misinformation, and amusing cat videos. Arthur C. Clarke's declaration that "any sufficiently advanced technology is indistinguishable from magic" had never seemed more prescient.

Looking back, you begin to realise you have been living through an extraordinary period in human history. Well, sure, OK, who hasn't? But, just as my generation's grandparents were born into a world without aircraft and ended up having bombs dropped on them from a great height, so those of us who were not "born digital" are entitled, I think, to feel things may be getting out of hand. Slow down! Can't we just think about all this a bit more?

Along the way I learned to work with a number of now-obsolete operating systems: DOS, of course, but does anyone else remember Pick, for example? (and can there really ever have been a time when the name of its developer, Dick Pick, wasn't hilarious?). I also taught myself to program in several languages, but my main task was to learn how to wrangle the complex blend of technologies employed in a university library, and to "manage" the people using them, who were often resentful, unwilling, or just plain unsuited to the task. Sigh... (There's a famous Norwegian TV sketch about the introduction of the new "book" technology into a monastery that always makes me laugh). Above all, however, what I learned was to dread "upgrades" and – ultimate horror of horrors – the implementation of an entirely new system, especially one using yet another underlying operating system. Which, until Unix had become something of a standard, was every time. Every bloody time, every five years or so. What a nightmare that was. Every time.

Most of us who found ourselves moved sideways (rarely promoted) into a systems management role in the 1970s and 80s ("Congratulations! You're about to become the most unpopular person in the building!") had no training or computer-science background whatsoever. It was very much a "sink or swim" experience, and in the beginning learning to swim was surprisingly enjoyable. There was a pioneering, exciting period – let's say, from about the late 1970s to the early years of this century – when an enthusiastic "systems librarian" (and I imagine the same went for most other office-based industries) could learn how to get under the bonnet of their computerised setup and tinker with it, in the same way it used to be possible to tinker with cars. You figured out a bit of programming, read your way through the manuals, established a relationship with your supplier, and then got "hands on" whenever necessary. It was fun, most of the time; communities and friendships formed around user-group meetings, and many of us discovered latent technical talents; quite a few were recruited into the industry side as developers and go-betweens, able to speak the language of both supplier and customer without too much eye-rolling or sarcasm.

By the time I retired, I was spending much of my time doing what most proper, professional programmers would love to be doing, but can't: just tinkering around, finding problems to fix, and writing Perl scripts to fill gaps in the functionality of our latest, Linux-based library system. My proudest moment was writing an entire suite of CGI programs that managed the transfer of 50,000 books from one library to another, in the process automatically reclassifying each item's Dewey Decimal shelfmark into its Library of Congress classification equivalent, printing a new spine label, and carrying out various other tweaks (changes of loan period, deduplication of stock, etc.). It reduced a massive and daunting task to a simple matter of scanning in the barcodes of a trolley-load of books at one end of town, hauling them across to the other end in a van, scanning the barcodes again, printing and attaching the new labels, and getting them swiftly up to the shelves, ready for use. Repeat until complete. Hey presto!
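For the technically curious, here is a minimal sketch of the general shape of those scripts (the barcode, the concordance, and the sample item below are all invented for illustration; the real thing queried the library system's database and drove a label printer):

    #!/usr/bin/perl
    # Minimal sketch only: read scanned barcodes from standard input,
    # look each item up in a (hypothetical) catalogue extract, convert its
    # Dewey class to a Library of Congress class via a (hypothetical)
    # concordance, and print the text destined for the new spine label.
    use strict;
    use warnings;

    # Hypothetical concordance: leading Dewey class => LC class letters.
    my %dewey_to_lc = (
        '004' => 'QA76',   # computing
        '510' => 'QA',     # mathematics
        '823' => 'PR',     # English fiction
    );

    # Hypothetical catalogue extract: barcode => [ title, Dewey shelfmark ].
    my %catalogue = (
        '30000012345678' => [ 'Hard Times', '823.8 DIC' ],
    );

    while ( my $barcode = <STDIN> ) {
        chomp $barcode;
        my $item = $catalogue{$barcode};
        unless ($item) {
            warn "Unknown barcode: $barcode\n";
            next;
        }
        my ( $title, $dewey ) = @$item;
        my ($class) = $dewey =~ /^(\d{3})/;    # e.g. "823" from "823.8 DIC"
        my $lc = ( $class && $dewey_to_lc{$class} )
               ? $dewey_to_lc{$class}
               : 'UNMAPPED';                   # flag for manual review
        printf "%-14s  %-25s  %s  ->  %s\n", $barcode, $title, $dewey, $lc;
    }

Multiply that by a dozen edge cases (loan periods, duplicate copies, missing records) and you have a rough idea of what the full suite had to cope with.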

Sadly, such acts of labour-saving creativity – and there were many – did not earn me the rewards you might expect. Why not? Over the years I had been transformed from a typical humanities airhead, attracted to intangibles and mumbo-jumbo, into a notoriously intransigent, hard-headed number-cruncher. Universities, surprisingly, are full of people who think that believing something is possible is enough to make it so. Few ever bother to quantify a task – the time it takes, how often it must be done to achieve a certain target, how many people it would take to do that, how much those people will have to be paid, and so on – before deciding it will be done, and what's more finished by, oh, shall we say Christmas? The few who do make those essential calculations and are brave enough to deliver the bad news are, inevitably, never popular with senior management, and attract a reputation for "negativity", which is promotion poison, even – especially! – when your efforts have saved various incorrigible Pollyannas from serial self-inflicted humiliation. I learned never to expect gratitude from any ambitious high-flyer.

In the last years of my professional life, however, exactly as has happened with cars, ever more sophisticated products began to arrive on the scene that used the automation equivalent of tinker-proof sealed units, and these were usually a mystery even to their developers and suppliers. Who, but an ultra-specialised techno-geek, knew exactly how a network functioned, or WiFi, or even a smartphone "app", and could therefore fix them when they stopped working? Worse, who would dare admit that they didn't know? Besides, most serious programming had long ago come to depend on libraries of pre-written components – I doubt anyone has bothered to write anything as basic as a sort routine in decades – and these would be assembled together like Lego along with other proprietary stuff bought under licence, all in the expectation that the resulting package (actually a Heath Robinson-ish monster beneath its glossy surface) would work under all circumstances, and – crucially – continue to work when one element in the chain of dependencies developed a fault or, argh, was "upgraded". What could possibly go wrong? Everything and anything at any time, that's what.

I retired a decade ago, and things have clearly continued to get even more mysteriously opaque and giddily interdependent, and as a consequence more precarious. Lately it has seemed that, wherever you turn, you encounter yet another automated system that is unreliable, poorly configured, or out of order. More often than not this turns out simply to be because a new system has been introduced, and no-one yet knows how (or whether) it really works. Obviously, a key driver behind automation has always been the saving of money, and there are few activities where the biggest cost by far is not staff wages. To employ technical staff of the proper calibre at the going rate is regarded by most non-technical enterprises as an unnecessary luxury (an impossibility, actually, in public-service contexts where salary scales are inflexibly tied to job descriptions and grades). So, in effect, once installed, a new or upgraded system is placed into the hands of its end users – usually poorly-paid clerical staff [2] – to "debug" in live use. "Hey, look, when I do this, the whole thing freezes!" "Really? We had no idea you would ever do that. This could be a problem... Could you try not doing that?" "But that's my whole job!"

Some examples.

We could start at the top, with the biggest miscarriage of justice in British legal history, the scandalous prosecution of nearly 1000 sub-postmasters for theft, false accounting and fraud, all because of a software system, Horizon, that didn't work properly, and reported non-existent accounting shortfalls. But that's a book-length story, and still ongoing. Suffice it to say that a number of perfectly innocent sub-postmasters were jailed and had their lives trashed entirely on the evidence of a faulty system that Fujitsu, the supplier, and the Post Office, the implementer, insisted until quite recently was not faulty. The idea that quite so many operators of village Post Offices were lying, thieving cheats seems not to have troubled anyone at senior level in the Post Office; quite the reverse, in fact.

Closer to home and with rather less consequence, it recently took me an entire month to make a simple change to a National Trust gift membership I had given to my son. I made multiple phone calls, but no-one ever answered the phone, despite the cosy northern-accented voice that interrupted the muzak every couple of minutes to reassure me that they knew I was there, my call was important, in a queue of unspecified length, and that someone was "on their way to me" (as if there might be a knock on the door at any minute). As so often, this lack of response was blamed on an "unusually high level of customer calls" (in other words, "Look, we're just too busy / understaffed / idle to answer the phone, so why not just go away?"). So I sent emails as well, which got no response either. Then, after a week or so, I finally did start getting through on the phone, and there were three attempts to make the same simple change by three different "agents", each of whom failed to negotiate their way to a satisfactory conclusion. It was frustrating. Next, replies started to arrive belatedly in response to my earlier emails, which contradicted both each other and the "help" given by the people on the phone. Aaaargh! In the end, a customer services person had to admit, ruefully, that it was all really due to the introduction of a new call-handling and membership system which was having "teething problems"... Well, of course! Why didn't you say so? Tout comprendre c'est tout pardonner ("to understand all is to forgive all")... At least it is around here; you have my deepest sympathy. Let's hope you don't all end up in jail.

Then there was the medical appointment I had to make at a local hospital. I tried using the online booking system, but it claimed that there was just one appointment available during the coming months, at 8:30 a.m. one day the following week. This seemed highly improbable, so I made the customary odyssey through various switchboards to arrive at the relevant phone on the relevant desk, and – yes, you've guessed! – it seemed they were having problems with a new booking system. I was told that if, by some miraculous stroke of good fortune, I was able to see any appointment at all, then to grab it, and maybe ring next week to see if things had got properly sorted out in the meantime. They hadn't.

Then, a short while ago, I made my first post-COVID return visit to my old place of employment, the library of the University of Southampton. My "retired staff" identity card failed to open the turnstile. Huh? It seemed "iSolutions" (the university's utterly unironic name for its computing services and support operation) had mistakenly deleted me from the database at the same time as they shut down my old email account. No worries: my details were noted and would be passed through to be restored ASAP. Meanwhile, an old colleague (there aren't many of them left) had spotted me and we chatted: yes, there was a new library system now; no, it didn't really work; no, there was no longer any senior technical person who could sort things out in-house; so, yes, I was missed (hah, finally!). So when I returned a fortnight later, could I get through the turnstile? No, of course I couldn't.

Well, there are no surprises in any of those examples: that is exactly the sort of thing we have come to expect, isn't it? We anticipate incompetence, incomprehension, delays, and frustration, as the world is constantly reshaped to no great purpose by disruptive corporate swashbucklers, who always promise much but instead leave a trail of impractical, half-finished schemes in their wake as they sail on to more senior positions elsewhere. But listen, guys: if you must "fix" or "disrupt" things (because how else would we notice you had ever been in charge?) then at least let the people who are going to use, say, some shiny new automated system have a thorough look at it first, just to make sure it does actually work in every important respect before you sign the cheque and add that new line to your CV. The same goes for your addiction to organisational shakeups, such as centralising / decentralising / outsourcing / insourcing all clerical assistance (essentially reversing whatever situation you found when you arrived on the scene), or forking over thousands of pounds to consultants simply to change the name, font, and logo on all the official stationery, websites, and corporate clothing. Sure, you want to make your mark, but couldn't we just add your name in nice gold lettering to a list on a board somewhere when you go? Wouldn't that do?

But, heads up, people, here comes so-called AI (artificial, yes; intelligent, no), which promises to combine the breathtaking arrogance and heedless ignorance of both worlds, managerial and technical, at their very worst. Really, trust me, this is going to be bad. Not because humanity will be exterminated by killer robots, but because careless implementations of AI will fuck up so many things that were already working perfectly well. [3] 

"Hey, there, you techie guys: how can we save even more money and maximise profits?"

"Well, Mr. Manager, here's a thought: why not sack everybody? Bold, I know! But we've bought in a little package here that can do anything all those expensive, troublesome, lazy, disease-prone people can do, but faster, better, and 24/7! Most of it is incomprehensible to us, admittedly, but we've tested it pretty thoroughly in various scenarios and it seems to get most things right most of the time... Besides, as always, we can iron out any little problems as we go along. So, sign here, please. Plus this liability disclaimer, if you don't mind; just a formality, obviously. After all, what can possibly go wrong?"

Royal South Hants Hospital, June 2023

1. Impossible, now, to recapture the excitement of being able to search for stuff online by keywords. Once, back in the early pioneering days, I wanted to show off to a visiting class of 12-year-old schoolchildren how our new library system could find any book, just by entering the first four letters of both the title and the author's name, separated by a comma. I have no idea what made me choose to search for Hard Times by Charles Dickens on the spur of the moment. There is clearly some internal, malevolent imp that lives, like a comedian of genius, one beat ahead of the action in my head. I hadn't even read the damned book; still haven't. But I carefully entered "DICK,HARD" on the terminal keyboard. Hey, kids, what's going on? What? What's so funny?? Oops! Let me just quickly change that...
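Just to spell out the mechanics of that old search key (a hypothetical reconstruction; the real OPAC built the key server-side):

    #!/usr/bin/perl
    # Hypothetical reconstruction of the old "author,title" search key:
    # the first four letters of each, uppercased, author first.
    use strict;
    use warnings;

    sub search_key {
        my ( $author, $title ) = @_;
        return uc( substr( $author, 0, 4 ) ) . ',' . uc( substr( $title, 0, 4 ) );
    }

    print search_key( 'Dickens', 'Hard Times' ), "\n";   # prints DICK,HARD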

2. Although not always. I recently watched a consultant surgeon become completely baffled by a new login procedure, in a priceless real-life recreation of that Norwegian "book" sketch. He had to summon a nurse to bail him out. Mind you, I should probably have warned that nurse what this act of mercy would do to his promotion prospects...

3. Here's an interesting article about the future of books, AI, and reading. Spoiler: books are great just as they are!

3 comments:

Stephen said...

All of this rings a bell Mike. Didn't realise you were a programmer.

Mike C. said...

Stephen,

Yes, but of the self-taught variety -- starting with GW-BASIC around 1985, then proceeding through various compiled and scripted languages, usually those provided as an API by a library system supplier (for example, the Pick OS was marketed by McDonnell Douglas as REALITY, with a powerful scripting language, PROC). By the end of my working life I was mainly using Perl in a Unix environment, along with sed, awk, and all that. I used to be pretty good at HTML, too, but haven't written a single thing since retiring.

(I'm assuming these are familiar names, if not, ignore me...)

Mike

Stephen said...

Mike — some of those names are familiar to me. I worked as a systems tester for a few years (UNIX / Oracle mainly) and now do a bit of basic web stuff part-time (HTML / CSS / WordPress).

I admire people who can do 'Proper' programming: I don't have the brain for it.

Cheers.