
Have you got a smartphone? I don't. I don't particularly want one either -- I think I'm probably too old, now. I like having separate devices for separate jobs, in the same way I like getting my gas from a company with "gas" rather than "electricity" in its name. I know, I know, convergence, convenience, interoperability, mash-up, blah, blah. But, listen, if you do have a smartphone, have you any idea how much trouble you're causing me, with your frivolous desire to twirl data around with a fingertip, or flip through apps like someone trying to get something nasty off the screen?
Let me explain. As I enter the final straight (strait?) of my employed life and can see the finishing line marked "retirement" approaching, I begin to realise I have lived through an extraordinary period in human history. Well, sure, OK, who hasn't? But, just as our grandparents were born into a world without aircraft and ended up having bombs dropped on them from a great height, so those of us who were not "born digital" are entitled to feel things may be getting out of hand.
When I was at college in the 70s, it goes without saying that no-one had a personal computer, but -- get this, kids -- no-one had a phone, either. No, really! If you wanted to know if someone was going to be around at 9 o'clock tonight, you either had to plan ahead, or simply go round to their house and find out. I would make one phone call a week, back home to my parents from a payphone next to the college bar. Now, when I cross the campus, I seem to be walking through a permanent all-day mass telephone conversation, with students at every turn talking on phones and gesticulating like TV news correspondents rehearsing a piece to camera.
Well, apparently, with the coming of these new-fangled smartphones, a lot of students now want to use them as well as (or even instead of) their laptop as their primary platform for communications of all kinds. So, because the university likes to see itself as responsive to the "student experience", and feels that we may have fallen behind in this respect, I and various other substantially better-informed techno-types have been summoned from our dark, cabled caves to put a smartphone offering together in a hurry.
As an English graduate with linguistic and artistic tendencies, I do not have the typical profile of a person with IT responsibilities, especially nowadays when you can actually take a degree in Computering. But, back in the day, almost anyone could find themselves in charge of a "system", and a roomful of green screens. Libraries have been in many ways pioneers in the adoption of automation, and I found myself in such a position in the mid-1980s.
Those were exciting times. The first micro-computers had begun to appear, but most serious computing involved "dumb" terminals (text only, typically green or orange on black, no graphics, no mouse, no software, no memory, no nothin') permanently connected to a host computer which performed all the heavy lifting. I remember sharing an office with our library mini-computer -- a steel box about the size of a large refrigerator, with a row of incrementally blinking lights and fans loud enough to inhibit any conversation, connected via two permanent leased telephone lines to a VAX mainframe in Bristol. The only souvenirs I now have of it are the two tiny padlocks that prevented any idiot trying to make a call with the telephone handsets that sat on top of the cabinet. Oh, and tinnitus.
In those days you could get a long way with not very much. I got the very first IBM PC in the library, a twin floppy disk model with no hard drive and 640K of RAM (640K!), a completely freestanding item of equipment like a typewriter, with all the hi-tech design flair of a 1970s caravan. Having grappled with WordPerfect and the like, I decided on a whim to teach myself to program with GW-BASIC.
You could get a wonderful little hand-lettered, spiral-bound guide from Cambridge University Press by Donald Alcock called Illustrating BASIC that told you everything you needed to know. It was very reminiscent of those hands-on, can-do alternative lifestyle manuals like John Muir's How to Keep Your Volkswagen Alive, if you remember them. To my surprise, I discovered I had an aptitude, and got stuck in. My life changed.
Without a graphical interface, of course, programming was both very simple and very free of constraints -- you made it up from scratch every time. You didn't use vast ready-made libraries of handy routines written by experts; you had to figure it out and write it yourself. If you wanted to sort data, hey, you wrote yourself a sort subroutine (of which more in a moment). It was very empowering, and a lot of fun. The association with alternative lifestyles was not misleading -- a lot of misfits found their niche via IT, and the early days of the Web in the 1990s, especially, were like a second coming of the late 60s. It didn't last, of course, once the graphical dead hand of Microsoft Windows and the Apple Mac imposed businesslike conformity on everything. The "open source" movement is a last vestige of that pioneer impulse.
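For the curious, here is a minimal sketch of the kind of thing I mean: a bubble sort, written out by hand in GW-BASIC, much as one would have done it then (the data values and line numbers are invented for illustration):

10 REM Sort five numbers the hard way: no library, just your own subroutine
20 DIM A(5)
30 FOR I = 1 TO 5: READ A(I): NEXT I
40 DATA 23, 5, 42, 8, 16
50 GOSUB 100
60 FOR I = 1 TO 5: PRINT A(I);: NEXT I
70 END
100 REM Bubble sort: bigger values sink to the end, one pass at a time
110 FOR I = 1 TO 4
120 FOR J = 1 TO 5 - I
130 IF A(J) > A(J + 1) THEN SWAP A(J), A(J + 1)
140 NEXT J
150 NEXT I
160 RETURN

Crude and slow, certainly, but entirely yours -- that was the deal.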
Anyway. Since those years I have noticed what seems to be a characteristic experience of modern technological life. That is, every few years you discover you've not been paying attention, and have missed out on some crucial new development, and then have to spend a week or two catching up in a hurry. It happens to everyone -- we're all such specialists these days -- but no-one likes to admit it. That's why sales of books with titles like "Teach Yourself Utopian Turtletop 3.0 in 24 Hours" are so healthy. A true geek will always prefer to read a manual or handbook in private to the humiliation of attending a course.
This time, for me, it's "Web Services". The use of XML as a lingua franca for web commerce (WSDL, SOAP, and all that) had passed me by, until now. In the library world, we get by pretty well without it -- we have long-established standards for data exchange which have suited us very well, and regular HTTP/HTML/CGI has been a pretty good platform for public interactions. That is, until those pesky students decided they'd like to look things up and get notices about their overdue books on their darned smartphones.
Never mind. From my perspective, it does seem like an appalling diversion, to have to read several 200-page manuals just so that information which is perfectly freely available on the Web can be pumped out to some rich kid's smartphone. Not least because the estimate is that only 25% of students have smartphones, and also because there are multiple, different smartphone platforms to accommodate. But, as always, the alternative is to be left high and dry by the crowd moving on to the next Big Thing. It doesn't do to wish things away. I remember, when I was being trained in the use of HTML, being told not to use graphics in web pages -- according to the instructor they were a distraction and an unnecessary burden on bandwidth. Oh, really?
Ignoring the trends can have very negative consequences. The supplier of that original, VAX-based library management system failed to spot the trend towards Unix as an operating system for platform-independent systems, and as a direct result went out of business. My office wall is furnished with the beautifully produced manuals of various obsolete systems, services and languages I had to learn to love in years past. Who now remembers the McDonnell Douglas version of the Pick operating system, hubristically named "Reality", with its clever, data-dictionary-driven, programmable enquiry language, equally hubristically named "English"?
If nothing else, a life in IT teaches you that all things will pass. Plus the knowledge that, underneath even the glossiest, whizziest user interface, it all comes down to pages and pages of plain text coding in the end, laboriously entered at a keyboard in an office somewhere. Writ in water, unfortunately, all of it.