Monday, 28 March 2011
Computering
Have you got a smartphone? I don't. I don't particularly want one either -- I think I'm probably too old, now. I like having separate devices for separate jobs, in the same way I like getting my gas from a company with "gas" rather than "electricity" in its name. I know, I know, convergence, convenience, interoperability, mash-up, blah, blah. But, listen, if you do have a smartphone, have you any idea how much trouble you're causing me, with your frivolous desire to twirl data around with a fingertip, or flip through apps like someone trying to get something nasty off the screen?
Let me explain. As I enter the final straight (strait?) of my employed life and can see the finishing line marked "retirement" approaching, I begin to realise I have lived through an extraordinary period in human history. Well, sure, OK, who hasn't? But, just as our grandparents were born into a world without aircraft and ended up having bombs dropped on them from a great height, so those of us who were not "born digital" are entitled to feel things may be getting out of hand.
When I was at college in the 70s, it goes without saying that no-one had a personal computer, but -- get this, kids -- no-one had a phone, either. No, really! If you wanted to know if someone was going to be around at 9 o'clock tonight, you either had to plan ahead, or simply go round to their house and find out. I would make one phone call a week, back home to my parents from a payphone next to the college bar. Now, when I cross the campus, I seem to be walking through a permanent all-day mass telephone conversation, with students at every turn talking on phones and gesticulating like TV news correspondents rehearsing a piece to camera.
Well, apparently, with the coming of these new-fangled smartphones, a lot of students now want to use them, as well as (or even instead of) their laptops, as their primary platform for communications of all kinds. So, because the university likes to see itself as responsive to the "student experience", and feels that we may have fallen behind in this respect, I and various other substantially better-informed techno-types have been summoned from our dark, cabled caves to put a smartphone offering together in a hurry.
As an English graduate with linguistic and artistic tendencies, I do not have the typical profile of a person with IT responsibilities, especially nowadays when you can actually take a degree in Computering. But, back in the day, almost anyone could find themselves in charge of a "system", and a roomful of green screens. Libraries have been in many ways pioneers in the adoption of automation, and I found myself in such a position in the mid-1980s.
Those were exciting times. The first micro-computers had begun to appear, but most serious computing involved "dumb" terminals (text only, typically green or orange on black, no graphics, no mouse, no software, no memory, no nothin') permanently connected to a host computer which performed all the heavy lifting. I remember sharing an office with our library mini-computer -- a steel box about the size of a large refrigerator, with a row of incrementally blinking lights and fans loud enough to inhibit any conversation, connected via two permanent leased telephone lines to a VAX mainframe in Bristol. The only souvenirs I now have of it are the two tiny padlocks that prevented any idiot trying to make a call with the telephone handsets that sat on top of the cabinet. Oh, and tinnitus.
In those days you could get a long way with not very much. I got the very first IBM PC in the library, a twin floppy disk model with no hard drive and 640K of RAM (640K!), a completely freestanding item of equipment like a typewriter, with all the hi-tech design flair of a 1970s caravan. Having grappled with WordPerfect and the like, I decided on a whim to teach myself to program with GW-BASIC.
You could get a wonderful little hand-lettered, spiral-bound guide from Cambridge University Press by Donald Alcock called Illustrating BASIC that told you everything you needed to know. It was very reminiscent of those hands-on, can-do alternative lifestyle manuals like John Muir's How to Keep Your Volkswagen Alive, if you remember them. To my surprise, I discovered I had an aptitude, and got stuck in. My life changed.
Without a graphical interface, of course, programming was both very simple and very free of constraints -- you made it up from scratch every time. You didn't use vast ready-made libraries of handy routines written by experts; you had to figure it out and write it yourself. If you wanted to sort data, hey, you wrote yourself a sort subroutine. It was very empowering, and a lot of fun. The association with alternative lifestyles was not misleading -- a lot of misfits found their niche via IT, and the early days of the Web in the 1990s, especially, were like a second coming of the late 60s. It didn't last, of course, once the graphical dead hand of Microsoft Windows and the Apple Mac imposed businesslike conformity on everything. The "open source" movement is a last vestige of that pioneer impulse.
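For the benefit of younger readers, here's roughly what I mean -- a little bubble sort subroutine in GW-BASIC, reconstructed from memory rather than retrieved from an old floppy, so treat it as a sketch (the line numbers are part of the language, not decoration):

10 REM Fill a small array, sort it ourselves, print the result
20 DIM A(5)
30 FOR I = 1 TO 5: READ A(I): NEXT I
40 DATA 42, 7, 19, 3, 88
50 GOSUB 100
60 FOR I = 1 TO 5: PRINT A(I): NEXT I
70 END
100 REM Bubble sort: each pass carries the largest remaining value to the end
110 FOR J = 1 TO 4
120 FOR K = 1 TO 5 - J
130 IF A(K) > A(K + 1) THEN SWAP A(K), A(K + 1)
140 NEXT K
150 NEXT J
160 RETURN

That GOSUB was the whole of your "library": you wrote it, you understood it, and you fixed it when it broke.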
Anyway. Since those years I have noticed what seems to be a characteristic experience of modern technological life. That is, every few years you discover you've not been paying attention, and have missed out on some crucial new development, and then have to spend a week or two catching up in a hurry. It happens to everyone -- we're all such specialists these days -- but no-one likes to admit it. That's why sales of books with titles like "Teach Yourself Utopian Turtletop 3.0 in 24 Hours" are so healthy. A true geek will always prefer to read a manual or handbook in private to the humiliation of attending a course.
This time, for me, it's "Web Services". The use of XML as a lingua franca for web commerce (WSDL, SOAP, and all that) had passed me by, until now. In the library world, we get by pretty well without it -- we have long-established standards for data exchange which have suited us very well, and regular HTTP/HTML/CGI has been a pretty good platform for public interactions. That is, until those pesky students decided they'd like to look things up and get notices about their overdue books on their darned smartphones.
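To give a flavour of what I'm now wading through: a SOAP request is essentially XML wrapped in more XML, carried over ordinary HTTP. Something like this invented example -- the service and field names are made up for illustration, not taken from any real library system -- might ask for a borrower's current loans:

<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetLoans xmlns="http://example.org/library">
      <BorrowerID>X12345</BorrowerID>
    </GetLoans>
  </soap:Body>
</soap:Envelope>

The WSDL, meanwhile, is the still-wordier document that tells a client that a GetLoans operation exists at all, and what shape its reply will take.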
Never mind. From my perspective, it does seem like an appalling diversion, to have to read several 200-page manuals just so that information which is perfectly freely available on the Web can be pumped out to some rich kid's smartphone. Not least because the estimate is that only 25% of students have smartphones, and also because there are multiple, different smartphone platforms to accommodate. But, as always, the alternative is to be left high and dry by the crowd moving on to the next Big Thing. It doesn't do to wish things away. I remember, when I was being trained in the use of HTML, being told not to use graphics in webpages -- according to the instructor they were a distraction and an unnecessary burden on bandwidth. Oh, really?
Ignoring the trends can have very negative consequences. The supplier of that original, VAX-based library management system failed to spot the trend towards Unix as an operating system for platform-independent systems, and as a direct result went out of business. My office wall is furnished with the beautifully-produced manuals of various obsolete systems, services and languages I had to learn to love in years past. Who now remembers the McDonnell Douglas version of the Pick operating system, hubristically named "Reality", with its clever, data-dictionary-driven, programmable enquiry language, equally hubristically named "English"?
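I do, just about. From memory -- and after twenty-odd years, treat this as a rough reconstruction rather than gospel -- an "English" enquiry read almost like a sentence, with the data dictionary supplying the meaning of each field name:

LIST BORROWERS WITH BALANCE > "5.00" NAME ADDRESS BALANCE

One line at the prompt, and a formatted report came back. Quite something, in its day.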
If nothing else, a life in IT teaches you that all things will pass. Plus the knowledge that, underneath even the glossiest, whizziest user interface, it all comes down to pages and pages of plain text coding in the end, laboriously entered at a keyboard in an office somewhere. Writ in water, unfortunately, all of it.
6 comments:
Bloody computers!
I remember when I was at school - the school computer was the size of a small semi-detached cottage in Bedfordshire and could just about add up 2 plus 2 if the right piece of cardboard with holes punched into it was inserted. I only ever heard of said beastie in whispers, as I was never allowed to see it - I didn't get as far as the sixth form, nor was I any good at maths (a prerequisite, I believe, to enter said Aladdin's Cave).
Nowadays, most people have more computing power in their car or their iPhone than landed Mankind on the Moon. Kids have googled the answer to a question you haven't finished asking yet by the time you've told them to put the phone away.
Yet somehow my home PC can switch off something inside itself that lets it connect to the network, so you have to spend a small fortune getting some Backroom Boy in to fix it, despite everyone telling you "it's easy".
"The Singularity" is already here.
I had only ever played around with a Sinclair Spectrum, until I joined the University in 1995. What a learning curve that was. Within 18 months, I had been recruited to the Library Web Team...just like that.
I'm no IT expert, but I am grateful for the experience and free training I received whilst an employee at Southampton. I left equipped well enough to know that I can live without Microsoft. These days I have a Mac, and a laptop that runs the fabulous Ubuntu.
By the way, Mike, I've recently acquired a Sansui AU-317II hi-fi amplifier. There's no phono socket for a CD player... it was made in 1974!
In the 1960s, I recall, it was reckoned that no one was capable of learning anything new after the age of 35, so the coming age of computers would see anyone in their late 30s doomed …
The first computers I used in journalism, in the mid-1980s, were dumb-terminal affairs, and when I became managing editor of a small magazine company in 1989 that used Macs (the old Classic, no hard drive, twin-floppy, with Microsoft Word fitting on one 1MB disc - tell young people that today …) I had to spend the first three nights reading the manual to try to know what to do the next day.
My first computer experience was as a math major at Princeton University in about 1962. The machine was an IBM 650, which filled the room and operated on vacuum tubes; that is to say, it pre-dated the use of transistors! The engineers across the street had a transistorized GE model which was about a zillion times as powerful as our dinosaur. My DSLR has more power than that IBM machine did.
Not quite as old, but similarly inclined when it comes to computers - apart from the smartphone. I've got one, love it. But I don't use it much as a phone (I'm still the once-a-week, set-an-appointment type).
I wish more website developers were like you: HTML only. That actually works pretty well on a smartphone: it resizes, wraps text, etc. It's all the fancy new stuff adding form over content that messes things up: I blame the fancier bits of XML entirely.
All,
An interesting spectrum of comments. How little account both "history" and "literature" take of the flavour of our everyday working lives...
At the risk of seeming over-sentimental, it seems to me that organisations take far too little account of the wrench felt by IT staff when a shiny new system is brought in.
I well remember the feeling (almost of grieving) when I had to put away one set of much-used manuals for the last time, and pick up a new set, while outside everyone was cheering the advent of the much-anticipated new system, and dancing on the grave of the old...
One reason I've started to anticipate retirement is that I don't think I can handle yet another change of network, operating system, programming language, API, et bloody cetera...
Mike