It struck me recently that it has now been three years since I retired, and I haven't written any programs, scripts, or HTML pages in all that time. Not one. In a sense, of course, that is precisely what retirement means: you have stopped working, your accounts have lapsed, your administrator passwords have been terminated, and all the accumulated, undocumented knowledge you carry around in your head has been retired, too. Access denied! I can almost feel the empty space in my head where it used to be. But, still, I'm surprised: I had come to think of my modest code-cutting adventures as a core part of who I had become, and fully expected to continue them into retirement, perhaps even writing some apps for smartphones that would make me stinking rich.
But, no. Of course, I also haven't done any meetings, seminars, appraisals, training sessions, or conferences in that time, either. But I was never going to miss those aspects of a working life; who would? Conferences abroad are fun for a few years, but air-travel, hotels, and tedious afternoons in sunless seminar rooms quickly pall as you get older, particularly if you have to juggle childcare arrangements. I do miss the people, mostly, but then almost all of "my" people have moved on or retired, too. I think what I most sorely miss is the daily craic on the smokers' table* in the Staff Club at morning coffee-time, but that pleasure had already dwindled years ago as the number of smokers (and "honorary smokers") declined, and was then finished off by the ban on smoking indoors. Somehow, skulking in a corner out of the wind and rain is not conducive to life-enhancing gossip and banter.
But what about all that computer stuff? Do I miss it? All the operating systems, hardware configurations, and coding languages I learned, the systems and project management expertise built up over 35 years, or the highly-specialised data-handling expertise acquired? What will happen to the part of my brain that used to get such regular exercise? Well, I suspect it has already and quickly been occupied by another, more colourful part, suppressed for too many years, like a defunct office space eagerly turned into an artist's studio. Besides, if I'm honest, in my final years of working I found that coping with the constantly accelerating pace of change was making me anxious and unhappy. Working with IT eventually teaches you two profound life-lessons: first, that all your achievements are ephemeral, to be washed away in the next tide of change and, second, that nobody understands or cares what you have done, anyway. Instructively, after 30 years of service I received a perfunctory retirement letter from the central university administration, the main burden of which was to remember to hand in my keys before I left.
I suspect I may even be becoming a neo-Luddite. As I wrote in an earlier post, I have come to regret my role in the dumbing-down of university life, much as I enjoyed every minute of it. What fun it was, to rise to the challenge of planning a major project, and what pleasure was to be had in meeting and overcoming all the technical problems thrown in our path! This, despite the knowledge that (repeat after me) your achievements are ephemeral, to be washed away in the next tide of change, and that nobody understands or cares what you have done, anyway.
But, when the basic strategic direction is wrong, all this counts for nothing. Or as the motto of my secondary school (not to mention the City of Edinburgh) has it, nisi dominus frustra:
Except the Lord build the house, they labour in vain that build it:
Except the Lord keep the city, the watchman waketh but in vain.
Psalm 127
The planning and, um, execution of Stalin's gulags may have been perfectly brilliant, but history will award no prizes to those who actually devised and carried out such massively complex programmes of bureaucracy and logistics. Which does seem a little unfair. Although, even if Stalin had turned out to be one of the good guys, I expect history would merely have sent them the standard form letter, thanking them for their contribution, and reminding them to hand in their keys.
In the end, this is probably hard but fair. Cleverness unconstrained by wisdom may yet be the downfall of our species. Think of so-called Artificial Intelligence, which we might usefully regard as humanity's attempt to outsource its own most distinctive feature, perhaps best represented by that traditional cartoon of a man sawing off the very branch he is sitting on. Over my working life I have witnessed several waves of happily-employed, good, ordinary, decent people being made redundant and their lives rendered purposeless by clever technology. It sometimes seems that technologists will not rest until the last opportunity to enjoy a meaningful life through work has been eliminated. The advent of AI, of course, will be their final ironic triumph: cleverness itself will have become redundant! When, I wonder, will it dawn on those setting our strategic directions that the pursuit of efficiency, productivity and profits by automation and the elimination of expensive, fallible "human resources" is not the point: people are the point, and not the problem!
In the words of everybody's favourite 13th century Sufi mystic, Jalāl ad-Dīn Muhammad Rūmī: Sell your cleverness, and buy bewilderment. You know it makes sense.
* No, not crack smokers, fool!
11 comments:
Mike,
incidentally, just this week there was an advanced-level C++ training course at work which I organised for our department. I hired the well-known expert Angelika Langer as instructor, and since we've been very satisfied with her in the past, I asked her whether she would be willing to run another course covering the new C++11 standard. I was told that nowadays C++ is so low in demand that she can't make a business case for preparing such a course.
Ouch. Apparently we're building our careers on a "legacy" language.
On the other hand, the skills you name in your post - planning a project, breaking tasks down into manageable bits, keeping a mental map in your head - *are* software engineering. These are much more important than keeping up with the newest platforms, languages and APIs. By the way, I believe that they are also fundamental for successful artmaking.
Best, Thomas
Thomas,
Ouch, indeed. You'll know you're getting old when the thought of learning just one more new language -- or even just coping with one more upgrade to your familiar environment, with its usual round of broken tools, "deprecated" routines, new bugs, unwanted features -- just makes you feel tired...
So, repeat after me: "first, all your achievements are ephemeral, to be washed away in the next tide of change and, second, nobody understands or cares what you have done, anyway"... ;)
Mike
Ha! You've reminded me of my 18-month exile, *ahem*, I mean secondment, to CAMS. There, we built an online postgraduate research prospectus from scratch, and within a few short months of completion, it was all but dumped. Staggering waste of time and money. See also, the ill-fated Learning Centre. What was that all about?
Martin,
Indeed. Repeat after me: First, [etc.] ...
Mike
Mike, further to your comments on AI, I think we're all overblowing that group of technologies. I remember one definition from a few years ago, to the effect that AI is the name for a set of problems that we haven't really solved yet, with the implication that once a problem had been solved it moved out of the realm of AI into the ordinary. We seem to have moved a bit beyond that. But I did much enjoy Andrew Molitor's piece on Automation, at http://photothunk.blogspot.co.uk/2017/07/automation.html . The key bit:
'There's no "person" in there in any meaningful sense, and in fact nobody knows how to even get started making a software "person." This does not appear to be a current area of study. The software that turns your spoken words into text has no relationship with the software that plays chess at a Grand Master level. The phrase "neural network" does not mean that the software resembles a brain in any meaningful way, the phrase just means another remarkably simple algorithm (inspired by real neurons) which can produce useful results.'
So, not that we shouldn't worry, but not to worry about that, yet.
On a different note, I had a bit of that working on redundant systems: in 1971 I was sent to Australia by ICL to help write 1900 DBMS software. We had a team of up to 50 and worked on it for 3 1/2 years, and in the end nothing was released, the project was abandoned and we were all made redundant! A bit depressing. But it does remind me that the skills of programming and project management are generally quite advanced compared to the... perhaps art? or craft? ... of future project risk assessment!
"There's never time to do it right, there's always time to do it over"... the old ones are the good ones! ;-)
Chris,
Thanks for these thoughtful comments. Yes, of course, I exaggerate a little, but every step along the road to "efficiency" (container shipping ... wordprocessing ... now driverless vehicles ...) has made some group of ordinary folk redundant. My point is that people need meaningful work to do, and yet so much ingenuity goes into taking it away, and putting nothing in its place. What is a strong young man of average or below intelligence and no desire to work in an office or warehouse supposed to do with his life?
The irony is that those of us who have worked with IT are both responsible for much of this and at the same time most acutely aware of it -- my 35-year career spanned the eras of 80-column punched cards and developing apps for smartphones!
Mike
Hah!
I retired (early, I am a trifle younger than you) a year and a few months ago. I look after children, cook, clean, and occasionally bloviate. I also retired from Writing Code, and have also written exactly zero lines of code since I hung it up. Well, I do go fill in the bits and pieces of a div tag from time to time as part of bloviating (blogging) but not one bit more.
It truly is exhausting to "keep up" these days. I still watch the industry, and I am daily very happy not to be in it. Even after a long and horrid day of chasing grouchy little kids around and cooking overly complicated meals.
Odd, isn't it? As a self-taught amateur, I actually enjoyed writing APIs, Perl programs and CGI stuff to keep my colleagues happy and gainfully employed, but started to feel overwhelmed when XML, "web services", smartphones and such came along. It's gone the way of car maintenance... No point in getting out the feeler gauges and getting your hands oily, sir, it's a sealed unit and will need complete replacement by a specialist.
But those overly-complicated meals... I remember those! Mine are 23 and 26 this year, and can cook their own damn meals...
Mike
https://nytimes.com/2017/06/24/opinion/sunday/artificial-intelligence-economic-inequality.html
Zouk,
Exactly, couldn't have put it better myself!
Mike
Also:
"At the highest levels, the Chinese People’s Liberation Army (PLA) also recognizes and intends to take advantage of the transformation of today’s informatized (信息化) ways of warfare into future “intelligentized” (智能化) warfare."
https://lawfareblog.com/alphago-and-beyond-chinese-military-looks-future-intelligentized-warfare
But note
"However, for the time being, AlphaGo represents a high-profile demonstration of the sophistication of U.S. AI."
should read
"[...] U.K. A.I."
(AlphaGo's developer, Deepmind, is a British startup, sold to Google -- with riders -- so as to be able to use their network to power it.)