One of my methods for professional development is to try and read things that I really don’t understand. It can be frustrating and fruitless, but sometimes leads me to stray into interesting domains on the periphery of library and information concerns. Internet, webbery, bioinformatics, science policy, public engagement, even blogging – all have been ripe for my ignorant eye. Sometimes I even try to read Henry Gee’s blog ;-).
Earlier on I perused the latest issue of JANET News (usually a publication I do not linger over, concerned as it is with the technicalities of the UK academic computer network). I was nearly through skimming for comprehensible morsels when I found some extraordinary statements about the end of the world as we know it [1]. I am usually averse to apocalyptic pronouncements but this article gives some food for thought, speaking of a coming technological singularity:
our technological progress in a number of key areas is just starting to reach … the point where … near-imperceptible increases in the rate of progress suddenly become large enough to demand our attention. The relatively slow accumulation of knowledge in the past is set to accelerate beyond our ability to imagine
OK, so things are changing. No surprises there really. And I’m sure you, dear readers, have an almost infinite capacity to imagine. But hang on a moment, next it says:
The technology curve will go vertical some time in the 2020s, causing technological change so rapid and profound it represents a rupture in the fabric of human history
Ouch. That sounds potentially painful and probably rather dangerous. But it gets worse:
the half-life of any given skill we might learn will get shorter as these changes accelerate. The key skill will therefore be the ability to learn
So no sooner have I mastered one technical trick or new piece of kit than it’ll be old hat and I’ll have to forget it and learn it over. Actually, on reflection, I usually forget and have to learn over again anyway so maybe there’s not so much difference.
Just as I’m starting to despair about the future, the article concludes with a sheepish admission that:
there are of course those who dispute either the timetable or the possibility of singularity at all
Oh, so it’s all just a high-grade Daily Mail-style piece of flummery then? No need to worry after all. Or is there?
[1] JANET News, Sept 2008, "It's the end of the data networking world as we know it" (this links to the whole issue as a PDF; the article in question is on page 11)
Sometimes I even try to read Henry Gee’s blog
I’ve tried to read that, too. If you understand anything — anything at all — you will let me know, won’t you?
But slightly less trivially – the technological singularity is a staple of current SF, particularly authors such as Charles Stross. Indeed, it was an SF author, Vernor Vinge, who first coined the term ‘singularity’ to refer to a point at which technology would be advancing so quickly that only droids could keep up with it.
As you can imagine I come across this concept from time to time in my capacity as ~~Galactic Emperor, Mekon and Undisputed Ruler of All Living Things~~ editor of the Nature SF strand, Futures. I was particularly amused by this story, "The Charge-Up Man" by Catherine Shaffer (_Nature_ 444, 652, 30 November 2006), which looks at the singularity in a fun and unusual way.

Ah, I should've known the sci-fis got there first. Maybe I should add some sci-fi to my list of reading matter. Interesting to read about the technological singularity in such a dry source as JANET News though.
Aha! If you’d like some SF, modesty forbids my suggesting this book as a very good primer, featuring 100 very short stories from the full range of modern SF.
Dang it, I did suggest it.
I wonder how that happened?
The technology curve will go vertical some time in the 2020s
Please, nobody think about that for too long. I think there is a touch of exaggeration.
Words, not to mention worlds, fail me. Good catch, Frank.
That article seems alarmist to an absurd degree. Or am I just being short-sighted and un-droid-like? It seems to me that until artificial intelligence becomes a reality, the acquisition and generation of data, as well as technological development, remain well under human control. Am I missing something here? Pretty sure I am, but I thought I would drop my two cents in anyway.
The article is very confused (or else deliberately confusing). The problem has nothing to do with exponential growth, which happily continues growing forever. If you plot exponential growth on logarithmic scales you get a straight line – there are no singularities. The author is talking about singularities, which most simply occur when there is a finite resource that is being exploited. I am sure there are some genuine concerns within the article but they have not been illuminated correctly.
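To illustrate the commenter's point, here is a minimal sketch (plain Python, with illustrative function names of my choosing): an exponential looks dramatic on a linear scale but is a straight line on a log scale, whereas a hyperbolic curve such as 1/(t_s − t) genuinely diverges at a finite time t_s – that is what a mathematical singularity actually looks like.

```python
import math

# Exponential growth: x(t) = exp(t). On a log scale, log(x(t)) = t,
# a straight line -- the proportional growth rate never changes.
def exponential(t):
    return math.exp(t)

# Hyperbolic growth: x(t) = 1 / (t_s - t) blows up as t approaches
# the finite time t_s -- a true singularity.
def hyperbolic(t, t_s=10.0):
    return 1.0 / (t_s - t)

for t in [0.0, 5.0, 9.0, 9.9, 9.99]:
    print(f"t={t:5}: log(exponential)={math.log(exponential(t)):6.2f}, "
          f"hyperbolic={hyperbolic(t):10.2f}")
```

Running this, the log of the exponential just ticks up linearly (0, 5, 9, …) while the hyperbolic value explodes as t nears 10 – which is why conflating "fast exponential progress" with "a singularity" muddles the argument.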