The biggest problem with computers, as I have always thought, is the whole business of getting your wishes out here to link up with what the computer is doing in there.
Many times, when I’ve been struggling against a computer – usually a PC, but occasionally a machine whose inventors really were trying to be helpful, but were ahead of their time – I’ve imagined a future without keyboards, mice, trackpads or screens. In place of a screen, you’d have a heads-up display that would appear before your eyes – perhaps by direct stimulation of the visual centres of the brain. For an input device, your thought alone. I expect you’ve also thought along these lines. The computer itself, as a physical entity, would be irrelevant. It might be a ring, a pair of glasses, an ear-stud, or even completely distributed, in the cloud.
Many years ago, when the world was young, I was excited by a report I read somewhere – it might even have been in Your Favourite Weekly Professional Science Magazine Beginning With N – of research in Germany showing how people with extreme disability could be trained to control a cursor on a computer screen by the power of thought alone. I haven’t been able to find the paper in question, but no matter – research on brain-machine interfaces has been waxing strong for very many years. Research originally designed to understand the basic science of how thoughts could be turned into actions, or to help profoundly disabled people live better lives, has – inevitably – turned up in consumer products, such as interfaces for computer gaming.
But I digress.
Progress in computers that are useful to everyday mortals such as me, as opposed to techie types who are primarily interested in computers for their own sake, should be measurable not in gigabytes or clock speed, but in the transparency of the interface. The reason why, I think, people revered St Steve of Jobs was that he got this. He popularised the mouse as a way of executing user commands intuitively, and his company, Apple, pioneered many other, similar wheezes, such as the use of gestures on touch-sensitive screens, as ways of converting your thoughts into action as quickly and painlessly as possible.
First I got an iPhone 3G and was immediately struck by the fact that an item of technology this complicated didn’t ship with a voluminous instruction manual. Then I got an iMac to replace my PC, and it felt like I didn’t have to fight against a computer to do anything. Then I got an iPad, which – and I still have the Mk 1 – is my ideal road warrior. I write on it, I take notes, I work, I watch TV, I listen to music, I play Scrabble, I surf, I read books and magazines, I’ve interviewed people for a magazine article using a voice-recording app – and when I can’t think of any better way of wasting time, I compose using GarageBand. The iPhone 3G became an iPhone 4, now spruced up with iOS 5.
Then came iCloud, which has barely been out ten minutes and is already revolutionising the way I work. Tasks have, quite suddenly, become device-independent – finally I can concentrate on the work I am doing rather than on the device I’m doing it on. (Yes, I know, there has been Dropbox and all sorts of other cloud solutions, but iCloud seems to be more intuitive – it makes sense to a user such as me who is not a techie.)
And news has reached my ears that the iPhone 4S (yet to reach the Maison des Girrafes) has a feature called Siri, in which voice-control software has been married to an AI to make a kind of virtual concierge – a personal digital assistant in fact as well as name – which does all the fiddling around inside your phone, so you don’t have to. Now, I haven’t met Siri for myself, but the promo video is like an episode of Knight Rider.
You can see what’s coming, but I have news for you – it’s already here. I’ve just received a mind-frying item of news from my friend and Mac Guru, Mr J. McQ. of Hackney: an iPhone 4S mind hack. Yes, it’s a gizmo in which the thoughts of a user wearing a headset are translated into speech by a speech synthesiser, which is then fed into Siri to execute a command. In this case, the user instructs Siri to dial a contact, by thought alone.
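For the curious, the chain of events in that hack can be sketched in a few lines of toy Python. Everything here is hypothetical – the function names, the fake EEG reading, the canned Siri reply – the real rig uses a trained brain-signal decoder, an actual speech synthesiser and Siri’s ordinary voice interface; this just shows the shape of the pipeline: thought, then speech, then command.

```python
def classify_thought(eeg_sample: list[float]) -> str:
    """Toy stand-in for the trained brain-signal decoder in the headset."""
    # A real decoder classifies patterns in the EEG; here we just threshold.
    return "call" if sum(eeg_sample) > 0 else "ignore"

def synthesise_speech(command: str, contact: str) -> str:
    """Toy stand-in for the speech synthesiser that talks to Siri out loud."""
    return f"Siri, {command} {contact}"

def siri_execute(utterance: str) -> str:
    """Toy stand-in for Siri hearing the utterance and acting on it."""
    if utterance.startswith("Siri, call "):
        return "Dialling " + utterance.removeprefix("Siri, call ")
    return "Sorry, I didn't get that."

def mind_dial(eeg_sample: list[float], contact: str) -> str:
    """Chain the three stages: thought -> speech -> Siri."""
    if classify_thought(eeg_sample) != "call":
        return "No command detected."
    return siri_execute(synthesise_speech("call", contact))
```

The point is the indirection: the headset never talks to the phone directly – it borrows a human-shaped interface (speech) that Siri already understands.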
Yes, it’s a bit clunky right now – as you can see from the video, the mind hackers are currently using a rig that includes a burst balloon, a wetsuit full of lumpy custard, a jam jar and a stuffed caiman – but if they can come this far mere minutes after the iPhone 4S was released … well, you’re way ahead of me. Knight Rider has suddenly become altogether more serious. Did I hear Neuromancer?