Sometimes you can tell right away that an idea is going to work. Bails that
light up when the ball hits the stumps, for example: say no more, sign me up.
But most ideas need a bit of time before you can tell if they’ve got legs. I reviewed the
first Touch Bar-equipped MacBook Pro back in 2016, for example, and noted at the time that it was too early for a verdict on the feature’s usefulness. Would owners get used to it? Would developers find good applications for it? The jury was still out.
(Full disclosure: while I did say it was too early to say whether the Touch Bar was a winner, I was definitely leaning towards optimism. Devs would “surely come up with reams of clever Touch Bar features”, I predicted, wrongly. Hands up: that was a bad call.)
Five years on, the jury’s verdict has finally been accepted by the judge, and Apple has removed the Touch Bar from its
latest MacBook Pro. And my sense is that Mac owners
won’t miss it.
Partly, as with 3D Touch, this failure is the result of factors outside Apple’s control: third-party software developers didn’t really go for it, and as a result the range of activities you could accomplish on or with the Touch Bar was uninspiring. But I now reckon there are more fundamental problems with it that reviewers like me probably should have spotted back in 2016.
The main error was to look at the all-software keyboards and touch interfaces on the iPhone and iPad and assume they would also work on the Mac, and bring the same benefits of customisability and versatility. But the use cases are entirely different, because the iDevices have their software keys on the screen, which means you’re looking at them and don’t depend on tactile feedback.
(As a sidebar to this discussion, it is of course possible for soft keyboards to offer tactile feedback through
haptics. That is one route Apple could have gone down with the Touch Bar, and one that might have saved it. But I doubt it.)
It’s a fundamental part of the Mac design that the input area and the screen are separated: whether you’re using an iMac or a MacBook, the keyboard sits down low and the screen up high. Which means that if you want to incorporate touch you’re doomed to choose between two major compromises.
Either you put the touch element on the screen itself, in which case you have to reach up, which is uncomfortable and tiring. Or you put it on the keyboard, in which case you must either use it without looking – which is fine for a physical input that always stays the same but works appallingly badly for a software input that changes constantly – or keep glancing up and down between screen and keyboard.
How can you build muscle memory for a set of keys that change from app to app and even from menu to menu? How can you touch-type with no tactile feedback? The simple answer is that you can’t.
My colleague Samuel Nyberg has suggested that the failure of the Touch Bar spells
an end to hopes that Apple would one day release a fully touchscreen-based Mac, but I’d go further: it shows that the idea of a touchscreen Mac never made sense in the first place. And it’s hard to understand why it took the company five years to realise this.
Still, we should be grateful – as with the restoration of MagSafe and a good set of ports – that Apple acknowledged its error and offered
recompense and apology by way of an updated model. There have been occasions in its history when the company has been a tiny bit stubborn and refused to admit when it’s got something wrong.
And thank goodness we can be sure that Apple will never again give the MacBook Pro a controversial hardware feature that winds up lots of people and doesn’t appear to serve a useful purpose. Now
hold on a minute…
Different Think is a weekly column, published every Tuesday, in which Macworld writers expose their less mainstream opinions to public scrutiny. We’ve
defended the notch, argued that
Tim Cook is a better CEO than Steve Jobs, and called Apple TV+ a
disaster movie without a happy ending. See you next week!