Now that a week has passed, I think we can evaluate the implications of what Apple showed at last week’s event. To me the neglected topic has been the trackpads. Why are they so big? I don’t think Apple gave a good reason at the event, and no one else has really explained them well either. Look at them: they’re the size of a large iPhone. I think there’s something to that.
Microsoft’s strategy of adding touch to traditional operating systems via vertical touch screens seems like a mistake. Outside of scrolling windows, few people seem to use the functionality, and that’s because even scrolling windows is easier with a trackpad. I suspect the only reason Windows users touch the screen at all is that so many PC trackpads are inexplicably bad. Most of what I see Windows users doing with touch can simply be done better with a trackpad. And I say that as someone who doesn’t particularly like trackpads.
The Touch Bar seems to offer some great functionality. First, it gives visual feedback about what function keys do. One reason regular people don’t use function keys is that ‘F5’ really doesn’t indicate what it does; if users have to memorize functions with that kind of poor discoverability, you’ve already lost. The Touch Bar solves that, but it also provides sliders and other tools that work on a touch screen without the discomfort of holding your arm out straight to a vertical display. Those poor ergonomics are a big reason people don’t use iPads standing up as much. (As an aside, it’s also why iPads used standing up really deserve much better keyboard functionality.)
I think the reason Apple has made the trackpads so large is for what comes next. Instead of applications treating just the Touch Bar as a programmable view, imagine being able to treat the trackpad the same way. Now imagine the trackpads being as high resolution as the iPad Pro and supporting the Apple Pencil. The only UI issue would be figuring out how to switch between trackpad mode and this interactive mode.
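For the Touch Bar, “programmable view” is already concrete: AppKit lets any responder supply its own bar of controls, including the kind of sliders mentioned earlier. A minimal Swift sketch of that API (the identifier string and slider label are illustrative, not from any real app):

```swift
import AppKit

extension NSTouchBarItem.Identifier {
    // Hypothetical identifier for this example.
    static let zoomSlider = NSTouchBarItem.Identifier("com.example.zoomSlider")
}

class ViewController: NSViewController, NSTouchBarDelegate {
    // Any NSResponder can vend a Touch Bar; AppKit asks for it lazily.
    override func makeTouchBar() -> NSTouchBar? {
        let bar = NSTouchBar()
        bar.delegate = self
        bar.defaultItemIdentifiers = [.zoomSlider]
        return bar
    }

    // Called once per identifier to build the actual control.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == .zoomSlider else { return nil }
        let item = NSSliderTouchBarItem(identifier: identifier)
        item.label = "Zoom" // illustrative label
        return item
    }
}
```

The speculation above is essentially that a future SDK could let apps vend views to the trackpad surface through the same delegate pattern.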
The implications go further. The MacBook Pro already has an ARM chip controlling the keyboard; exactly how much of the Touch Bar it drives isn’t quite clear yet. (I’ve read some reports that the Mac’s GPU does most of the rendering and the interaction runs on the Intel CPU, but I’ve also read that it’s mostly handled by the ARM chip.) Imagine, though, that Apple puts a more robust ARM chip in the keyboard to handle a new graphically interactive trackpad. Instead of running watchOS, it runs iOS networked with macOS. Effectively you’d have a limited iPhone in the keyboard.
Now imagine something akin to the Apple TV Remote app, but for macOS. Suddenly your iPad has a Touch Bar on top and a Bluetooth-controlled trackpad on the bottom. An iPhone becomes either a Touch Bar or a trackpad.
The implications are really quite remarkable.
I’m not saying this is what Apple plans; in some ways Apple is pretty inscrutable. However, what I outline really isn’t that much harder than a Bluetooth Apple keyboard with a Touch Bar plus a Magic Trackpad. Apple already has to figure out how to get the Touch Bar working with an iMac, and it’ll probably be over Bluetooth. (Which is why I think the ARM chip is so important.) When the new iMacs get released next year, they may be more exciting than we think.