Rumors that Apple is working on some sort of television set never seem to die down. Speculation on this subject has abounded for years—including the occasional article here at Macworld—fueled most recently by Walter Isaacson’s biography of Steve Jobs and noncommittally confirmed by his successor in an interview with NBC’s Brian Williams.
There is, of course, no knowing if an Apple-branded TV set is really coming. The folks from Cupertino have shown that they are nothing if not unpredictable, forcing pundits left and right to speculate and then harriedly explain how the company’s moves really make sense with the benefit of hindsight.
Speculation aside, however, sitting on the couch to watch a show reminds me constantly of how much room for improvement there is in the way we currently consume television, starting with the way devices talk to each other.
The digital hub
My living room is, in many ways, typical of most North American households: We own a couch, a TV set (in my case, a 55-inch 2009 Sony Bravia LED flatscreen), and a Dolby Digital 5.1 surround system.
A number of devices are attached to the TV: a PlayStation 3, an Xbox 360, a Nintendo Wii, and an Apple TV. Until not long ago, a Western Digital TV Live completed the setup to play our ripped DVD collection, although we have recently replaced it with InMethod’s Air Video for iPad, playing through the Apple TV over AirPlay.
All these electronics come with myriad accessories: controllers for the consoles (and their chargers), a Microsoft Kinect, an infrared remote extension for the PlayStation, and so on.
The result is that the back of my TV stand has turned into a jumble of cables of epic proportions—one that I am, frankly, afraid to step into. I don’t mean this in a metaphorical sense, as if it were the lair of some magical monster; I mean it quite literally: I know that every time I reach around to plug something in, I inevitably end up unplugging something else, and then have to spend the next half hour figuring out whence that orphaned cable came.
This is not the living room of the future that I was promised twenty years ago. All the talk of digital hubs and wireless technology has died down into the snake pit that lives behind my TV set, fueled by an absurd surplus of cables that are both redundant and costly: Each device needs at least a power cable, plus an HDMI cable that connects to the back of the TV—and that’s a best-case scenario, because older electronics may need up to five different cables to carry component video and optical audio.
By contrast, the cables that connect my computers to their peripherals have become easier to manage over time. Manufacturers have worked hard to reduce clutter and introduce interfaces that make it easy for multiple devices to communicate with each other. As far as I’m concerned, Apple has a clear winner here with the Thunderbolt interface, which has more than enough bandwidth to carry high-quality audio, video, and power at the same time, drastically reducing the cabling requirements of your entire A/V setup.
Of course, simply introducing a TV set with a Thunderbolt port won’t make every other device magically compatible with it; however, its advantages could fuel consumer demand for broader support, leaving electronics manufacturers with little choice but to follow suit.
Besides, Thunderbolt is not a proprietary Apple interface: It was developed by Intel, a brand with which the electronics industry is already intimately acquainted because of the HDCP digital rights management technology that pervades today’s high-definition content. Through a well-managed licensing program, Intel and Apple could allow others to adopt Thunderbolt as a standard without feeling that their products would be subject to the latter’s whims.
One of Thunderbolt’s primary advantages is the fact that it allows up to six devices to be daisy-chained to each other. In part, this means that you are much less likely to run out of ports on your TV set the way you are today: A TV set with two ports could service up to twelve peripherals—plenty to satisfy even an unusually complex setup like mine. It also means, however, that devices can all communicate with each other, and that opens up a whole new way of interacting with your A/V gear.
The five remotes that I would normally need to operate all my electronics contain a grand total of 217 buttons. I know, because I counted them specifically for this piece. If my son wants to play with the Xbox, for example, he needs to first turn on the TV, tune it to a particular input, then turn on the audio system, tune that to a particular input, and, finally, turn on the console.
To simplify this NASA-worthy sequence of events, I purchased a universal remote—in my case a Logitech Harmony 900—and programmed it so that a single button would take care of everything. Still, it’s an additional $230 purchase just so that turning on my TV doesn’t take longer than watching a show.
This entire user-experience paradigm, proof that dinosaurs once roamed the Earth, is a throwback to a time when “watching TV” actually meant dealing with the set itself to tune in to the proper channel. These days, however, we treat the tube more like a monitor, and the interaction happens with the devices that we attach to it, making the process considerably more complicated.
So far, Apple’s response to this problem has been to come up with a remote that has only a handful of buttons—hardly a game changer, particularly when you consider that the Apple TV is not much easier to use than any of its competitors.
I can’t speak for everyone, of course, but this feels like the wrong approach to me. I expect the electronics around me to understand what I want to do without forcing me to learn how to use them, and I would much rather walk into the living room and say something like “I’d like to use the Playstation 3,” or “Play the next episode of Scrubs” than have to fumble with five remotes or invest in a new device just to turn everything on in the right order.
This is where I would love to see a service like Apple’s Siri emerge to replace the aging remote—perhaps not completely, because nobody wants to have a shouting match with the TV set at two in the morning, but certainly for most everyday uses. It would turn the complex hierarchy of on-screen menus we have to deal with today into a simple command-based interface that anybody could grasp. (Plus, as our generation gets older, a smarter TV set means that our children won’t be able to make fun of us because we can’t make the blinking zeros disappear from our VCRs.)
Identity crisis in the living room
A voice-based user interface, perhaps coupled with some sort of video recognition, could also dramatically improve the way a family like mine chooses which content to watch.
Most on-demand services have invested heavily in recommendation engines that tailor their offerings to the preferences of individual users, but these engines are often hindered by the primitive user interfaces we have to deal with, which cannot easily distinguish between the various members of a family.
Take, for example, Netflix—a company that blazed the trail of on-demand content. Every member of my family uses the same account, leaving its recommendation engine with the impossible task of reconciling the tastes of someone who, as far as it can tell, likes military movies, Japanese anime, My Little Pony cartoons, and the Transformers franchise.
It’s unlikely that any algorithm will be able to come up with suggestions that match these tastes—precisely because they do not reflect any one person’s preferences.
A Siri-like service, however, could be trained to distinguish each member of my family by their voice (or possibly by their appearance—something that Microsoft’s Kinect is already capable of doing), discreetly building a separate profile for each one and providing a user experience that better matches the way each of us uses TV.
My TV, my way
When you put all these possibilities together, they paint in my mind a compelling picture in favor of an Apple-branded TV set that goes well beyond the hardware. A shiny new Jony Ive-designed TV would undoubtedly look good in my living room, but its real selling point may well be Apple’s ability to cut through old paradigms and change the way I do things in ways I could never imagine—or that I could imagine but that nobody has yet been able to deliver.
There are many hurdles that need jumping, not least of which is the need to work alongside an industry in which competition is fierce and collaboration often has to be mandated by law. That might explain why it’s taking so long for the company to get a product to market. If and when Apple does release a television set, however, there is a good chance that the way we consume TV will change—and for the better.