Science fiction visions, especially as presented in movies and television shows, have infected our approach to UX and product design. They have nudged us to mistake visualizations that were created for maximum dramatic appeal for solutions that are feasible and desirable, especially over the long term. What looks cool in a science fiction film is frequently frustrating, distracting, and convoluted to use.
I recently got a chance to preview Amazon’s “Home of the Future.” It was the oldest thing I’d seen in years. You can see why in this photo I took of the interior: Sterile and impersonal, it looked less like a home than a showroom, harking back to retro-future themes from Monsanto’s vision of “the kitchen of the future” from the 1950s. Amazon innovates in many different ways, but this home takes voice control to an unnecessary new level. For instance, you can ask Alexa to make some popcorn for you, but you have to include both the brand name and the weight of the popcorn in your request.
Amazon’s “Home of the Future” subscribes to an outdated idea about what convenience, coziness, and usability really feel like. Consider several technologies we first saw in science fiction that have evolved into whole industries:
Since its debut in 2002, Steven Spielberg’s Minority Report has been an archetypal reference point for technologists enthralled by the way Tom Cruise could manipulate complex data with dramatic swipes of his hands.
But what looks amazing for a few minutes of screentime doesn’t necessarily perform as well offscreen. Can you imagine how sore your muscles would be after using this system for eight hours a day?
I can: On a visit to the MIT Media Lab in 2008, I tried out a gesture control system just like the one from Minority Report. At first, I loved how it looked and felt, but I quickly noticed my arms aching. Because I was gesturing with my hands above my heart, less blood flowed into my arms, causing strain. I couldn’t imagine using a gesture control setup like that for more than an hour. Picking up and dragging files is extremely tedious. Tom Cruise even shows us how frustrating the interaction is in the actual film: There’s a scene where he reaches out to shake the hand of a visitor, and everything on his screen drops! Nevertheless, Minority Report has inspired businesspeople to dream up systems that would require people to use controls like this all day.
What’s more, I realized that not everyone has the fine motor control to learn precise gestures and perform them with consistent accuracy. The system would have to filter out quite a few false positives, which requires far more computing resources than a basic button or multi-touch system does. Anyone familiar with Fitts’s Law, which holds that user response time depends largely on the size of a target and the user’s distance from it, can see the problems inherent in the Minority Report system: It requires large, sweeping motions for even the smallest of interactions, and similar actions aren’t clustered together. I would never want a system like this to play Super Mario Bros. when a simple wired video game controller is precisely responsive 100% of the time.
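Fitts’s Law can be put into numbers. A common version (the Shannon formulation) predicts movement time as MT = a + b × log2(D/W + 1), where D is the distance to the target and W is its width. The sketch below uses illustrative placeholder constants for a and b, not measured values, just to show why a distant mid-air target costs so much more than a nearby button:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predict movement time in seconds using Fitts's Law
    (Shannon formulation): MT = a + b * log2(D/W + 1).
    a and b are device-dependent constants; the defaults here
    are illustrative placeholders, not empirical values."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A large, nearby button, like one on a restaurant point-of-sale tablet:
close_target = fitts_movement_time(distance=5, width=2)

# A small target reached by a sweeping mid-air gesture:
far_target = fitts_movement_time(distance=120, width=4)

# The distant target has a much higher index of difficulty,
# so the predicted movement time is markedly longer.
```

Whatever the exact constants turn out to be for a given device, the logarithmic term guarantees that big distances to small targets are slow, which is exactly the regime a Minority Report–style interface lives in all day.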
By contrast, consider some more feasible, if less glamorous, alternatives: Nonprofit tech lab Dynamicland has created projected interfaces that don’t require gesture recognition. You just use a laser pointer to interact with the surfaces. That way, you can keep your hands at waist level and use less effort.
For that matter, during your next visit to a restaurant, watch the waitstaff type in an order or settle a bill through a touch-based point-of-sale system. These are often ugly but speedy interfaces. The buttons are large, the tablet sits at an angle, and completing a task requires no more than a few taps that quickly become muscle memory.
I once met one of the top designers of “futuristic interfaces” for major Hollywood movies. He told me that any time he tries to design something that’s actually usable, clients shoot him down. They want the fancy panels and transparent glass interfaces people expect from films. The very aesthetic he helped create now keeps him from promoting better user experience in films.
Since at least the first Star Trek series, it’s been a standard trope: In the future, we’ll simply speak out loud to an “intelligent” agent that will instantly understand our vocal patterns the moment we express them.
Cinematically, this technology makes a lot of sense: It’s more interesting and narratively efficient to watch Tony Stark solve technical problems while pacing in his lab and talking to Jarvis, as opposed to watching him sit silently at a desk, typing into a computer.
But that does not mean voice-controlled interfaces are always a feasible real-life solution. Unlike in a movie, in which dozens of crew members tightly control and modify the audio of every scene, we usually live and work in places full of noise that can easily confuse voice activation. The technology also assumes that everyone speaks with standard, precise diction, even though many or perhaps most people do not. But all that aside, the goal of an artificial intelligence capable of understanding common-sense requests remains decades away.
Which takes me back to Amazon’s “Home of the Future” that I recently toured. The creators of the house boasted that all of the appliances were connected to Alexa voice activation. But when I asked Alexa to microwave popcorn, she replied, “What is the weight of popcorn?” Have you ever known exactly how much popcorn you’re popping? I finally looked for the weight of popcorn listed on the bag and told her the amount, but it didn’t pop all the way. To get water from the faucet, I had to ask Alexa to tell Delta — yes, you literally had to address the appliance by its brand name — to “fill the glass with 4 ounces of water.”
To be fair, there was at least one instance where precise measurement sort of worked: activating the shower. While getting ready, you could simply ask Alexa to warm the water to your ideal temperature. But overall, Alexa had mainly succeeded in making everyday conveniences uniquely inconvenient.
It may take years of rewiring to cultivate a design community that’s free from sci-fi’s framing biases. Doing that might require an internal checklist that includes questions like these:
In the same way that a parlor trick makes us wonder how the flashy payoff hides the illusionist’s sleight of hand, we should be suspicious when a product provides a dramatic “ta da!” Google and Craigslist are incredibly boring to look at and haven’t had their user interfaces significantly updated for nearly two decades, and partly for those very reasons, they remain incredibly useful. They’re pass-through interfaces that work based on speed. When we need to use something again and again, it’s less important that it looks flashy or impresses dinner guests. It needs to demand as little attention and effort as possible, just like a light switch.
Sci-fi depictions of futuristic products are typically presented for a few minutes or seconds in an ideal context. For a more accurate representation, consider how they would fare over the long term in different environments, especially when used by many kinds of people (children, people with disabilities, the elderly) throughout an entire day, multiple times a day, for years. If your appliance requires an iPhone app to work, consider the day when iPhones might not be the dominant device. Or consider that when someone divorces or sells their home, they’ll need to hand over the app permissions to a new owner.
Most technologies come with inherent costs, not just the initial purchase price, but the near-inevitable expense of batteries and other replacement parts over the years of a product’s life. Very few technologies prevent monetary loss, and this is a key way emerging tech can truly benefit users. Coming up with a concrete cost estimate can help us better weigh any supposed advantages the product is meant to provide. For instance, consider this idea for a Home of the Future: When rain gutters become clogged, water can’t be diverted away from the house, causing it to freeze and crack the foundation in the winter. A solar-powered gutter sensor that glows orange when it’s filled with leaves would be visible at night — and save thousands of dollars in foundation repair bills.
Finally, consider asking yourself a question that was echoing through my own mind as I wandered Amazon’s Home of the Future: Does this solution enhance your ideal vision of that context?
When I picture a home that’s a model of domestic tranquility, I visualize happy family and friends sharing dinner outside in the sun, surrounded by trees and flowers. They prepare their meal together, perhaps with ingredients picked by the kids from their garden, in a kitchen full of chatter and personal mementos. I suspect most people, when asked, would describe a similar vision. Notably, automation and other high-tech enhancements don’t seem relevant to this pleasing mental image; they might even threaten to become distractions to that happy idyll.
Science fiction is important because it helps us think about how technology might change who we are, and we can use it to envision multiple outcomes, both good and bad. But when it comes to translating cinema to everyday life, it’s usually best to leave fantasy worlds to the screen.