The Future of the Car Is Right in Front of Your Face
If you’re a car enthusiast, new automotive features like direct injection, magneto-rheological suspensions, and carbon-ceramic brakes are things worth talking about. If you’re not a car enthusiast, you’re more likely to be interested in how easily you can program the GPS, pair your phone, or simply change the radio station.
For carmakers, then, success relies a lot on their ability to make the dashboard appealing and intuitive. We’ve been in an awkward phase when it comes to a car’s interior: We can do more things than ever from behind the wheel, but there’s little consensus on how to manage all those features in a way that’s easy, convenient, and, above all, safe.
The old solution—add more buttons—stopped working when the number of buttons required exceeded the available real estate on the dash. At that point, in the late 1990s, dashboard design basically split into two camps: multifunction controllers and touchscreens. The Germans (Audi, BMW, and Mercedes-Benz) all went multifunction, installing a knob near the gearshift that can be twisted, tilted, and pushed to navigate menus and screens on a display mounted high up on the dash. Other automakers, including Toyota Motor, Ford, and Volkswagen, went with touchscreens.
Both technologies have their drawbacks. Multifunction controllers and displays keep the visual information near the base of the windshield, closer to a driver’s line of sight, but many people have found the required spinning and tilting to be a maddening exercise in complexity. Touchscreens are simpler—see the thing you want, touch the thing you want—but to do that, drivers have to look at the display to make a choice, which means not looking at the road.
To solve these problems, companies are looking to new technologies like gesture controls. Gesture control is tricky: The car has to recognize a motion and know it’s intended as a command, not just someone gesticulating in conversation. But it does away with the need for additional hardware, as well as any required visual confirmation.
The more established cousin of gesture control is voice recognition, which automakers have long promoted as a great way to interact with your car. The reality has been far more frustrating, and they’re still working on it: The challenge is for software to effectively parse natural-language requests (“Play Luther Vandross”) as opposed to prescribed chains of commands of dubious reliability (“Music. Select artist. Luther Vandross. No, not Miley Cyrus—Luther Vandross. No. Go back …”).
All of this dashboard innovation is not happening in a vacuum, of course. The other big tech push right now is toward more self-driving cars. The two are not unrelated: The more cars take over the actual driving, the more dashboards can be freed up for all the other things we’d rather be doing.