Remember how delighted we were when we first discovered that mobile phones can generate different sound effects? Ironically, many of us chose as our favorite an old telephone ringtone. That was back in the mid- and late 1990s, but our infatuation with mobile handsets has never waned.
Fast-forward 20 years. As the International CES 2014 made clear, the automotive industry now sits at the same digital cusp the mobile phone industry crossed two decades earlier.
Carmakers are amazed by what digital technology can do and tantalized by its possibilities, but their approach to the new automotive human-machine interface remains hopelessly tentative and embryonic.
Renault talked at the show of its R-Sound Effect, which can pipe “six different engine sound profiles” into the cabin when the driver tires of the natural exhaust note.
Exhibitors and conventioneers also talked of bringing their favorite Cadillac or Corvette instrument clusters, or even antique car speedometers — digitally — to the newest cars.
This is all possible because car companies can leverage a human-machine interface (HMI) design tool, such as Nvidia's UI Composer, to rapidly develop and experiment with innovative digital instrument clusters. The HMI design tool can render 3D, move light sources, and recreate “complex materials such as carbon fiber, brushed metals, stitched leather, and glass photo-realistically, and in real-time,” according to Nvidia.
That's exactly what we said about our old mobile phones 20 years ago.
With smartphones, it has taken time, some hard thinking, and several iterations in design evolution to arrive at today's UI. But if the goal is a sort of karmic simplicity, we're not there yet.
As for the automotive HMI? In my opinion, not even close.
With all the icons lined up, some in-vehicle screens look too complex and, dare I say, hideous. So far, all that we've learned is that the take-a-smartphone-and-wrap-a-car-around-it approach won't cut it.
Taking a cue from Tesla, the first automaker to install a 17-inch dashboard touchscreen, many car OEMs are eyeing that 17-inch screen space and trying to figure out how to lay out their own center stacks. Some do it with two screens (one for navigation and another for entertainment), and others do it with many more.
With the emergence of advanced driver assistance systems (ADAS), the HMI conundrum gets even more complicated. ADAS promises far more information to display, including images captured by surround-view cameras mounted around the car.
Head-up display (HUD) or windshield projection is another new wrinkle carmakers need to sort out. HUD is a transparent display that presents data without requiring users to look away from the windshield view.
I've also heard that some carmakers are thinking about removing side-view mirrors (for aerodynamic reasons) in favor of images captured from side cameras and displayed in what's now called the in-vehicle cockpit.
Following is a slideshow of several in-vehicle infotainment systems and instrument clusters we spotted at CES. Many screens seem to offer just too much information. While the automotive industry has talked for decades about voice as the ideal nondistracting HMI for drivers, it turns out that some consumers are simply turning off voice-based command-and-control functions out of frustration.
DLP-based interactive display center stack
Call me old-fashioned. There is something about knobs and dials that I love: they’re tactile and intuitive. My personal favorite at the show was Texas Instruments’ DLP-based projected interactive display, prototypes of which were shown at its booth. DLP technology transforms the surface of a center stack into a large-format touchscreen, allowing a center stack designer to use a “no-wire” dial (as shown in the picture). It fits in the same center stack space where Tesla used a 17-inch LCD screen.
This article originally appeared in EBN's sister publication EE Times.