Human Machine Interfaces with NAIAS 2017 Debrief


Agenda, Attendee List, & Presentation files now available to Autotech Council members in the library.

These are crazy days for automotive HMI. In-car features are multiplying fast, and display and input technologies are changing rapidly. While automation is taking some of the cognitive load off drivers, connected apps and cloud services are driving it back up. Drivers today need the appropriate blend of vehicle information, entertainment, and information about the outside environment. The connection between car and driver is becoming tighter than ever in both input and output directions: cars now watch you, sense your body through the seat, and detect your health and level of alertness, then respond with haptic, audio, and visual feedback. The only senses left out are taste and smell – and we've taken a few road trips where even those come into play!

Some issues at this juncture:

  • Driver engagement in semi-autonomous mode
  • Automation handover
  • Distraction and cognitive load
  • Multi-modal interfaces (voice, display, haptics)
  • Gestures
  • Varying cognitive loads (e.g., stoplight vs. traffic)
  • Are more/bigger screens really the answer?
  • Information architecture
  • Design that delights the customer
  • Ease of use and learnability
  • HMI updates and re-learnability
  • Connected vehicles, V2I

The Autotech Council's meeting on HMI and UX looks at the latest developments in the blurring space between man and machine, including both drivers and passengers. We'll focus on UI, usability, features, safety, automation, and infotainment. Join us for this very interesting topic.

For Members Only: After lunch, the Autotech Council will debrief the 2017 NAIAS among members. These debrief sessions mix news with opinions for those members who did not participate in either show, and we encourage our participants to actively contribute their insights on each highlighted topic.


  • Date: 2/10/2017 08:30 AM
  • Location: Nvidia, Santa Clara (Map)
