- Human-factors researchers are studying how to simplify information processing and reduce cognitive load in interface design. Both developers and drivers stand to benefit. (Adobe Stock_Drazen)
- The Smart Eye Pro eye-tracking system supports the development of adaptive HMI systems that adjust dynamically to the driver's attention. (iMotions)
- iMotions integrates emotion analytics into the HMI design process to understand drivers' emotional and cognitive states in real time. (iMotions)
- Combining emotion analytics and eye-tracking with advanced simulator systems lets designers align interfaces more closely with the people who use them. (iMotions)
- Nam Nguyen, technical partnership manager and senior neuroscience product specialist at iMotions. (iMotions)
iMotions uses neuroscience and AI-driven analytics to enhance the tracking, assessment and design of in-vehicle HMI systems.
The advancement of vehicles with enhanced safety and infotainment features has made evaluating human-machine interfaces (HMI) in modern commercial and industrial vehicles crucial. Drivers face a steep learning curve due to the complexity of these new technologies. Additionally, interaction with advanced driver-assistance systems (ADAS) raises concerns about cognitive impact and driver distraction in both passenger and commercial vehicles.
As vehicles incorporate more automation, many clients are turning to biosensor technology to monitor drivers’ attention and the effects of various systems and interfaces. Utilizing neuroscientific principles and AI, data from eye-tracking, facial expressions and heart rate are informing more effective system and interface design strategies. This approach ensures that automation advancements improve rather than hinder the driving experience.
The integration of HMI systems in vehicle design is evolving rapidly, focusing on enhancing user-vehicle interactions. A deep understanding of human-factors engineering is essential for creating safe, efficient and user-friendly driving experiences. Companies like iMotions and SmartEye are using behavioral research and eye-tracking to pioneer new HMI design principles.
The role of neuroscience in HMI design
Human factors are critical in designing HMIs for transportation systems, with designs needing to accommodate the abilities and limitations of human drivers. This involves ergonomic design, cognitive psychology and user experience (UX) research to develop interfaces that are user-friendly, safe and intuitive.
The aim is to create systems that align with human behavior and cognitive processes, reducing errors and improving usability. Advances in neuroscience have greatly enhanced the tracking, assessment and design of HMIs, moving beyond traditional interview-based methods to more sophisticated, neuroscience-based techniques.
Modern approaches include using camera-based eye trackers like SmartEye Pro, AI-powered facial expression analysis tools like Affdex, and methods like electrodermal response and electrocardiography (ECG), once limited to research labs. These tools are now applied to develop commercial HMI applications, showcasing their relevance and utility in improving system design.
Vehicle HMI design must manage the driver’s cognitive load, the mental effort required in working memory. Human-factors experts aim to design interfaces that simplify information processing, reducing cognitive load to prevent confusion and potential hazards. This involves organizing information logically, minimizing complexity, and using visual and interactive hierarchies to highlight essential functions.
For example, non-essential controls like A/C, music and cruise control are placed on the steering wheel, allowing drivers to use them without looking away from the road. New features added to the dashboard are strategically positioned for safety and ease of use. iMotions software helps measure how long drivers divert their gaze from the road to interact with these tools.
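As a rough illustration of that kind of measurement, the sketch below sums off-road glance time from timestamped gaze samples that have already been labeled with an area of interest (AOI). The sample format, AOI names and 10 Hz example data are assumptions for illustration, not the iMotions data schema or API.

```python
# Minimal sketch: summing off-road glance time from timestamped gaze samples.
# The sample format and AOI labels ("road", "infotainment", ...) are assumed
# for illustration and are not the iMotions data schema.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    timestamp_s: float   # sample time in seconds
    aoi: str             # area of interest the gaze falls on, e.g. "road"

def off_road_glances(samples: List[GazeSample],
                     on_road_aoi: str = "road") -> Tuple[float, float]:
    """Return (total_off_road_s, longest_single_glance_s)."""
    total = 0.0
    longest = 0.0
    current = 0.0
    for prev, curr in zip(samples, samples[1:]):
        dt = curr.timestamp_s - prev.timestamp_s
        if prev.aoi != on_road_aoi:
            current += dt
            total += dt
            longest = max(longest, current)
        else:
            current = 0.0
    return total, longest

# Example: a short sequence sampled at 10 Hz while the driver adjusts the radio.
samples = [GazeSample(t / 10, "road" if t < 5 or t > 12 else "infotainment")
           for t in range(20)]
total, longest = off_road_glances(samples)
print(round(total, 2), round(longest, 2))   # 0.8 0.8 for this sequence
```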
Optimizing user-centric design
Ergonomics play a critical role in HMI design, focusing on the physical interaction between users and vehicle interiors. This includes the strategic placement of controls and displays, ensuring they are easy to operate and provide adequate feedback. A well-executed ergonomic design enhances user comfort and efficiency, reducing the likelihood of strain, errors and accidents. Designing for diverse users involves complex assessments traditionally conducted through iterative testing or focus groups.
The core of this approach lies in user-centered design, which incorporates user feedback from the initial stages to create intuitive and satisfying interfaces. This method helps overcome traditional design challenges like biased or inadequate feedback by using biosensors that record detailed reactions in real time. By capturing these instantaneous responses, developers can discern critical factors that differentiate effective from ineffective designs, leading to better, more user-focused systems.
Human factors in HMI design go beyond just cognitive and physical aspects. They also consider the emotional and psychological impact of interfaces on users. This involves understanding how design elements can affect mood and stress levels and designing interfaces that create positive emotional connections with the user. For instance, the use of color, shape and texture can influence a user's perception and emotional response to a system. Tools such as driver facial-expression monitoring, ECG and muscle-tension measurement can provide valuable insight into how drivers and passengers react to the cabin environment, which can then be improved once the individual elements of form and function are identified.
Safety is paramount in HMI design, where human-factors expertise is crucial for reducing errors and accidents. This requires designing interfaces that are straightforward, predictable and forgiving of user errors. Accessibility is also essential, ensuring interfaces are usable by people of various abilities, including those with disabilities.
Additionally, tools used for measuring user responses are now employed in developing driver monitoring systems. These systems detect states like drowsiness or distraction, helping to mitigate risky driving behaviors and enhance safety. Truck and car companies are increasingly utilizing these technologies to incorporate human-in-the-loop systems that improve overall driving safety.
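One widely cited drowsiness measure in driver-monitoring research is PERCLOS, the proportion of time within a rolling window that the eyes are mostly closed. The sketch below illustrates the idea on a synthetic eyelid-openness stream; the thresholds are illustrative defaults, not values drawn from the article or any particular product.

```python
# Illustrative sketch of a PERCLOS-style drowsiness indicator: the fraction of
# time within a rolling window that the eyes are mostly closed. The 0.2
# openness cut-off and 0.15 alert threshold are common illustrative values,
# not figures taken from the article or any specific product.
from collections import deque

def perclos(openness_stream, window_size=1800, closed_below=0.2, alert_at=0.15):
    """Yield (perclos_value, drowsy_flag) for each new eyelid-openness sample.

    openness_stream: iterable of eyelid openness in [0, 1] (1 = fully open),
    e.g. 60 s of data at 30 Hz when window_size=1800.
    """
    window = deque(maxlen=window_size)
    for openness in openness_stream:
        window.append(1.0 if openness < closed_below else 0.0)
        value = sum(window) / len(window)
        yield value, value > alert_at

# Example: mostly-open eyes with a long slow eye closure near the end.
stream = [1.0] * 1500 + [0.05] * 300
for value, drowsy in perclos(stream):
    pass
print(round(value, 3), drowsy)  # 0.167 True for this synthetic stream
```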
Emotion analytics and advanced eye-tracking
iMotions specializes in integrating emotion analytics into the HMI design process. The approach aims to understand the emotional and cognitive states of drivers in real time, using advanced sensor technologies to capture data on eye movement, facial expressions and physiological responses. This data-driven approach helps designers recognize how drivers interact with various HMI elements, identifying areas of cognitive overload, distraction or stress.
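A minimal way to picture that kind of multi-signal analysis is to flag analysis windows in which several indicators are elevated at once. The channel names, z-scored values and thresholds below are invented for illustration and are not how iMotions scores cognitive overload.

```python
# Minimal sketch: flagging candidate "overload" windows where several assumed
# indicators (pupil dilation, heart rate, negative facial valence) are elevated
# at the same time. Channel names and thresholds are illustrative only.
from typing import Dict, List

def overload_windows(channels: Dict[str, List[float]],
                     thresholds: Dict[str, float],
                     min_votes: int = 2) -> List[int]:
    """Return indices of analysis windows where at least `min_votes`
    channels exceed their (illustrative) thresholds."""
    n = min(len(series) for series in channels.values())
    flagged = []
    for i in range(n):
        votes = sum(1 for name, series in channels.items()
                    if series[i] > thresholds[name])
        if votes >= min_votes:
            flagged.append(i)
    return flagged

# Synthetic per-window summaries while a driver navigates a menu hierarchy:
channels = {
    "pupil_dilation_z":  [0.1, 0.4, 1.3, 1.6, 0.5],
    "heart_rate_z":      [0.0, 0.2, 1.1, 1.4, 0.3],
    "negative_valence":  [0.1, 0.1, 0.6, 0.7, 0.2],
}
thresholds = {"pupil_dilation_z": 1.0, "heart_rate_z": 1.0, "negative_valence": 0.5}
print(overload_windows(channels, thresholds))  # [2, 3]: the deep menu steps
```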
By utilizing such technology, vehicle designers can make informed decisions about the layout, complexity and functionality of HMI systems. For instance, insights into gaze patterns can inform the optimal placement of critical information on dashboards, ensuring that drivers can access the information they need without diverting attention from the road. Similarly, monitoring physiological responses during interactions with infotainment systems can help in designing interfaces that minimize cognitive strain.
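For example, candidate dashboard layouts might be compared by the share of total gaze time each area of interest attracts, along the lines of the sketch below. The AOI names and dwell figures are invented for illustration; a real study would use recorded gaze data.

```python
# Minimal sketch: comparing two candidate dashboard layouts by the share of
# gaze time each area of interest receives. AOI names and dwell figures are
# invented for illustration.
def dwell_share(dwell_seconds: dict) -> dict:
    """Convert per-AOI dwell time (seconds) into a share of total gaze time."""
    total = sum(dwell_seconds.values())
    return {aoi: round(t / total, 3) for aoi, t in dwell_seconds.items()}

layout_a = {"road": 52.0, "cluster": 5.5, "center_screen": 9.0, "mirrors": 3.5}
layout_b = {"road": 57.0, "cluster": 6.0, "center_screen": 4.0, "mirrors": 3.0}

print(dwell_share(layout_a))  # center screen draws roughly 13% of gaze time
print(dwell_share(layout_b))  # moving key readouts to the cluster keeps eyes on the road
```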
Integrating iMotions’ advanced emotion analytics and biometric sensing within simulators provides insights into drivers’ cognitive loads, emotional states and physiological responses. iMotions Software employs various biometric measurements, including eye movements, facial expressions, heart rate variability and EEG (electroencephalogram) to study how drivers respond to different HMI elements during simulations. This real-time data enriches understanding of driver behavior, helping designers pinpoint which aspects of HMI design enhance intuitive use and which could cause confusion.
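To make the heart-rate-variability piece concrete, one common time-domain index is RMSSD, the root mean square of successive differences between inter-beat (R-R) intervals. The sketch below computes it over synthetic interval data; it is not the iMotions implementation.

```python
# Minimal sketch of one common heart-rate-variability index, RMSSD: the root
# mean square of successive differences between R-R (inter-beat) intervals.
# The interval values below are synthetic.
import math
from typing import Sequence

def rmssd_ms(rr_intervals_ms: Sequence[float]) -> float:
    """Time-domain HRV: RMSSD over a list of R-R intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Relaxed driving vs. a demanding infotainment task (synthetic example values):
baseline = [812, 845, 828, 860, 835, 852, 829]
task     = [801, 806, 799, 804, 802, 807, 800]
print(round(rmssd_ms(baseline), 1))  # larger RMSSD, more beat-to-beat variability
print(round(rmssd_ms(task), 1))      # smaller RMSSD often accompanies higher load
```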
The collaboration with Dutch simulator manufacturer Cruden enhances this process, enabling swift evaluations and adjustments. This method is actively used at the University of Michigan-Dearborn's Driving Simulator Lab, for example, to align interfaces more closely with human capabilities.
Advanced eye-tracking technology provided by the multi-camera Smart Eye Pro system allows HMI designers and researchers to accurately monitor where and for how long a driver looks at different areas of the vehicle interior. It also reveals how a driver interacts with the cabin in real time via 3D wireframe modeling without any distracting influence from wearables. This technology is beneficial in the development of adaptive HMI systems that can dynamically adjust based on the driver’s focus.
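Conceptually, mapping gaze onto cabin geometry comes down to intersecting a 3D gaze ray with modeled surfaces. The sketch below does this for a single rectangular region standing in for a center screen; the cabin coordinates are invented, and a real system tests against a full wireframe of the interior rather than one plane.

```python
# Simplified sketch of mapping a 3D gaze ray onto a rectangular cabin region
# (e.g. a center-screen plane). Real systems work against a full 3D wireframe
# of the cabin; the coordinates here are invented for illustration.
import numpy as np

def gaze_hits_rect(origin, direction, rect_corner, rect_u, rect_v):
    """Return True if the gaze ray intersects the rectangle spanned by
    rect_corner + s*rect_u + t*rect_v with 0 <= s, t <= 1
    (rect_u and rect_v are assumed orthogonal)."""
    normal = np.cross(rect_u, rect_v)
    denom = np.dot(normal, direction)
    if abs(denom) < 1e-9:                      # gaze parallel to the plane
        return False
    d = np.dot(normal, rect_corner - origin) / denom
    if d <= 0:                                 # intersection behind the eyes
        return False
    hit = origin + d * direction - rect_corner
    s = np.dot(hit, rect_u) / np.dot(rect_u, rect_u)
    t = np.dot(hit, rect_v) / np.dot(rect_v, rect_v)
    return 0.0 <= s <= 1.0 and 0.0 <= t <= 1.0

# Eye position and gaze direction in cabin coordinates (meters, invented):
eye = np.array([0.0, 0.6, 1.2])
gaze = np.array([0.3, -0.05, -1.0])            # glancing down-right toward the console
screen_corner = np.array([0.2, 0.5, 0.3])      # lower-left corner of the center screen
screen_u = np.array([0.25, 0.0, 0.0])          # screen width vector
screen_v = np.array([0.0, 0.18, 0.05])         # screen height vector (slightly tilted)
print(gaze_hits_rect(eye, gaze, screen_corner, screen_u, screen_v))  # True: hits the screen AOI
```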
By combining the strengths of both emotion analytics and eye-tracking technology, and leveraging advanced simulator systems from Cruden, designers can gain a better understanding of the driver’s physical and emotional state. The potential synergy between these technologies holds great promise for the future of HMI design in vehicles.
Nam Nguyen, technical partnership manager and senior neuroscience product specialist for iMotions, contributed this article for SAE Media.