Create Activity: Iterative Design Development

Between workshops, development work refined, combined, and iterated the concept designs to ensure that they could provide a complete solution and that they were suitable and feasible for further development. These activities took many forms, including the development of storyboard prototypes, team reviews, and iteration to remove inconsistencies and feasibility issues and to ensure the concepts could deliver a complete, meaningful HMI interaction across all of the prescribed scenarios. An example storyboard section is shown in Figure 2.17 for the voice autopilot assistant concept in the planning phase of the drive, when the vehicle questions the driver to understand their route priorities.

Evaluate Activity: Testing with Experts and Users - Overview

The developed design concepts were user-tested and further iterated through expert evaluations, paper interface evaluations with participants, and driving simulator studies, the latter carried out both in Cambridge and in a higher-fidelity driving simulator at Southampton. This work is reported in detail in multiple chapters elsewhere.

Create Activity: Design Workshop 2

A second workshop was undertaken to address specific HMI design issues using a design constraints approach. This enabled the insights gained from the ‘Evaluate’ activities to be condensed into a shortlist, which was then used to inspire detailed design solutions.

Input

Thirty-one design constraints were identified from the previous Lo-Fi (referring to the Cambridge-based STISIM seat, steering wheel, pedals, and computer monitor screen set-up) and Hi-Fi (referring to the Southampton JLR vehicle-based) driving simulator studies and meetings (see Table 2.2). These constraints captured general guidance, such as ‘24. Eyes out’ (to facilitate the driver looking out of the vehicle), and more specific guidance, such as ‘11. Cue driver to grab steering wheel’ (an explicit instruction to the driver for takeover actions); the table also records the source of each constraint.

Outputs

The workshop participants focused on developing embodiment solutions from the design constraints (see Figure 2.18). These embodiments came in a wide range of forms, from abstract ideas, such as takeovers coming in different ‘sizes’, to specific implementations for takeover control, such as a steering wheel that is pushed for autonomous mode and pulled for manual mode.

Six emergent themes were identified from the workshop:

1. Driver’s responsibility:

a. Clarity of driver responsibility and risk,

b. Feedback on driver action for various situations,

c. Displays of control and degrees of control, control salience (clear indication who is in control),

d. Ability to ignore takeover request, or speed-up takeover protocol, self-paced takeover.

2. Customisation/personalisation of takeover:

a. Slider, checklist, dial elements to select pre-sets,

b. Based on person (age, readiness, time out of loop, experience), situation (e.g. road), values of manufacturer,

c. Adaptability of the above based on context.

3. Car personification/avatar:

a. Inquiring about readiness for takeover,

b. Going through a checklist of required actions before takeover (e.g. verbally).

4. Wheel interventions:

a. Affordances to grab the wheel, cues to grab the wheel, displays on the wheel, wheel movement (push wheel for autonomous/pull for manual),

b. Paddles.

5. Ambient displays:

a. For inside or outside the car,

b. Head-up display cues for traffic,

c. Augmented reality on the windscreen,

d. To guide visual search.

6. Input and output modalities:

a. Voice,

b. User Interface (UI) elements to initiate, check completion of, and move between takeover steps (e.g. next/previous buttons, dial, trackpad, toggle switch),

c. Communicating essential information and environmental conditions before and during takeover (e.g. part of trip, time of day, road conditions, next junction, hazards, type of road).

FIGURE 2.18 Exemplar design solution embodiments relating to specific design constraints.

TABLE 2.2
Design Constraints Used as Input to Design Workshop 2
(Constraints 1-3 and 7 were identified in both the Lo-Fi and Hi-Fi driving simulator studies; constraints 4-6 and 8-23 in one of the two studies.)

Design Related Constraints

1. Allow driver to take control at any point during takeover; be sure hands are on the wheel/feet on the pedals
2. Personalise takeover based on driver preferences (and situation)
3. Allow option to complete task (even if it means missing the takeover for a junction/exit)
4. Allow sufficient time for takeover (big individual differences)
5. Customise takeover based on duration of being outside of the control loop and frequency of takeover (and context: road, weather, etc.); multimodal HMI
6. Querying situation awareness of driver by ‘vehicle avatar’
7. Make explicit who is in control of vehicle - mode awareness HMI (light up steering wheel)
8. Recommended settings based on customer profiles for customisation
9. Pre-set defaults for takeover
10. Graduated alert to takeover: visual, audio, haptic (escalating)
11. Cue driver to ‘grab’ steering wheel
12. Make ‘takeover button’ easy to access (e.g. put on gear stick)
13. ‘Repeat’ button? ‘OK’ button?
14. Encourage (facilitate) visual checks of the environment and the controls of the vehicle
15. Display the vehicle status and intention
16. Driver’s HMI actions need to be clearly fed back (link to 1 - Volvo hands on wheel to flip both paddles)

Education and Training

17. Education of drivers in rationale and technique
18. Training (video) before being able to use autopilot on roads

General

19. Older drivers do not like to constantly monitor automation for takeover (timer only) - trend only
20. Differences between user preference and rankings of usefulness
21. Characteristics of modality - HMI (blinky tape not noticed in peripheral vision)
22. Synchronise multimodal cues; combined or single modality
23. Longitudinal studies

Additional Constraints Identified at Design Meeting

24. Eyes out
25. Use system to aid manual driving
26. Some level of personalisation and setting of levels
27. Longer vehicle-to-driver takeover in urban environment (compared to motorway)
28. Takeover strategy that guides visual search
29. Feedback to every driver action (process needs adapting to driver and situation)
30. Checklist
31. Option to request specific information of importance to driver (if not in protocol)
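Several of the workshop outputs are essentially specifications for an escalation policy: constraint 10 calls for a graduated visual/audio/haptic alert, and theme 1d for a self-paced takeover that the driver can confirm early. A minimal sketch of such a policy; the stage timings and modality sets are illustrative assumptions, not the Hi:DAVe parameters:

```python
# Illustrative escalation policy for a takeover request.
# Stage thresholds and modality sets are hypothetical examples,
# not the Hi:DAVe project's actual parameters.

ESCALATION_STAGES = [
    # (seconds remaining until takeover deadline, active cue modalities)
    (20.0, {"visual"}),                      # gentle: icon and ambient light
    (10.0, {"visual", "audio"}),             # add chime / spoken prompt
    (5.0,  {"visual", "audio", "haptic"}),   # add seat vibration
]

def active_modalities(time_remaining: float, driver_confirmed: bool) -> set:
    """Return the cue modalities that should currently be active.

    Escalation stops as soon as the driver confirms (self-paced takeover):
    a confirmed driver needs no further alerting.
    """
    if driver_confirmed:
        return set()
    modalities: set = set()
    for threshold, cues in ESCALATION_STAGES:
        if time_remaining <= threshold:
            modalities = cues  # later (more urgent) stages override earlier ones
    return modalities
```

Because later stages simply override earlier ones, adding, removing, or re-timing a stage only means editing the table, which is convenient when the escalation itself is subject to personalisation.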

Create Activity: Final Concepts and Refinement

The final HMI interface, tested in the JLR simulator at the HMI laboratory, attempted to embody the design constraints and to present the emergent themes from the workshop (see Figure 2.19). This interface showed the potential of the approach and provided a testbed for experimentation to determine participant personalisation preferences for the output modes, including the displays and the haptic and audible feedback. In addition, eye-tracking cameras were fitted to provide gaze information to help understand which displays were being employed.

FIGURE 2.19 Overview of the JLR HMI driving simulator mocked up with the Hi:DAVe autonomous HMI as used for participant trials, shown in manual driving mode and with all output modes active. In addition to the visible displays, audible instructions and tones, and seat vibrations were available for personalisation.
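Gaze information of this kind is commonly reduced to dwell time per display by classifying each gaze sample against rectangular areas of interest (AOIs). A minimal sketch of that reduction; the AOI names and rectangles below are invented for illustration and do not reflect the actual simulator geometry:

```python
# Classify gaze samples into areas of interest (AOIs) and total the
# dwell time per display. The AOI rectangles are illustrative
# placeholders, not the Hi:DAVe simulator's actual display layout.

# AOI name -> (x_min, y_min, x_max, y_max) in normalised screen-plane units
AOIS = {
    "instrument_cluster": (0.2, 0.0, 0.5, 0.3),
    "head_up_display":    (0.3, 0.4, 0.7, 0.7),
    "centre_console":     (0.6, 0.0, 0.9, 0.3),
}

def classify(x: float, y: float) -> str:
    """Return the AOI containing the gaze point, or 'eyes_out' if none."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "eyes_out"  # cf. constraint 24: looking outside the vehicle

def dwell_times(samples, sample_period_s=0.02):
    """Sum dwell time (s) per AOI for (x, y) samples at a fixed rate."""
    totals = {}
    for x, y in samples:
        aoi = classify(x, y)
        totals[aoi] = totals.get(aoi, 0.0) + sample_period_s
    return totals
```

Per-AOI dwell totals of this form are what let an analysis say which displays a participant actually employed, and how the balance shifts when displays are switched off during personalisation.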

Details of the studies carried out using the HMI simulator, and their results, are available in other Hi:DAVe papers.

The HMI demonstrated in the Jaguar I-PACE for the on-road study was largely the same embodiment as for the JLR simulator (see Figures 2.20-2.22). A more complete picture of the interactions during handover to the vehicle and takeover by the driver can be seen in Appendix 1, which shows the high level of ‘hand-holding’ and mode salience that the interface provides to the user. This feature was intrinsic to the inclusive nature of the interface: users who were not familiar or confident with the interface would not need to rely on their cognitive skills to work out how the system might be activated or deactivated, nor to recall such knowledge. In addition, the HMI supported the modes of control that all drivers are familiar with for resuming control at any time, such as grabbing the steering wheel to change direction and pressing the brake to slow down. It was recognised that deploying all the display modalities at once represented a significant (and potentially overwhelming) degree of redundancy, so personalisation settings were an experimental priority for both the simulator and the on-road trials. These studies enabled the participants to choose which displays were active and the level of haptic and auditory feedback provided.

FIGURE 2.20 Jaguar I-PACE Hi:DAVe autonomous HMI as used for the on-road trials, shown in manual mode (‘You Are in Control’ orange text, orange ambient lighting, and colour coding on displays) at the moment autonomous driving has become available (‘Automation Available’ white text).

FIGURE 2.21 Jaguar I-PACE Hi:DAVe autonomous HMI instrument cluster display detail, shown just after manual mode has been activated.

FIGURE 2.22 Jaguar I-PACE Hi:DAVe autonomous HMI head-up display detail, shown in manual mode and demonstrating the explicit instructions for the driver to activate autonomous mode when automation is available.
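One way to structure such per-driver settings is as a profile of active displays and feedback levels, with recommended pre-sets and defaults (cf. constraints 8 and 9). The field names, pre-set names, and values below are illustrative assumptions, not the settings used in the trials:

```python
from dataclasses import dataclass, field

@dataclass
class HmiProfile:
    """Per-driver output-mode personalisation. All defaults are
    illustrative, not the values used in the Hi:DAVe trials."""
    active_displays: set = field(
        default_factory=lambda: {"instrument_cluster", "head_up_display"})
    audio_level: int = 2   # 0 = off .. 3 = loud
    haptic_level: int = 1  # 0 = off .. 3 = strong seat vibration

# Recommended pre-sets based on customer profiles (constraint 8);
# the profile names and values are invented examples.
PRESETS = {
    "full_feedback": HmiProfile(
        active_displays={"instrument_cluster", "head_up_display",
                         "centre_console", "ambient_lighting"},
        audio_level=3, haptic_level=3),
    "minimal": HmiProfile(
        active_displays={"head_up_display"}, audio_level=1, haptic_level=0),
}

def choose_preset(name: str) -> HmiProfile:
    """Return a named pre-set, or the default profile (constraint 9)."""
    return PRESETS.get(name, HmiProfile())
```

A structure like this keeps the experimental manipulation simple: switching a participant between conditions is a matter of swapping profiles rather than reconfiguring each output mode individually.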

 