Applied Ergonomics

ISSN: 0003-6870

Datasets associated with articles published in Applied Ergonomics

14 results
  • The centrality of workers for sustainability based on values: exploring ergonomics to introduce new rationalities for decision making process
    This is a list of the papers that form the basis of our paper.
    • Dataset
  • Data for: Effects of cognitive and visual loads on driving performance after take-over request (TOR) in automated driving
    Data for: Effects of cognitive and visual loads on driving performance after take-over request (TOR) in automated driving
    • Dataset
  • Supplementary data for the paper 'Once a driver, always a driver—Manual driving style persists in automated driving takeover'
    As automated vehicles require human drivers to resume control in critical situations, predicting driver takeover behaviour could be beneficial for safe transitions of control. While previous research has explored predicting takeover behaviour in relation to driver state and traits, little work has examined the predictive value of manual driving style. We hypothesised that drivers’ behaviour during manual driving is predictive of their takeover behaviour when resuming control from an automated vehicle. We assessed 38 drivers with varying experience in a high-fidelity driving simulator. After completing manual driving sessions to assess their driving style, participants performed an automated driving task, typically on a subsequent date. Measures of driving style from manual driving sessions, including headway and lane change speed, were found to be predictive of takeover behaviour. The level of driving experience was associated with the behavioural measures, but correlations between measures of manual driving style and takeover behaviour remained after controlling for driver experience. Our findings demonstrate that how drivers reclaim control from their automated vehicle is not an isolated phenomenon but is associated with manual driving behaviour and driving experience. Strategies to improve takeover safety and comfort could be based on driving style measures, for example by the automated vehicle adapting its behaviour to match a driver’s driving style.
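    (A minimal partial-correlation sketch illustrating this type of analysis appears after the results list.)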
    • Dataset
  • Dense 3D pressure discomfort threshold (PDT) map of the human head, face and neck [Dataset]
    This dataset contains a dense 3D pressure discomfort threshold (PDT) map of the human head, face and neck of a mixed population (n=28), mapped on an average head model of all the participants. Both non-normalised and normalised versions of the dataset are available. To aid designers and engineers, this PDT map is also mapped on the human head Statistical Shape Model (SSM) (from Principal Component 1 to PC50, ±3σ) built on the CAESAR 3D Anthropometric Database (USA, Italy and The Netherlands, male and female, 18-65 y, n=4309). Files are available as *.vtk for further analysis, and as *.obj for use as reference models in design engineering, e.g. in CAD.
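    (A minimal sketch for loading and inspecting the *.vtk files appears after the results list.)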
    • Dataset
  • Supplementary data for the paper ‘Stopping by looking: A driver-pedestrian interaction study in a coupled simulator using head-mounted displays with eye-tracking'
    Automated vehicles (AVs) can perform low-level control tasks but are not always capable of proper decision-making. This paper presents a concept of eye-based maneuver control for AV-pedestrian interaction. Previously, it was unknown whether the AV should conduct a stopping maneuver when the driver looks at the pedestrian or looks away from the pedestrian. A two-agent experiment was conducted using two head-mounted displays with integrated eye-tracking. Seventeen pairs of participants (pedestrian and driver) each interacted in a road crossing scenario. The pedestrians’ task was to hold a button when they felt safe to cross the road, and the drivers’ task was to direct their gaze according to instructions. Participants completed three 16-trial blocks: (1) Baseline, in which the AV was pre-programmed to yield or not yield, (2) Look to Yield (LTY) in which the AV yielded when the driver looked at the pedestrian, and (3) Look Away to Yield (LATY) in which the AV yielded when the driver did not look at the pedestrian. The driver’s eye movements in the LTY and LATY conditions were visualized using a virtual light beam. A performance score was computed based on whether the pedestrian held the button when the AV yielded and released the button when the AV did not yield. Furthermore, the pedestrians’ and drivers’ acceptance of the mappings was measured through a questionnaire. The results showed that the LTY and LATY mappings yielded better crossing performance than Baseline. Furthermore, the LTY condition was best accepted by drivers and pedestrians. Eye-tracking analyses indicated that the LTY and LATY mappings attracted the pedestrian’s attention, but pedestrians adequately distributed their attention between the AV and a second vehicle approaching from the other direction. In conclusion, LTY control may be a promising means of AV control at intersections before full automation is technologically feasible.
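    (A minimal sketch of such an agreement-based performance score appears after the results list.)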
    • Dataset
  • Supplementary data for the paper 'What driving style makes pedestrians think a passing vehicle is driving automatically?'
    An important question in the development of automated vehicles (AVs) is which driving style AVs should adopt and how other road users perceive them. The current study aimed to determine which AV behaviours contribute to pedestrians’ judgements as to whether the vehicle is driving manually or automatically, as well as judgements of likeability. We tested five target trajectories of an AV in curves: playback manual driving, two stereotypical automated driving conditions (road centre tendency, lane centre tendency), and two stereotypical manual driving conditions, which slowed down for curves and cut curves. In addition, four braking patterns for approaching a zebra crossing were tested: manual braking, stereotypical automated driving (fixed deceleration), and two variations of stereotypical manual driving (sudden stop, crawling forward). The AV was observed by 24 participants standing on the curb of the road in groups. After each passing of the AV, participants rated whether the car was driven manually or automatically, and the degree to which they liked the AV’s behaviour. Results showed that the playback manual trajectory was considered more manual than the other trajectory conditions. The stereotypical automated ‘road centre tendency’ and ‘lane centre tendency’ trajectories received similar likeability ratings to the playback manual driving. An analysis of written comments showed that curve cutting was a reason to believe the car was driven manually, whereas driving at a constant speed or in the centre was associated with automated driving. The sudden stop was the least likeable way to decelerate, but there was no consensus on whether this behaviour was manual or automated. It is concluded that AVs do not have to drive like a human in order to be liked.
    • Dataset
  • Supplementary data for the paper 'How should external Human-Machine Interfaces behave? Examining the effects of colour, position, message, activation distance, vehicle yielding, and visual distraction among 1,434 participants'
    External human-machine interfaces (eHMIs) may be useful for communicating the intention of an automated vehicle (AV) to a pedestrian, but it is unclear which eHMI design is most effective. In a crowdsourced experiment, we examined the effects of (1) colour (red, green, cyan), (2) position (roof, bumper, windshield), (3) message (WALK, DON’T WALK, WILL STOP, WON’T STOP, light bar), (4) activation distance (35 or 50 m from the pedestrian), and (5) the presence of visual distraction in the environment, on pedestrians' perceived safety of crossing the road in front of yielding and non-yielding AVs. Participants (N = 1434) had to press a key when they felt safe to cross while watching a random 40 out of 276 videos of an approaching AV with eHMI. Results showed that (1) green and cyan eHMIs led to higher perceived safety of crossing than red eHMIs; no significant difference was found between green and cyan, (2) eHMIs on the bumper and roof were more effective than eHMIs on the windshield, (3) for yielding AVs, perceived safety was higher for WALK compared to WILL STOP, followed by the light bar; for non-yielding AVs, a red bar yielded similar results to red text, (4) for yielding AVs, a red bar caused lower perceived safety when activated early compared to late, whereas green/cyan WALK led to higher perceived safety when activated late compared to early, and (5) distraction had no significant effect. We conclude that people adopt an egocentric perspective, that the windshield is an ineffective position, that the often-recommended colour cyan may have to be avoided, and that eHMI activation distance has intricate effects related to onset saliency.
    • Dataset
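
The abstract of 'Once a driver, always a driver' above reports that correlations between manual driving style measures and takeover behaviour remained after controlling for driver experience. The sketch below shows one way such a partial correlation could be computed in Python; the file name and column names (headway, takeover_time, experience_years) are assumptions for illustration, not the authors' actual variables or code.

    import numpy as np
    import pandas as pd
    from scipy import stats

    def partial_corr(x, y, covar):
        # Correlate x and y after regressing the covariate out of both (residual method).
        X = np.column_stack([np.ones_like(covar), covar])
        rx = x - X @ np.linalg.lstsq(X, x, rcond=None)[0]
        ry = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        return stats.pearsonr(rx, ry)

    # Hypothetical file and column names; the dataset's real variable names may differ.
    df = pd.read_csv("takeover_data.csv")
    r, p = partial_corr(df["headway"].to_numpy(float),
                        df["takeover_time"].to_numpy(float),
                        df["experience_years"].to_numpy(float))
    print(f"partial r = {r:.2f}, p = {p:.3f}")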
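
The dense 3D PDT dataset above is distributed as *.vtk files (for analysis) and *.obj files (as reference models for CAD). A minimal sketch for inspecting a *.vtk mesh with the meshio library follows; the file name and point-data field name are assumptions, so check the dataset's documentation for the real names.

    import meshio  # pip install meshio

    # File name is hypothetical; use the actual file from the dataset.
    mesh = meshio.read("average_head_PDT.vtk")
    print(f"{len(mesh.points)} vertices; point-data fields: {list(mesh.point_data)}")

    # Per-vertex pressure discomfort thresholds, if stored as point data
    # (the field name 'PDT' is an assumption).
    pdt = mesh.point_data.get("PDT")
    if pdt is not None:
        print(f"PDT range: {pdt.min():.2f} to {pdt.max():.2f}")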
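
In the driver-pedestrian interaction study above, a performance score was computed from whether the pedestrian held the crossing button when the AV yielded and released it when the AV did not. The sketch below computes such an agreement score over a hypothetical per-trial log; the data layout and column names are assumptions, not the authors' actual scoring procedure.

    import pandas as pd

    # Hypothetical trial log; the dataset's real structure may differ.
    trials = pd.DataFrame({
        "av_yielded":  [True, True, False, False, True, False],
        "button_held": [True, False, False, True, True, False],
    })

    # Score = proportion of trials in which the button state matched the AV's behaviour
    # (held when the AV yielded, released when it did not).
    score = (trials["av_yielded"] == trials["button_held"]).mean()
    print(f"performance score: {score:.2f}")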