
Our Research

Multisensory integration and aging

The aging brain provides a novel model through which to study the naturally occurring, global reductions in sensory reliability that develop gradually with advancing age (e.g., loss of visual acuity, reduced hearing abilities, reduction in vestibular hair cells). Further, there may be changes to central multisensory integrative processes with age that could provide interesting insights into how multisensory integration changes across the lifespan. Our recent research has shown that, in the context of self-motion perception, older adults combine and integrate sensory inputs differently than younger adults do. Specifically, relative to younger adults, older adults demonstrate greater performance benefits under congruent multisensory conditions compared to unisensory conditions. However, they also demonstrate a greater tendency to integrate extraneous and/or largely incongruent sensory inputs, leading to non-optimal sensory integration under some conditions.
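
For readers unfamiliar with what "optimal" integration means here, the standard benchmark is the maximum-likelihood (reliability-weighted) cue-combination model. The sketch below is a minimal illustration of that textbook model, assuming Gaussian noise on each cue; the variable names and example thresholds are hypothetical and this is not our analysis code.

```python
import numpy as np

def optimal_combination(sigma_visual, sigma_vestibular):
    """Reliability-weighted (maximum-likelihood) cue combination.

    Weights are proportional to each cue's reliability (1 / variance), and
    the combined estimate is predicted to be less variable than either cue alone.
    """
    r_vis = 1.0 / sigma_visual ** 2       # reliability of the visual cue
    r_ves = 1.0 / sigma_vestibular ** 2   # reliability of the vestibular cue
    w_vis = r_vis / (r_vis + r_ves)       # predicted weight given to vision
    w_ves = 1.0 - w_vis                   # predicted weight given to vestibular input
    sigma_combined = np.sqrt(1.0 / (r_vis + r_ves))
    return w_vis, w_ves, sigma_combined

# Hypothetical single-cue discrimination thresholds (degrees of heading):
w_vis, w_ves, sigma_comb = optimal_combination(sigma_visual=4.0, sigma_vestibular=6.0)
print(f"visual weight = {w_vis:.2f}, vestibular weight = {w_ves:.2f}")
print(f"predicted combined threshold = {sigma_comb:.2f} deg")  # lower than either single cue
```

Multisensory thresholds that exceed this prediction, or empirical weights that depart from the reliability-based weights, are what we describe as non-optimal integration.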


Sensory-cognitive interactions and aging

Our research extends mobility-related work by studying walking and driving mobility under realistic and challenging conditions and by considering the interactions among sensory, motor, and cognitive declines/impairments rather than focusing on each independently. For instance, we have advanced knowledge in applied health topics including: describing the sensory and cognitive factors associated with safe driving performance in older adults; identifying how individuals with age-related hearing loss cope with dual-task demands to support safe walking mobility; understanding how children with cochleovestibular loss stay balanced using multisensory compensatory strategies; and revealing how individuals with stroke integrate sensory-motor information to perceive their position as upright.

Hearing and mobility

Individuals with age-related hearing loss commonly report limitations to their mobility, are more likely to have difficulty walking, and are at significantly greater risk of falling than their normal-hearing peers. The link between hearing loss and these limitations is not well understood. Our research has pursued several lines of empirical investigation to better characterize the link between hearing loss, mobility, and falls during realistic everyday challenges. This research is motivated by at least three non-exclusive hypotheses: 1) hearing loss taxes cognitive resources, thereby limiting the resources available to support safe mobility; 2) hearing loss causes problems with spatial orientation because binaural cues become unreliable, thereby leading to instability; and 3) there are shared age-related pathologies in the auditory and vestibular systems. We conduct our studies using simulators with motion platforms and employ high-precision motion capture systems to measure kinematic responses during different tasks. We hope that a better understanding of the link between hearing loss and mobility during realistic challenges will help identify those at risk of mobility impairments and falls and establish better, more ecologically valid methods of assessment and intervention.
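
As a simple illustration of the kind of kinematic outcome we derive from motion capture, the sketch below computes root-mean-square sway and mean sway velocity from a single marker trajectory. It is a minimal sketch assuming a pre-extracted 2-D marker position; the marker choice, sampling rate, and simulated data are hypothetical, and the actual processing pipelines differ across studies.

```python
import numpy as np

def sway_metrics(marker_xy, fs=100.0):
    """Compute simple postural sway metrics from a 2-D marker trajectory.

    marker_xy : (n_samples, 2) array of mediolateral / anteroposterior
                positions (metres), e.g. a trunk or head marker.
    fs        : sampling rate of the motion capture system (Hz).
    """
    centred = marker_xy - marker_xy.mean(axis=0)          # remove mean position
    rms_sway = np.sqrt((centred ** 2).sum(axis=1).mean())  # RMS distance from mean
    steps = np.diff(marker_xy, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()      # total distance travelled
    duration = len(marker_xy) / fs
    return rms_sway, path_length, path_length / duration   # mean sway velocity

# Example with simulated data (2 minutes of quiet standing sampled at 100 Hz):
rng = np.random.default_rng(0)
trajectory = np.cumsum(rng.normal(0, 1e-4, size=(12000, 2)), axis=0)
rms, path, velocity = sway_metrics(trajectory, fs=100.0)
print(f"RMS sway = {rms * 1000:.1f} mm, sway velocity = {velocity * 1000:.1f} mm/s")
```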


Motion sickness and simulator sickness

Motion sickness is a common phenomenon during travel and is typically characterized by a variety of symptoms such as nausea, fatigue, or general discomfort. Simulator sickness is a sensation very similar to motion sickness that occurs when using virtual applications such as driving or flight simulators, even in the absence of actual motion. Our research focuses on minimizing the occurrence and severity of simulator sickness by investigating non-medical countermeasures that do not interfere with the user's cognitive functions. Our studies have identified promising techniques for reducing simulator sickness, including music, smell, and postural control.


Self-motion perception and vection

We have conducted a comprehensive collection of studies to understand how visual, proprioceptive, and vestibular information are used and integrated for different aspects of self-motion. This has included characterizing multisensory perceptions of velocity, heading direction, distance, and vertical orientation across different populations (e.g., younger adults, older adults, and individuals with stroke, hearing loss, or cognitive decline). We use psychophysical approaches, behavioural and neurophysiological measures, and computational modelling techniques. These studies are unique in their aim to quantify the relative weights of different sources of sensory information during active forms of self-motion (e.g., walking, driving) and during passive forms of self-motion (e.g., during visually induced motion (vection) or passive motion in motion simulators). We also try to understand how these and other processes contribute to the experience of simulator sickness.
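
As one concrete example of how relative cue weights can be quantified psychophysically, the sketch below fits a cumulative-Gaussian psychometric function to heading-discrimination responses by maximum likelihood and converts the resulting single-cue thresholds into predicted cue weights. The function, data, and parameter values are hypothetical; it stands in for, and greatly simplifies, the modelling pipelines used in our studies.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_psychometric(stimulus, n_right, n_total):
    """Fit a cumulative-Gaussian psychometric function by maximum likelihood.

    stimulus : comparison heading offsets (deg) at each stimulus level
    n_right  : number of 'rightward' responses at each level
    n_total  : number of trials at each level
    Returns the point of subjective equality (mu) and the slope (sigma),
    which is taken as the discrimination threshold (JND).
    """
    def neg_log_likelihood(params):
        mu, log_sigma = params
        p = norm.cdf(stimulus, loc=mu, scale=np.exp(log_sigma))
        p = np.clip(p, 1e-6, 1 - 1e-6)                     # avoid log(0)
        return -np.sum(n_right * np.log(p) + (n_total - n_right) * np.log(1 - p))

    result = minimize(neg_log_likelihood, x0=[0.0, np.log(2.0)], method="Nelder-Mead")
    return result.x[0], np.exp(result.x[1])

# Hypothetical single-cue data: the fitted thresholds imply each cue's relative weight.
offsets = np.array([-8, -4, -2, 0, 2, 4, 8], dtype=float)
right_vis = np.array([1, 4, 8, 15, 22, 26, 29])
right_ves = np.array([4, 8, 11, 15, 19, 22, 26])
n = np.full_like(right_vis, 30)

_, sigma_vis = fit_psychometric(offsets, right_vis, n)
_, sigma_ves = fit_psychometric(offsets, right_ves, n)
w_vis = (1 / sigma_vis ** 2) / (1 / sigma_vis ** 2 + 1 / sigma_ves ** 2)
print(f"visual JND = {sigma_vis:.2f} deg, vestibular JND = {sigma_ves:.2f} deg")
print(f"predicted visual weight = {w_vis:.2f}")
```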


Driving and aging

Driving simulators are powerful tools for research and applications concerned with the evaluation and improvement of driving performance and with the design of vehicles and in-vehicle technologies. The Toronto Rehabilitation Institute houses DriverLab, a motion-based driving simulator with unique features. The research objectives of DriverLab will be to develop more sensitive methods for driver assessment and more effective methods for driver training, to mitigate drowsy and distracted driving, to evaluate the effects of prescription and illicit drugs on driving, and to carefully evaluate the interactions between drivers and in-vehicle technologies. For each of these objectives we will be evaluating driving performance across a wide range of populations (individuals of different ages and individuals with sensory, cognitive, or physical impairments) and across a wide range of driving scenarios.
