Self-motion

Motion P3 demonstrates neural nature of motion ERPs

The technical challenges of recording electroencephalographic (EEG) data during motion are considerable, but overcoming them would open the possibility of investigating neural function associated with balance, motor function and motion perception. The challenges …

Bayesian integration of visual and vestibular signals for heading

Self-motion through an environment involves a composite of signals such as visual and vestibular cues. Building upon previous results showing that visual and vestibular signals combine in a statistically optimal fashion, we investigated the relative …
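
As a point of reference for what "statistically optimal" combination means here, the sketch below implements the standard maximum-likelihood cue-integration rule, with each cue weighted inversely to its variance. The heading values and variances are illustrative placeholders, not values from the study.

def optimal_heading_estimate(heading_vis, var_vis, heading_vest, var_vest):
    """Combine visual and vestibular heading estimates (deg) by
    reliability weighting: each weight is proportional to 1/variance."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_vest)
    w_vest = 1.0 - w_vis
    combined = w_vis * heading_vis + w_vest * heading_vest
    # The predicted variance of the combined estimate is lower than
    # that of either cue alone.
    var_combined = (var_vis * var_vest) / (var_vis + var_vest)
    return combined, var_combined

# Hypothetical example: a more reliable vestibular cue pulls the
# combined estimate toward its own heading.
print(optimal_heading_estimate(heading_vis=4.0, var_vis=9.0,
                               heading_vest=0.0, var_vest=3.0))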

Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies

Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific …
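
For concreteness, the following sketch writes out one form of such a weighted linear sum for travelled-distance estimation; the weight and the distance signals are hypothetical placeholders and are not taken from the experiments described here.

def travelled_distance_estimate(dist_visual, dist_body, w_visual=0.5):
    """Weighted linear sum of a visually signalled travelled distance and a
    body-based (vestibular/proprioceptive/efference-copy) distance.
    w_visual is a free parameter between 0 and 1; the value used below is a
    placeholder, not one fitted to data."""
    w_body = 1.0 - w_visual
    return w_visual * dist_visual + w_body * dist_body

# Hypothetical congruent vs. incongruent trials (metres):
print(travelled_distance_estimate(10.0, 10.0, w_visual=0.7))  # cues agree
print(travelled_distance_estimate(12.0, 10.0, w_visual=0.7))  # visual distance increased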

Does jerk have to be considered in linear motion simulation?

Perceptual thresholds for the detection of the direction of linear motion are important for motion simulation. There are situations in which a subject should not perceive the direction of motion, for example during repositioning of a simulator, but also …
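
To make the role of jerk explicit: for a linear motion profile x(t), jerk is the rate of change of acceleration (the third time derivative of position). The sketch below numerically differentiates a hypothetical raised-cosine acceleration profile to obtain its peak jerk; the duration and amplitude are illustrative, not the profiles tested.

import numpy as np

# Hypothetical raised-cosine acceleration profile for a simulator displacement.
duration = 2.0          # s (placeholder)
peak_accel = 0.5        # m/s^2 (placeholder)
t = np.linspace(0.0, duration, 2001)
accel = peak_accel * 0.5 * (1.0 - np.cos(2.0 * np.pi * t / duration))

# Jerk is the time derivative of acceleration.
jerk = np.gradient(accel, t)
print("peak acceleration: %.3f m/s^2" % accel.max())
print("peak jerk:         %.3f m/s^3" % np.abs(jerk).max())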

Acquisition of human EEG data during linear self-motion on a Stewart platform

The present study investigated the feasibility of acquiring electroencephalography (EEG) data during self-motion in human subjects. Subjects performed a visual oddball task - designed to evoke a P3 event-related potential - while being passively …
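
As an illustration of the analysis such a task implies (a minimal sketch, not the authors' pipeline), the code below cuts epochs around standard and deviant ("oddball") event markers in a single EEG channel, baseline-corrects them, and averages; the sampling rate, epoch window and event times are hypothetical.

import numpy as np

fs = 500                           # Hz, placeholder sampling rate
eeg = np.random.randn(60 * fs)     # stand-in for one recorded channel
standard_onsets = np.arange(1 * fs, 55 * fs, fs)   # hypothetical event markers
deviant_onsets = standard_onsets[::5] + fs // 2

def erp(signal, onsets, fs, tmin=-0.2, tmax=0.8):
    """Cut fixed-length epochs around each onset, baseline-correct on the
    pre-stimulus interval, and average across epochs."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for onset in onsets:
        if onset - pre < 0 or onset + post > len(signal):
            continue
        epoch = signal[onset - pre:onset + post].copy()
        epoch -= epoch[:pre].mean()            # baseline correction
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# In real data a P3 would show up as a positive deflection roughly
# 300-500 ms post-stimulus in this deviant-minus-standard difference wave.
difference_wave = erp(eeg, deviant_onsets, fs) - erp(eeg, standard_onsets, fs)
print(difference_wave.shape)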

Visual-vestibular interactions for self-motion estimation

Accurate perception of self-motion through cluttered environments involves a coordinated set of sensorimotor processes that encode and compare information from visual, vestibular, proprioceptive, motor-corollary, and cognitive inputs. Our goal was to …