Visual tracking of ambiguous moving objects: A recursive Bayesian model

Abstract

Perceptual and oculomotor data demonstrate that, when the visual information about an object’s motion differs at the local (edge-related) and global levels, local 1D motion cues dominate initially, whereas 2D information progressively takes over and leads to the final, correct representation of global motion. Previous models have explained the initial errors (deviations from the global motion) as the best perceptual guess in the Bayesian sense. These models accounted for the intrinsic sensory noise of the image and for general expectancies about object velocities. Here we propose a recursive extension of the Bayesian model, with the purpose of encompassing the whole dynamical evolution of motion processing, from the 1D cues to the correct global motion. Our model is motivated and constrained by smooth-pursuit oculomotor data. Eye movements were recorded in three participants using the scleral search coil technique. Participants were asked to track either a single line (vertical or oblique) or a Gaussian blob moving horizontally. In our model, oculomotor data obtained with non-ambiguous stimuli (i.e., with coherent local and global information, such as a Gaussian blob or a vertical line moving horizontally) are combined to constrain the initial likelihood and prior functions for the general, ambiguous case (e.g., a tilted line moving horizontally). The prior knowledge is then recursively updated by using the previous posterior probability as the current prior. The idea is that the recursive injection of the posterior distribution boosts the spread of information about the object’s shape, favoring the integration of 1D and 2D cues. In addition, a simple model of the sensory-oculomotor loop is taken into account, including transmission delays and the evolution of retinal motion during pursuit. Preliminary results show substantial agreement between the model predictions and the oculomotor data.
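The core of the recursive scheme, in which the previous posterior is reinjected as the current prior, can be illustrated with a minimal one-dimensional Gaussian sketch. All parameters here are hypothetical placeholders, not the paper's actual functions (which are constrained by the oculomotor data); the sketch only shows how repeated posterior-to-prior feedback pulls an estimate from a low-speed prior toward the true global velocity:

```python
# Minimal sketch of recursive Bayesian updating in the 1D Gaussian case.
# All numbers are illustrative: the slow-speed prior and the noisy
# velocity likelihood stand in for the functions that, in the model,
# are constrained by smooth-pursuit data from non-ambiguous stimuli.

def gaussian_posterior(prior_mean, prior_var, like_mean, like_var):
    """Normalized product of two Gaussians: the Bayesian posterior."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
    post_mean = post_var * (prior_mean / prior_var + like_mean / like_var)
    return post_mean, post_var

mean, var = 0.0, 4.0       # prior favoring slow speeds, broad uncertainty
true_velocity = 10.0       # hypothetical global (2D) motion, deg/s
like_var = 2.0             # sensory noise of each velocity measurement

for step in range(20):
    # Recursive step: the previous posterior becomes the current prior.
    mean, var = gaussian_posterior(mean, var, true_velocity, like_var)

print(round(mean, 2), round(var, 3))  # estimate converges toward true_velocity
```

With each iteration the posterior sharpens and its mean moves toward the veridical velocity, mimicking, at a cartoon level, the progressive shift from the initial 1D-biased estimate to the correct global motion.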

Publication
Journal of Vision