Should I stay or should I go? Humans adapt to the volatility of visual motion properties, and know about it


Animal behavior must constantly adapt to change, for instance when the state of an environmental context switches unexpectedly. For an agent interacting with such a volatile setting, it is important to react both accurately and as quickly as possible. For example, it has been shown that when a random sequence of rightward or leftward motions of a visual target is suddenly biased toward one direction, human observers adapt their anticipatory eye movements to accurately track this bias. Here, we show that this ability extends to a volatile environment in which the probability bias can change at random switching times. In addition, we recorded the level of confidence reported by human observers. We compared these results to those of a probabilistic agent that is optimal with respect to the generative model of switches. We found a better match between the observed behavioral responses and those of this agent than with other models such as the leaky integrator. Furthermore, we could fit the experimental data by varying the assumed level of switching volatility in the model, and thus derive a common marker of inter-individual variability by titrating each participant's trade-off between exploration and exploitation. These results demonstrate that, even in such an unstable environment, human observers can effectively maintain an internal belief about the bias and use this representation both in their sensorimotor control and for explicit judgments. This work offers an innovative approach to testing human cognitive abilities in uncertain and dynamic environments more generically.
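As a rough illustration of the leaky-integrator baseline mentioned above, the sketch below tracks the estimated probability of a rightward motion in a binary sequence whose bias switches abruptly. This is a minimal sketch, not the study's implementation: the leak rate, switch time, and bias values are arbitrary assumptions chosen for illustration.

```python
import random

def leaky_integrator(outcomes, leak=0.05, p0=0.5):
    """Track the probability of a rightward outcome with a leaky integrator.

    Each binary outcome (1 = right, 0 = left) nudges the running estimate,
    while older evidence decays at rate `leak` (a hypothetical value standing
    in for the model's single free time constant).
    """
    p_hat, trace = p0, []
    for x in outcomes:
        p_hat = (1 - leak) * p_hat + leak * x
        trace.append(p_hat)
    return trace

# A volatile environment: the bias switches abruptly at some trial.
random.seed(0)
switch = 100
bias = [0.2] * switch + [0.8] * switch   # p(right) before / after the switch
outcomes = [1 if random.random() < p else 0 for p in bias]
trace = leaky_integrator(outcomes)
```

After the switch, the estimate relaxes toward the new bias with a time constant of roughly `1 / leak` trials; the optimal agent, in contrast, can revise its belief abruptly once a switch becomes likely, which is one intuition for why it matches behavior better.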

May 23, 2019 1:30 AM
Colloque international de la Société Française des Neurosciences 2019
Marseille (France)
Laurent U Perrinet
Researcher in Computational Neuroscience

My research interests include Machine Learning and computational neuroscience applied to Vision.