Bayesian Modeling of Motion Perception using Dynamical Stochastic Textures
Parameterization of the class of Motion Clouds (MC) stimuli. The illustration relates parametric changes in MCs to real-world (top row) and observer (second row) movements. (A) Orientation changes resulting from scene rotation are parameterized through $\theta$, as shown in the bottom row where a horizontally (a) and an obliquely (b) oriented MC are compared. (B) Zoom movements, arising either from scene looming or from observer movements in depth, are characterised by scale changes reflected in a scale (frequency) term $z$, shown for a larger or closer object (b) compared to a more distant one (a). (C) Translational movements in the scene are characterised by $V$, using the same formulation for static (a), slow (b) and fast-moving MCs, with the variability in these speeds quantified by $\sigma_V$. $\xi$ and $\tau$ in the third row are the spatial and temporal frequency scale parameters. The development of this formulation is detailed in the text.
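To make this parameterization concrete, here is a minimal, self-contained sketch of the Motion Cloud idea (not the authors' reference implementation; the MotionClouds package provides that): white noise is shaped in the Fourier domain by a Gaussian envelope centred on an orientation $\theta$, a spatial-frequency scale $z$, and a speed plane set by $V$ whose width $\sigma_V$ controls the speed variability. The helper name `motion_cloud`, the bandwidth parameters and all numeric values are illustrative assumptions.

```python
# A minimal sketch (illustrative, not the authors' code) of a Motion Cloud:
# a stationary dynamic texture whose spectrum concentrates around an
# orientation theta, a spatial-frequency scale z, and the speed plane
# ft = -V * fx, with speed variability sigma_V (all names from the caption).
import numpy as np

def motion_cloud(N=128, T=64, theta=0.0, z=0.125, V=1.0,
                 B_theta=np.pi/16, B_z=0.05, sigma_V=0.5, seed=42):
    # Frequency grids (cycles per pixel / per frame)
    fx = np.fft.fftfreq(N)[:, None, None]
    fy = np.fft.fftfreq(N)[None, :, None]
    ft = np.fft.fftfreq(T)[None, None, :]
    f_r = np.sqrt(fx**2 + fy**2)            # radial spatial frequency
    f_r[0, 0, 0] = np.inf                   # suppress the DC component

    # Orientation envelope: Gaussian around theta (pi-periodic)
    angle = np.arctan2(fy, fx)
    env_theta = np.exp(-np.sin(angle - theta)**2 / (2 * B_theta**2))

    # Scale envelope: Gaussian around the central spatial frequency z
    env_z = np.exp(-(f_r - z)**2 / (2 * B_z**2))

    # Speed envelope: mass near the plane ft = -V * fx; its width scales
    # with the radial frequency, so sigma_V acts as a speed bandwidth
    env_v = np.exp(-(ft + V * fx)**2 / (2 * (sigma_V * f_r)**2))

    # Filter complex white noise by the envelope, return to space-time
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((N, N, T)) + 1j * rng.standard_normal((N, N, T))
    movie = np.fft.ifftn(noise * env_theta * env_z * env_v).real
    return movie / np.abs(movie).max()      # normalise contrast

frames = motion_cloud(theta=np.pi/4, z=0.1, V=1.0, sigma_V=0.3)
print(frames.shape)  # (128, 128, 64): a band-pass texture drifting at speed V
```

In this sketch, narrowing $\sigma_V$ concentrates energy on the speed plane and yields a coherently drifting texture, while broadening it spreads energy across temporal frequencies, producing more jittered, less coherent motion.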
reference
- Jonathan Vacher, Andrew Isaac Meso, Laurent U. Perrinet, Gabriel Peyré. Bayesian Modeling of Motion Perception using Dynamical Stochastic Textures. Neural Computation, 2018.
All material (c) L. Perrinet. Please check the copyright notice.
This work was supported by ANR project "ANR Speed" ANR-13-BSHS2-0006.
TagYear18 TagPublications TagMotionClouds TagAnrSpeed