Hooking together the pieces to build a (simplistic) neuroscience demo

  • During this talk, I wanted to make two points:
    • Science is fun: we need to expose interesting scientific results to students, kids, and more generally to a wide audience. Here, I show how to build a simplistic demo: a few lines of code and relatively little effort are enough to create a cross-platform, interactive demo.
    • Designing real-time simulations is still not widespread in the computational neuroscience community. Is there some insight to be gained from such an example?

http://www.yorku.ca/eye/

  • In this demo project I hook together a neuron, a webcam and a loudspeaker so that they interact in (approximately) real time. The goal is to conduct in computo what was done in vivo by Hubel & Wiesel. This experiment was chosen because it is a well-known scientific result that is fundamental in the sense that it links the response of a neuron to a stimulus in visual space (flashing a bar on a screen); a toy sketch of this receptive-field mapping is given below. It raises the question of what is represented by the neural activity: in the original experiment, the ON and OFF subfields of the receptive field are directly marked on the screen by symbols. But how can we be sure that the spiking that we hear physically really corresponds to a neural representation, especially when this response is just one rumor within a whole, intricately connected recurrent network?

  • The principle is that when you launch the script, the video stream coming from your iSight is converted by a dummy retina. The membrane of the neuron is directly excited by the instantaneous correlation coefficient between the RF and the current image, modulated by a response curve "à la" Laughlin (1981); a sketch of this conversion loop closes the section.
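
As a toy illustration of the receptive-field mapping, here is a minimal sketch in Python (NumPy assumed). The oriented RF is modelled as a Gabor patch with ON and OFF subfields, which is my own assumption rather than the demo's actual filter; a light bar is then flashed at varying orientations and the correlation coefficient with the RF traces out an orientation-tuning curve, in the spirit of Hubel & Wiesel:

```python
# Toy "in computo" Hubel & Wiesel probe: an oriented receptive field (RF)
# is compared to a flashed bar at several orientations.
import numpy as np

def gabor_rf(size=64, theta=0.0, freq=0.05, sigma=12.0):
    """Oriented RF with ON (positive) and OFF (negative) subfields."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    xr = x * np.cos(theta) + y * np.sin(theta)           # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))   # Gaussian envelope
    return envelope * np.cos(2 * np.pi * freq * xr)      # ON/OFF stripes

def bar_stimulus(size=64, theta=0.0, half_width=4):
    """A light bar on a dark background, flashed at orientation theta."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.abs(xr) < half_width).astype(float)

def correlation(rf, img):
    """Pearson correlation coefficient between the RF and an image."""
    return np.corrcoef(rf.ravel(), img.ravel())[0, 1]

rf = gabor_rf(theta=0.0)   # this model neuron "prefers" vertical bars
for theta in np.linspace(0, np.pi, 9):
    bar = bar_stimulus(theta=theta)
    print(f"bar at {np.degrees(theta):5.1f} deg -> correlation {correlation(rf, bar):+.3f}")
```

The printed correlation peaks when the bar is aligned with the RF's preferred orientation and falls toward zero for the orthogonal one, which is essentially the tuning curve mapped out in the original experiment.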


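And here is a minimal sketch of the conversion loop itself, assuming Python with OpenCV for webcam capture. The logistic stand-in for Laughlin's (1981) response curve, the leaky integrate-and-fire membrane, the Gabor RF and all parameter values are illustrative assumptions rather than the demo's actual code, and a terminal bell stands in for the loudspeaker click:

```python
# Real-time loop: webcam frame -> dummy retina (correlation with the RF,
# squashed by a Laughlin-style curve) -> membrane potential -> audible spike.
import numpy as np
import cv2  # OpenCV, assumed here for cross-platform webcam capture

# Oriented RF with ON/OFF subfields (same Gabor patch as in the sketch above).
y, x = np.mgrid[-32:32, -32:32]
rf = np.exp(-(x**2 + y**2) / (2 * 12.0**2)) * np.cos(2 * np.pi * 0.05 * x)

def laughlin(c, gain=10.0):
    """Sigmoidal response curve squashing the correlation into [0, 1]
    (a logistic stand-in for Laughlin's cumulative-contrast curve)."""
    return 1.0 / (1.0 + np.exp(-gain * c))

class LIFNeuron:
    """Leaky integrate-and-fire membrane driven by the dummy retina."""
    def __init__(self, tau=0.05, threshold=0.8, dt=0.02):
        self.v, self.tau, self.threshold, self.dt = 0.0, tau, threshold, dt

    def step(self, drive):
        self.v += self.dt * (drive - self.v) / self.tau  # leak toward the drive
        if self.v >= self.threshold:                     # threshold crossing: spike and reset
            self.v = 0.0
            return True
        return False

neuron = LIFNeuron()
cap = cv2.VideoCapture(0)  # 0 = default camera (the built-in iSight on a Mac)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
        patch = cv2.resize(gray, rf.shape[::-1])          # match the RF resolution
        c = np.corrcoef(rf.ravel(), patch.ravel())[0, 1]  # instantaneous correlation
        if not np.isfinite(c):                            # constant frame: correlation undefined
            c = 0.0
        if neuron.step(laughlin(c)):
            print('\a', end='', flush=True)  # terminal bell as a stand-in for the loudspeaker click
finally:
    cap.release()
```

Pointing the camera at a drawn bar and rotating it should, in principle, modulate the click rate, which is roughly the experience the demo aims to convey.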
