Training the Untrainable: Learning to backpropagate through spikes and time

Javier Galván

Javier Galván is currently employed by CSIC, working at IFISC


Recurrent Spiking Neural Networks (RSNNs) offer a biologically plausible framework for modeling the dynamic computations of cortical circuits. However, training them remains a formidable challenge due to the intrinsic discontinuity of spike events and the temporal complexity of recurrent architectures. This talk surveys the state of the art in gradient-based learning for RSNNs, focusing on surrogate gradient methods (such as gradient flossing and adaptive gradient modulation), the emergence of vanishing and exploding gradients in temporal credit assignment, and more biorealistic gradient-based training strategies. Emphasis will also be placed on the mathematical underpinnings of loss functions tailored to spiking activity and their integration into scalable training schemes that respect biological constraints.
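To illustrate the core idea behind surrogate gradient methods mentioned in the abstract, here is a minimal NumPy sketch: the forward pass uses the non-differentiable Heaviside spike function, while the backward pass substitutes a smooth pseudo-derivative (a fast-sigmoid shape is assumed here; the threshold `THETA` and sharpness `BETA` are illustrative values, not parameters from the talk).

```python
import numpy as np

THETA = 1.0   # firing threshold (assumed for illustration)
BETA = 10.0   # surrogate sharpness (assumed for illustration)

def spike_forward(v):
    """Forward pass: Heaviside step at threshold (non-differentiable)."""
    return (v >= THETA).astype(float)

def spike_surrogate_grad(v):
    """Backward pass: smooth pseudo-derivative used in place of the
    Dirac delta, here a fast-sigmoid derivative 1/(1 + beta*|v - theta|)^2."""
    return 1.0 / (1.0 + BETA * np.abs(v - THETA)) ** 2

v = np.array([0.5, 1.0, 1.5])
print(spike_forward(v))         # spikes only where v >= threshold
print(spike_surrogate_grad(v))  # gradient peaks at the threshold
```

In a full RSNN trained with backpropagation through time, this pseudo-derivative replaces the spike nonlinearity's true (zero-almost-everywhere) derivative at every time step, letting gradients flow through spikes while the forward dynamics remain binary.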

In person in the seminar room



Zoom stream:



https://us06web.zoom.us/j/98286706234?pwd=bm1JUFVYcTJkaVl1VU55L0FiWDRIUT09

Contact details:

Claudio Mirasso

Contact form

