Time-elapsed neural assemblies and learning processes for weak interconnections.

Abstract: Modeling neural networks is an interesting problem from both the mathematical and the neuroscience points of view. In particular, evolution equations describing neural assemblies, derived from stochastic processes and microscopic models, have become a very active area in recent years. In this context we study a neural network via time-elapsed dynamics with learning, where neurons are described by their position in the brain cortex and the time elapsed since their last discharge. The learning process is modeled via the change in the interconnections among neurons, which defines a learning rule. A typical learning rule is, for example, the one inspired by Hebbian learning, which states that if two neurons fire together then their connection becomes stronger. We model these dynamics as a system of integro-differential equations, for which we study the convergence to stationary states by means of the entropy method and Doeblin's theory. Moreover, we present some numerical simulations showing how the parameters of the system can give rise to different pattern formations.
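For readers unfamiliar with this class of models, the system described in the abstract has, in the time-elapsed models common in this literature, roughly the following structure (a hedged sketch only; the exact equations of the talk may differ): n(t, x, s) is the density of neurons at cortical position x whose elapsed time since last discharge is s, and w is the interconnection kernel.

```latex
% Sketch of a time-elapsed model with spatial structure (assumption:
% standard form from the literature, not necessarily the talk's system).
\begin{align*}
  &\partial_t n(t,x,s) + \partial_s n(t,x,s) + p\big(s, X(t,x)\big)\, n(t,x,s) = 0, \\
  &n(t,x,0) = N(t,x) = \int_0^\infty p\big(s, X(t,x)\big)\, n(t,x,s)\, ds, \\
  &X(t,x) = \int w(t,x,y)\, N(t,y)\, dy,
\end{align*}
```

Here p is the firing rate, N(t, x) is the activity at position x, X(t, x) is the input received through the interconnections, and a slow evolution equation for w(t, x, y), driven by the joint activity at x and y, encodes the (e.g. Hebbian-type) learning rule.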
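To give a concrete feel for the kind of numerical simulation mentioned above, the following is a minimal sketch (not the speaker's code; the firing rate, coupling strength J and threshold sigma0 are illustrative choices) of the time-elapsed model without space or learning, discretized with a conservative upwind scheme in the elapsed-time variable s and explicit Euler in t:

```python
import numpy as np

# Sketch of the basic time-elapsed neuron model (no space, no learning):
#   dn/dt + dn/ds + p(s, N(t)) n = 0,   n(t, 0) = N(t) = integral of p * n ds,
# with a conservative upwind scheme in s and explicit Euler in t.
# All parameter values below are illustrative assumptions.

S, ds = 8.0, 0.02            # truncated elapsed-time domain [0, S)
dt, T = 0.01, 10.0           # time step (CFL number dt/ds = 0.5) and horizon
s = np.arange(0.0, S, ds)
J, sigma0 = 1.0, 1.0         # hypothetical coupling strength and threshold

def p(s, N):
    # smoothed firing rate: neurons fire once s exceeds sigma0 - J * N
    return 1.0 / (1.0 + np.exp(-10.0 * (s - sigma0 + J * N)))

n = np.where(s < 2.0, 0.5, 0.0)   # initial density with total mass 1
N = 0.0
for _ in range(int(T / dt)):
    N = float((p(s, N) * n).sum() * ds)     # total firing rate
    new = n - dt * p(s, N) * n              # loss by discharge
    new[1:] -= dt / ds * (n[1:] - n[:-1])   # aging: transport in s
    new[0] -= dt / ds * (n[0] - N)          # inflow: reset at s = 0
    n = new

mass = n.sum() * ds
print(f"mass = {mass:.3f}, firing rate N = {N:.3f}")
```

With the inflow written as a boundary flux, the scheme conserves mass up to the small outflow at s = S, so the total density stays near 1 while the firing rate N(t) relaxes toward a stationary value; varying J (and, in the full model, the kernel w) is what produces the different pattern formations discussed in the talk.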

Date: Dec 01, 2020 at 16:00 h
Venue: Online.
Speaker: Nicolás Torres Escorza
Affiliation: Lab. Jacques-Louis Lions.
Coordinator: Claudio Muñoz

Posted on Nov 18, 2020 in Differential Equations, Seminars