Digital Theses Archive


Thesis etd-05182022-122707

Type of thesis
Corso Ordinario Secondo Livello (Second-Level Ordinary Course)
Models of the sleep/wake alternation in artificial and neural networks
Cl. Sc. Sperimentali - Ingegneria (Class of Experimental Sciences - Engineering)
Supervisor: Prof. Alberto Mazzoni
Keywords
  • neuromorphic engineering
  • sleep
Sleep remains a ubiquitous yet poorly understood function in all animal species observed to date. Shedding light on this function would not only answer a fundamental neurobiological question but could also help replicate the neurocomputational benefits of sleep in artificial systems such as neural networks. This work therefore investigates the function of sleep from a neuromorphic engineering perspective, with the aim of engineering computational models of this phenomenon.

The major challenge is merging two scales: the micro-scale (the spatio-temporal scale of single neural cells and their spiking behavior) and the macro-scale (the scale of the whole brain and the alternation between sleep stages). This work explores both. The long-term objective is a comprehensive thalamocortical model of the sleep-wake alternation and its neurocomputational features.

The starting point is the model of a single neuron, as the building block of large-scale networks. The first part of the thesis addresses a neuromorphic engineering problem that is essential for building sleep networks: optimizing the solver of the neural differential equations in terms of accuracy and computational effort. I adopted the Izhikevich artificial spiking neuron model, one of the most widely employed models in neuromorphic engineering and computational neuroscience. To find a compromise between numerical error and computational expense in solving the model's equations, I investigated the effects of discretization and identified the solver that realizes the best trade-off between accuracy and computational cost for a given budget of Floating-Point Operations per Second (FLOPS). I considered three fixed-step solvers for Ordinary Differential Equations (ODEs) commonly used in computational neuroscience: the Euler method, the Runge-Kutta 2 (RK2) method, and the Runge-Kutta 4 (RK4) method.
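To make the comparison concrete, the sketch below integrates the standard Izhikevich equations (dv/dt = 0.04v² + 5v + 140 − u + I, du/dt = a(bv − u), with the usual reset at the 30 mV spike cutoff) using either Euler (one right-hand-side evaluation per step) or midpoint RK2 (two evaluations per step). Parameter values, input current, and step size are illustrative defaults, not the settings used in the thesis.

```python
def izhikevich_rhs(v, u, I, a=0.02, b=0.2):
    """Right-hand side of the Izhikevich model (regular-spiking defaults)."""
    dv = 0.04 * v**2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    return dv, du

def simulate(I=10.0, dt=0.1, t_end=200.0, method="euler",
             a=0.02, b=0.2, c=-65.0, d=8.0):
    """Fixed-step integration of one Izhikevich neuron.

    method: 'euler' (1 RHS eval/step) or 'rk2' (midpoint, 2 RHS evals/step),
    so for equal FLOPS Euler can take steps half as large as RK2.
    Returns the list of spike times (ms).
    """
    v, u = c, b * c
    spikes = []
    for k in range(int(t_end / dt)):
        if method == "euler":
            dv, du = izhikevich_rhs(v, u, I, a, b)
            v_new = v + dt * dv
            u_new = u + dt * du
        else:  # midpoint RK2
            dv1, du1 = izhikevich_rhs(v, u, I, a, b)
            dv2, du2 = izhikevich_rhs(v + 0.5 * dt * dv1,
                                      u + 0.5 * dt * du1, I, a, b)
            v_new = v + dt * dv2
            u_new = u + dt * du2
        if v_new >= 30.0:              # spike detected: record and reset
            spikes.append((k + 1) * dt)
            v_new, u_new = c, u_new + d
        v, u = v_new, u_new
    return spikes
```

Because the reset is applied after each discrete step, the recorded spike times themselves depend on dt and on the method, which is exactly why a principled error metric on the spike trains is needed.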
To quantify the error produced by the solvers, I used the Victor-Purpura spike-train distance from an ideal solution of the ODE. Counterintuitively, I found that simple methods such as Euler and RK2 can outperform more complex ones (i.e., RK4) in the numerical solution of the Izhikevich model when the same FLOPS budget is allocated to each method. Moreover, I quantified the neuron rest time (OFF period) necessary for the numerical solution to converge to the ideal solution and thus cancel the error accumulated during the spike train; this analysis showed that the required rest time is independent of the firing rate and of the spike-train duration. These results generalize straightforwardly to other spiking neuron models and provide a systematic analysis of fixed-step neural ODE solvers in terms of the accuracy-computational cost trade-off. The OFF-period analysis is particularly relevant to sleep models, in which neurons go OFF for long periods.

These first results suggest novel ways to implement spiking neural networks. Parts of this work have been published at the 10th International EMBS Neural Engineering Conference, and other parts are in preparation for submission as a full journal paper (a preprint is available on bioRxiv at https://www.biorxiv.org/content/10.1101/2021.11.30.470474v1.full).

The second step is to merge the micro and macro scales, building a comprehensive sleep model based on thousands of Izhikevich spiking neurons as primitive blocks. The drive for sleep must be identified so that this feature can be inserted into each Izhikevich neuron, letting sleep emerge as a property of the whole system.
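The Victor-Purpura distance is an edit distance between spike trains: shifting a spike by an interval Δt costs q·Δt, while inserting or deleting a spike costs 1, and the distance is the minimum total cost of transforming one train into the other. It can be computed with a small dynamic program; the cost parameter q below is a free choice for illustration, not the value used in the thesis.

```python
def victor_purpura(train_a, train_b, q=1.0):
    """Victor-Purpura spike-train distance via dynamic programming.

    train_a, train_b: sorted spike times (ms).
    q: cost per unit time of shifting a spike (1/ms); insert/delete cost 1.
    """
    n, m = len(train_a), len(train_b)
    # D[i][j] = distance between the first i spikes of a and first j of b
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = float(i)          # delete all i spikes of a
    for j in range(1, m + 1):
        D[0][j] = float(j)          # insert all j spikes of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            shift = D[i - 1][j - 1] + q * abs(train_a[i - 1] - train_b[j - 1])
            D[i][j] = min(D[i - 1][j] + 1.0,   # delete spike i of a
                          D[i][j - 1] + 1.0,   # insert spike j of b
                          shift)               # shift spike i onto spike j
    return D[n][m]
```

With q → 0 the distance reduces to the difference in spike counts; with large q it approaches a pure coincidence count, so q sets the temporal precision at which the solver's spike-timing error is penalized.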
One of the main neurocomputational functions of sleep could be to renormalize synapses, maintaining the same overall level of excitation-inhibition and therefore of neural activity in the brain (synaptic homeostasis), as the Synaptic Homeostasis Hypothesis of sleep proposes. However, the mechanism by which this happens is still unknown. Moreover, it is unclear how this hypothesis fits with the Achermann and Borbély sleep model, which describes sleep need as the combination of a circadian component and a homeostatic component (the latter increasing with sleep deprivation). My model proposes that synaptic growth is inhibited by the limited availability of resources and by a self-inhibitory feedback signal, a mechanism first proposed by Alan Turing in a seminal paper on other biological settings. The results, presented together with analytical considerations, show that the model can explain at least three sleep phenomena that remain unexplained: 1) the NREM-REM alternation; 2) the effects of sleep deprivation on NREM sleep duration; and 3) the fact that during late sleep the REM phases are more frequent and last longer. All these phenomena arise from the interplay of a drive for synaptic growth with an inhibitory feedback signal, which generates: 1) the alternation of phases of synaptic decrease (NREM) and phases of synaptic buildup (REM); 2) an increase in NREM sleep duration when the amount of inhibitor is increased, as after sleep deprivation; and 3) inhibitor depletion during late sleep, and therefore an increased drive for synaptic buildup (REM sleep).

A future direction of this work is to unify the two parts (the higher-level NREM-REM bistability model and the simulation of large-scale networks of Izhikevich neurons) into an account of sleep that is detailed at both the synaptic and the electrophysiological level.
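To illustrate the general mechanism (a growth drive coupled to a slow self-inhibitory feedback producing alternating phases), the sketch below uses a generic FitzHugh-Nagumo-style relaxation oscillator. It is not the thesis model: the variable names ("synaptic drive" and "inhibitor") and all parameter values are hypothetical labels chosen only to show how such an activator-inhibitor pair alternates between buildup and decrease phases.

```python
def simulate_cycle(t_end=500.0, dt=0.01, eps=0.08, a=0.7, b=0.8, drive=0.5):
    """Minimal activator-inhibitor sketch of alternating phases.

    s : fast 'synaptic drive' variable (high = buildup, low = decrease)
    h : slow self-inhibitory feedback that tracks s with time scale 1/eps
    FitzHugh-Nagumo-style equations in the oscillatory regime; all
    parameters are illustrative, not taken from the thesis model.
    """
    s, h = -1.0, 1.0
    trace = []
    for _ in range(int(t_end / dt)):
        ds = s - s**3 / 3.0 - h + drive   # growth drive, opposed by inhibitor h
        dh = eps * (s + a - b * h)        # inhibitor slowly builds while s is high
        s += dt * ds
        h += dt * dh
        trace.append(s)
    return trace

def count_phase_switches(trace, threshold=0.0):
    """Count transitions between high-s ('buildup') and low-s ('decrease') phases."""
    switches = 0
    above = trace[0] > threshold
    for s in trace:
        if (s > threshold) != above:
            above = s > threshold
            switches += 1
    return switches
```

The qualitative correspondence is the point: while the inhibitor is high the fast variable sits in its low phase (synaptic decrease, NREM-like); once the inhibitor is depleted the drive pushes the system back into the high phase (synaptic buildup, REM-like), and raising the initial inhibitor level lengthens the low phase, mirroring the sleep-deprivation effect described above.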