Spiking Neural Networks: From LIF to Multisynaptic Models — A Comprehensive Review

Abstract

Spiking Neural Networks (SNNs) have emerged as a promising paradigm for energy-efficient neuromorphic computing, leveraging event-driven computation to mimic the information processing mechanisms of biological brains. This review provides a comprehensive overview of recent advances in SNN research, with a focus on two fundamental neuron models: the classical Leaky Integrate-and-Fire (LIF) model and the recently proposed Multisynaptic Firing (MSF) model. We examine key research achievements from both domestic and international institutions, spanning theoretical foundations, hardware implementations, and practical applications. The mathematical formulations of both models are presented in detail, highlighting their computational principles and biological plausibility. Finally, we discuss future directions and open challenges in the field.


1. Introduction

Spiking Neural Networks represent a paradigm shift from traditional artificial neural networks (ANNs) by incorporating temporal dynamics and event-driven computation. Unlike conventional neurons that produce continuous activation values, SNN neurons communicate through discrete spikes—electrical impulses that occur only when the membrane potential exceeds a threshold. This event-driven nature enables SNNs to achieve remarkable energy efficiency, making them particularly attractive for edge computing, neuromorphic hardware, and real-time processing applications.

The evolution from ANNs to SNNs can be conceptualized as a four-stage structural transformation: (1) binary activation functions, (2) introduction of temporal dimension, (3) temporal accumulation mechanisms, and (4) reset and sparsity control. At the heart of this evolution lies the neuron model, which determines how information is integrated, stored, and transmitted. Among the various neuron models proposed in the literature, the Leaky Integrate-and-Fire (LIF) model remains the most widely adopted due to its mathematical tractability and computational efficiency. However, recent advances have led to more sophisticated models, such as the Multisynaptic Firing (MSF) neuron, which addresses fundamental limitations of single-threshold models by enabling simultaneous encoding of spatial and temporal information [Fan et al., 2025].

This review aims to provide a systematic overview of SNN research, with particular emphasis on the mathematical foundations of LIF and MSF models, recent breakthroughs from leading research groups worldwide, and future directions for the field.


2. Global Research Landscape

2.1 International Research Achievements

Computational Neuroscience Foundations. International research institutions have made significant contributions to the theoretical understanding of SNNs. The generalized integrate-and-fire (GIF) framework, developed by researchers including Mensi et al. (2012) and Pozzorini et al. (2015), provides a unified mathematical description of neuronal dynamics with multiple adaptation mechanisms. This framework, implemented in the NEST simulator, supports multiple receptor ports with independent exponential synaptic time constants, spike-triggered currents (STC), and spike-frequency adaptation (SFA).

Hardware Innovations. The Fraunhofer Institute for Integrated Circuits (IIS) has developed SENNA, a programmable neuromorphic chip designed specifically for SNN acceleration. With 1,024 artificial neurons, SENNA operates directly with spike-based input and output signals, achieving nanosecond-level processing of data streams. The chip is complemented by a software development kit that supports hardware-aware DNN-to-SNN knowledge distillation, enabling seamless deployment of SNN models.

Whole-Brain Simulation. A landmark achievement in 2026 came from Eon Systems, a Silicon Valley company that successfully uploaded the complete brain of an adult fruit fly into a digital replica. Using LIF-based neuron models and electron microscopy reconstruction of the connectome, the digital brain was integrated with EPFL’s NeuroMechFly v2 virtual body, demonstrating spontaneous behaviors such as walking, grooming, and foraging that emerged directly from circuit dynamics rather than explicit training.

Optical Computing Paradigms. Researchers at Koç University in Turkey have developed an optical spiking neural network based on phase modulation. By exploiting the statistical properties of optical rogue waves for spike generation and controlling light diffraction through spatial light modulators (SLMs), the system achieved 82.45% accuracy on BreastMNIST and 95.00% on Olivetti Faces datasets, demonstrating the potential of optical SNNs for low-power inference.

2.2 Domestic Research Achievements

Novel Neuron Models. Chinese researchers have made groundbreaking contributions to SNN theory. The Multisynaptic Firing (MSF) neuron model, proposed by Fan, Shen, Lian, Li, Yao, Li, and Hu from the National University of Defense Technology and the Institute of Automation, Chinese Academy of Sciences, represents a fundamental advance in spiking neuron design [Fan et al., 2025]. Published in Nature Communications, the MSF model introduces multiple thresholds that enable simultaneous encoding of stimulus intensity (through spike count) and temporal dynamics (through spike timing). Experimental results demonstrate a 32% improvement in texture detail preservation for image reconstruction and a 15% accuracy gain over traditional SNNs in event stream processing tasks.

Hardware Breakthroughs. Professor Yang Yuchao’s research group at Peking University’s School of Integrated Circuits has achieved significant advances in memristor-based SNN hardware. Published in Nature Electronics, their work introduces a “hybrid dynamics” synaptic unit combining interfacial volatile memristors (W/α-IGZO/MgO/W) with non-volatile 1T1R memristors. This design implements Fatigue STDP (spike-timing-dependent plasticity) learning rules in hardware, enabling robust operation across frequencies from 10 Hz to 500 kHz with an energy efficiency of 0.298 TOPS/W.

Application Innovations. The New Finance Comprehensive Laboratory at Southwestern University of Finance and Economics has developed SpikeAdapter, an SNN-based framework for remote sensing interpretation, accepted at CVPR 2026. The framework encodes radiation differences in bi-temporal remote sensing images as sparse spike sequences with Time-to-First-Spike (TTFS) characteristics, achieving biologically inspired modeling of change detection.


3. Neuron Models: Mathematical Foundations

3.1 The Leaky Integrate-and-Fire (LIF) Model

The LIF model describes the dynamics of a neuron’s membrane potential through a first-order differential equation:

$$\tau_m \frac{dV(t)}{dt} = -(V(t) - V_{\text{rest}}) + R_m I(t)$$

where:

  • $V(t)$ is the membrane potential at time $t$ (mV)
  • $V_{\text{rest}}$ is the resting potential (mV)
  • $\tau_m = R_m C_m$ is the membrane time constant (ms)
  • $R_m$ is the membrane resistance (MΩ)
  • $I(t)$ is the input current (nA)

When the membrane potential exceeds a threshold $V_{\text{th}}$, the neuron fires a spike and the potential is reset:

$$V(t^+) = V_{\text{reset}}$$

followed by an absolute refractory period $t_{\text{ref}}$ during which the neuron ignores inputs.

For numerical simulation, the forward Euler discretization (the deterministic special case of the Euler–Maruyama scheme used for noisy LIF networks) is commonly employed:

$$V_{t+\Delta t} = V_t + \frac{\Delta t}{\tau_m}\left[-(V_t - V_{\text{rest}}) + R_m I(t)\right]$$
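The discrete update above can be sketched in a few lines of Python. This is a minimal illustration, not code from any of the reviewed papers; the parameter values (10 ms time constant, −50 mV threshold, 2 ms refractory period) are common textbook choices, not values from a specific study:

```python
import numpy as np

def simulate_lif(I, dt=0.1, tau_m=10.0, R_m=10.0,
                 V_rest=-65.0, V_th=-50.0, V_reset=-65.0, t_ref=2.0):
    """Forward-Euler simulation of a single LIF neuron.

    I  : input current at each step (nA), shape (T,)
    dt : time step (ms); potentials in mV, resistance in MΩ.
    Returns the membrane trace and the 0/1 spike train per step.
    """
    I = np.asarray(I, dtype=float)
    V = np.empty(len(I))
    spikes = np.zeros(len(I), dtype=int)
    v = V_rest
    refrac_until = -np.inf            # end time of current refractory period
    for t in range(len(I)):
        now = t * dt
        if now < refrac_until:
            v = V_reset               # inputs are ignored while refractory
        else:
            # Euler step of tau_m dV/dt = -(V - V_rest) + R_m I
            v = v + (dt / tau_m) * (-(v - V_rest) + R_m * I[t])
            if v >= V_th:             # threshold crossing -> spike
                spikes[t] = 1
                v = V_reset           # hard reset
                refrac_until = now + t_ref
        V[t] = v
    return V, spikes
```

With a constant suprathreshold current the neuron fires periodically, and the interspike interval shrinks as the current grows; this is the rate coding discussed in Section 3.3.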

The LIF model can be viewed as a temporal extension of the binary threshold neuron, whose binary spiking formulation (Eq. (3) in Stage I of the four-stage framework) is:

$$s(x) = \begin{cases} 1, & x > 1 \\ 0, & \text{otherwise} \end{cases}$$

3.2 The Multisynaptic Firing (MSF) Model

The MSF model addresses a fundamental limitation of single-threshold neurons: the inability to simultaneously encode stimulus intensity (spatial information) and temporal dynamics [Fan et al., 2025]. Inspired by the biological phenomenon of multisynaptic connections—where a single axon forms multiple synapses with the same postsynaptic neuron—the MSF model introduces $D$ parallel firing thresholds [Fan et al., 2025]:

$$\Theta_d = V_{\text{thresh}} + (d-1) \cdot h, \quad d = 1, 2, \ldots, D$$

where:

  • $V_{\text{thresh}}$ is the base threshold (mV)
  • $h$ is the threshold interval (typically 1.0 mV)
  • $D$ is the maximum number of synapses (the multi-threshold count)

The membrane potential dynamics follow the same differential equation as LIF:

$$\tau_m \frac{dV(t)}{dt} = -(V(t) - V_{\text{rest}}) + R_m I_{\text{total}}(t)$$

However, at each time step, the neuron can emit multiple spikes depending on how many thresholds are exceeded:

$$S(t) = \sum_{d=1}^{D} \mathbb{1}[V(t) \geq \Theta_d]$$

where $\mathbb{1}[\cdot]$ is the indicator function. This yields:

  • $S(t) = 0$ if $V(t) < \Theta_1$
  • $S(t) = 1$ if $\Theta_1 \leq V(t) < \Theta_2$
  • $S(t) = 2$ if $\Theta_2 \leq V(t) < \Theta_3$
  • …
  • $S(t) = D$ if $V(t) \geq \Theta_D$

When spikes occur ($S(t) > 0$), the membrane potential is reset:

$$V(t^+) = V_{\text{reset}}$$

followed by a refractory period.

In multilayer networks, the input current to an MSF neuron is computed from the spike outputs of presynaptic neurons:

$$I_{\text{total}}(t) = \sum_{j \in \text{pre}} w_{ij} \cdot S_j(t)$$

where $w_{ij}$ is the synaptic weight (shared across all $D$ synapses) and $S_j(t)$ is the spike count from presynaptic neuron $j$ (ranging from 0 to $D$).
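The multi-threshold firing rule can be illustrated with a short Python sketch. This is one plausible reading of the equations above, not the authors' reference implementation; all parameter values, and the choice of a hard reset after any firing, are assumptions for illustration:

```python
import numpy as np

def msf_spike_count(V, V_thresh=-50.0, h=1.0, D=4):
    """Spike count S(t) of an MSF neuron at membrane potential V:
    the number of thresholds Theta_d = V_thresh + (d-1)*h that V exceeds."""
    thetas = V_thresh + np.arange(D) * h      # Theta_1 .. Theta_D
    return int(np.sum(V >= thetas))

def msf_step(v, I_total, dt=0.1, tau_m=10.0, R_m=10.0,
             V_rest=-65.0, V_reset=-65.0, V_thresh=-50.0, h=1.0, D=4):
    """One Euler step of MSF dynamics: integrate, count spikes, reset if S > 0."""
    v = v + (dt / tau_m) * (-(v - V_rest) + R_m * I_total)
    S = msf_spike_count(v, V_thresh, h, D)
    if S > 0:
        v = V_reset                           # reset after any firing
    return v, S
```

With $D = 1$, `msf_spike_count` degenerates to a plain threshold test and the update reduces to the LIF step, consistent with the reduction discussed below.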

The MSF model unifies two classical models as special cases [Fan et al., 2025]:

  • When $D = 1$, MSF reduces to the standard LIF neuron
  • When the simulation uses a single time step ($T = 1$) and $D \to \infty$, MSF mathematically approximates the ReLU neuron

3.3 Comparison of Encoding Capabilities

The fundamental difference between LIF and MSF lies in their information encoding strategies. LIF neurons employ rate coding, where information is encoded in the average firing rate over time—this trades temporal resolution for intensity information. MSF neurons, through their multi-threshold mechanism, achieve instantaneous rate coding: spike count within a single time step encodes intensity, while precise spike timing captures temporal dynamics [Fan et al., 2025].

This dual encoding capability enables MSF networks to:

  • Process information with significantly fewer time steps
  • Maintain both spatial and temporal feature representations
  • Achieve higher accuracy on tasks requiring simultaneous processing of intensity and timing [Fan et al., 2025]
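The intensity-in-one-step idea can be made concrete with a toy numerical comparison (threshold values are arbitrary illustrations): a single-threshold neuron emits at most one spike per step, so two different suprathreshold potentials are indistinguishable within that step, whereas a multi-threshold neuron separates them by spike count:

```python
import numpy as np

thetas = -50.0 + np.arange(4) * 1.0          # four thresholds Theta_1..Theta_4, h = 1 mV
counts = {}
for V in (-49.5, -46.5):                     # weak vs strong suprathreshold potential
    counts[V] = (int(V >= thetas[0]),        # single threshold: one spike either way
                 int(np.sum(V >= thetas)))   # multi-threshold: count scales with V
```

Here the single-threshold readout reports 1 spike for both potentials, while the four-threshold readout reports 1 versus 4, resolving the intensity difference within a single time step.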

4. Future Perspectives

4.1 Hardware-Software Co-Design

The full potential of SNNs can only be realized through tight integration of algorithms and hardware. The Fraunhofer approach of hardware-aware DNN-to-SNN knowledge distillation represents a promising direction, while the Peking University hybrid memristor array demonstrates how device-level innovations can implement complex learning rules directly in hardware. Future research should focus on developing standardized interfaces between software frameworks and neuromorphic chips, enabling seamless deployment across different hardware platforms.

4.2 Advanced Learning Algorithms

Training SNNs remains challenging due to the non-differentiability of spike generation. Current approaches include surrogate gradient methods, ANN-to-SNN conversion, and biologically inspired plasticity rules such as STDP. The Fatigue STDP mechanism demonstrated by Yang’s group offers a hardware-native learning rule with built-in frequency adaptation. Future work should explore hybrid approaches combining the strengths of multiple learning paradigms.

4.3 Scaling to Biological Realism

The Eon Systems fruit fly brain simulation demonstrates the feasibility of whole-brain emulation at the insect scale. Scaling to mammalian brains will require advances in multiple directions: more efficient neuron models (such as MSF), improved hardware capabilities, and better understanding of neural circuit organization. The MSF model’s parameter efficiency (no increase in parameters despite multiple synapses) makes it particularly attractive for large-scale simulations [Fan et al., 2025].

4.4 Applications and Commercialization

SNNs are poised to impact numerous application domains:

  • Edge AI: Low-power sensors and IoT devices
  • Autonomous systems: Real-time perception and decision-making
  • Biomedical signal processing: EEG/ECG analysis with temporal precision
  • Remote sensing: Change detection and environmental monitoring

The Fraunhofer knowledge distillation approach lowers barriers for companies with existing DNN investments, potentially accelerating SNN adoption in industrial applications.

4.5 Theoretical Foundations

Despite significant progress, many theoretical questions remain open. The four-stage framework proposed by recent work provides a useful lens for understanding SNN architecture evolution, but deeper mathematical understanding of spike-based computation, information theory, and learning dynamics is needed. Numerical analysis of LIF networks under Euler–Maruyama discretization offers insights into error bounds and convergence properties, but similar analyses for more complex models like MSF are still lacking.


5. Conclusion

Spiking Neural Networks represent a convergence of neuroscience inspiration and engineering innovation. From the classical LIF model, which remains the workhorse of practical SNN implementations, to the recently proposed MSF model with its multi-threshold encoding capabilities [Fan et al., 2025], the field continues to advance through contributions from research groups worldwide. Hardware breakthroughs from Fraunhofer IIS and Peking University demonstrate the feasibility of energy-efficient neuromorphic systems, while applications in remote sensing and beyond showcase practical utility.

The journey from biological inspiration to practical deployment is far from complete. Challenges remain in training algorithms, hardware integration, and theoretical understanding. Yet the rapid pace of progress—exemplified by whole-brain simulation at the insect scale and memristor-based learning hardware—suggests that SNNs will play an increasingly important role in the future of artificial intelligence, particularly for applications demanding energy efficiency and temporal processing capabilities.


References

  1. Springer. (2026). A Four-Stage Structural Evolution Framework for Spiking Neural Networks: A Review and Perspective from Binary ANN to Event-Driven Models. Neural Processing Letters, 58, 14.

  2. Li, S., Xie, X., Zhan, Q., Wang, L., Deng, Y., & Liu, G. (2026). Sparsely Timing the Change: A Spiking Temporal Framework for Remote Sensing Interpretation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2026.

  3. Dou, X., et al. (2026). Numerical analysis for leaky-integrate-fire networks under Euler–Maruyama. arXiv preprint arXiv:2603.10854.

  4. BrainPy Documentation. (2026). gif_psc_exp_multisynapse – Current-based generalized integrate-and-fire neuron model with multiple synaptic time constants.

  5. Springer. (2026). Impact of Neuron Models on Spiking Neural Network Performance: A Complexity-based Classification Approach. Neuroinformatics, 24, 5.

  6. Dang, B., Yang, Y., et al. (2026). Spiking neural networks with fatigue spike-timing-dependent plasticity learning using hybrid memristor arrays. Nature Electronics.

  7. NEST Simulator Documentation. (2026). gif_psc_exp_multisynapse – Current-based generalized integrate-and-fire neuron model with multiple synaptic time constants.

  8. Fraunhofer IIS. (2026). Spiking Performance in AI Applications: Transforming Deep Neural Networks into Spiking Neural Networks.

  9. Fan, L., Shen, H., Lian, X., Li, Y., Yao, M., Li, G., & Hu, D. (2025). A multisynaptic spiking neuron for simultaneously encoding spatiotemporal dynamics. Nature Communications, 16, 7155. DOI: 10.1038/s41467-025-62251-6

  10. Ullah, S., Koravuna, S., Rückert, U., & Jungeblut, T. (2026). A comprehensive analysis of spiking neural networks and their mathematical models. Frontiers in Computational Neuroscience.
