Exact Event-Driven Training for Efficient Neuromorphic Computing
This paper introduces an analytical, event-driven learning framework for training spiking neural networks (SNNs) that computes exact gradients for synaptic weights, transmission delays, and adaptive firing thresholds, improving accuracy and efficiency.
Why it matters
This work demonstrates that aligning neuron dynamics and training dynamics with event-sparse execution can simultaneously improve functional performance and resource efficiency in neuromorphic computing systems.
Key Points
- Proposes an event-driven training method for SNNs that computes exact gradients
- Enables precise control of synaptic weights, transmission delays, and firing thresholds
- Reduces on-chip memory traffic by up to 24x compared to dense time-step simulations
- Improves accuracy by up to 7% over surrogate-gradient baselines
- Enhances spike-timing precision and resilience to hardware noise
Details
Spiking neural networks (SNNs) offer potential efficiency gains over traditional neural networks by communicating with sparse, event-driven spikes rather than dense numerical activations. However, most SNN training pipelines rely on surrogate-gradient approximations or require dense time-step simulations, which conflict with the constraints of neuromorphic hardware and blur precise spike timing.

This paper introduces an analytical, event-driven learning framework that computes exact gradients for synaptic weights, programmable transmission delays, and adaptive firing thresholds: three orthogonal temporal controls that jointly shape SNN accuracy and robustness. By propagating error signals only at spike events and integrating subthreshold dynamics in closed form, the method eliminates the need to store membrane-potential traces and reduces on-chip memory traffic by up to 24x.

Across multiple sequential event-stream benchmarks, the framework improves accuracy by up to 7% over a strong surrogate-gradient baseline, while sharpening spike-timing precision and enhancing resilience to injected hardware noise.
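To make the event-driven idea concrete, here is a minimal sketch of closed-form subthreshold integration for a leaky integrate-and-fire (LIF) neuron: the membrane potential is advanced analytically from one input spike to the next, so no per-time-step loop or membrane-potential trace is needed. This is an illustrative toy, not the paper's actual method; the exponential-decay dynamics, parameter values (`TAU_M`, `V_TH`), and instantaneous-reset behavior are all assumptions for the example.

```python
import math

TAU_M = 20.0   # membrane time constant (ms); illustrative value, not from the paper
V_TH = 1.0     # firing threshold; illustrative value

def advance(v: float, dt: float) -> float:
    """Decay the membrane potential analytically over dt (closed form,
    no dense time stepping)."""
    return v * math.exp(-dt / TAU_M)

def run(events):
    """Simulate one LIF neuron driven by input spikes.

    events: list of (time, weight) pairs sorted by time.
    Returns the output spike times. State is updated only at events,
    so nothing between events ever has to be stored or simulated."""
    v, t_last, out = 0.0, 0.0, []
    for t, w in events:
        v = advance(v, t - t_last)   # jump directly to the event time
        v += w                       # instantaneous synaptic contribution
        if v >= V_TH:                # threshold crossing at the event
            out.append(t)
            v = 0.0                  # hard reset (an assumed simplification)
        t_last = t
    return out
```

Because the subthreshold trajectory has a closed form, an exact gradient of an output spike time with respect to a weight or delay can in principle be derived by differentiating through these analytic updates at the event times, which is the kind of computation the paper's framework performs.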