Authors: Xiang He, Dongcheng Zhao, Yang Li, Guobin Shen, Qingqun Kong, Yi Zeng
Published on: March 23, 2023
Impact Score: 8.22
Arxiv code: 2303.13077
Summary
- What is new: The introduction of a knowledge transfer loss comprising domain alignment loss and spatio-temporal regularization, alongside a sliding training strategy for improving SNN generalization on event-based datasets.
- Why this is important: Spiking neural networks tend to overfit and underperform on event-based datasets, which are smaller and more sparsely annotated than static-image datasets.
- What the research proposes: Using static images to assist SNN training on event data, with domain alignment and spatio-temporal regularization to address the inconsistent feature distributions between the two modalities (a minimal code sketch follows this summary).
- Results: Improved performance on neuromorphic datasets (N-Caltech101, CEP-DVS, N-Omniglot) over state-of-the-art methods.
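To make the knowledge transfer loss concrete, here is a minimal PyTorch-style sketch of how the two terms described above could be combined. The function names (domain_alignment_loss, spatio_temporal_regularization, knowledge_transfer_loss), the mean-feature MSE used for domain alignment, the temporal-consistency penalty, and the weights lambda_da and lambda_st are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def domain_alignment_loss(static_feat, event_feat):
    # Hypothetical domain alignment term: align the mean feature
    # statistics of the static-image branch and the event branch.
    # static_feat, event_feat: tensors of shape [B, D].
    return F.mse_loss(static_feat.mean(dim=0), event_feat.mean(dim=0))

def spatio_temporal_regularization(event_feat_per_step):
    # Hypothetical spatio-temporal regularizer: penalize large changes
    # between features of adjacent SNN time steps.
    # event_feat_per_step: tensor of shape [T, B, D].
    diffs = event_feat_per_step[1:] - event_feat_per_step[:-1]
    return diffs.pow(2).mean()

def knowledge_transfer_loss(static_feat, event_feat_per_step,
                            lambda_da=1.0, lambda_st=0.1):
    # Combine the two terms; the weights are illustrative defaults.
    event_feat = event_feat_per_step.mean(dim=0)  # average over time steps
    l_da = domain_alignment_loss(static_feat, event_feat)
    l_st = spatio_temporal_regularization(event_feat_per_step)
    return lambda_da * l_da + lambda_st * l_st
```

In practice this loss would be added to the usual classification loss on the event data, with static-image features coming from a branch trained (or pretrained) on the image domain.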
Technical Details
Technological frameworks used: Not specified
Models used: Spiking Neural Networks (SNNs)
Data used: Static images and event-based neuromorphic datasets (N-Caltech101, CEP-DVS, N-Omniglot)
Potential Impact
Neuromorphic computing markets and AI/machine-learning companies, particularly those developing or deploying spiking neural networks.
Want to implement this idea in a business?
We have generated a startup concept here: NeuroShift.