BindsNET: Spiking neural networks in PyTorch
We are developing a Python package for simulating spiking neural networks (SNNs), built on top of the PyTorch neural networks library. Our own focus is on applying SNNs to machine learning (ML) problems, but the code can be used for any purpose (machine learning, biological neural network simulation, etc.). There are several advantages to building an SNN library on top of PyTorch:
- The flexible torch.Tensor object (a GPU-capable analogue of numpy.ndarray) implements a wide range of linear algebra operations on both CPUs and GPUs (see the sketch after this list).
- The torch.nn library provides numerous efficient neural network layer operations suitable for constructing spiking neural networks.
- PyTorch syntax is user-friendly, easy to read, and compact, leading to simple and extensible implementations of spiking neural network components.
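To illustrate the first point, the following is a minimal, hypothetical sketch (not BindsNET's internal implementation) of a single leaky integrate-and-fire update step expressed with torch.Tensor operations; the constants (`v_thresh`, `v_reset`, `decay`) and sizes are illustrative assumptions, and the same code runs unchanged on CPU or GPU.

```python
import torch

# Device-agnostic tensor math: the same code runs on CPU or GPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

n_input, n_neurons = 100, 50
w = torch.rand(n_input, n_neurons, device=device)  # random synaptic weights
v = torch.zeros(n_neurons, device=device)          # membrane potentials
v_thresh, v_reset, decay = 1.0, 0.0, 0.95          # hypothetical LIF constants

# One simulation step: decay potentials, integrate incoming spikes, fire, reset.
input_spikes = (torch.rand(n_input, device=device) < 0.1).float()  # Bernoulli input
v = decay * v + input_spikes @ w                    # leaky integration of weighted input
output_spikes = (v >= v_thresh).float()             # threshold crossing produces spikes
v = torch.where(output_spikes.bool(), torch.full_like(v, v_reset), v)  # reset fired neurons
```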
The BindsNET package is the first of its kind: a Python package implementing machine learning-oriented spiking neural networks that run seamlessly on CPU and GPU hardware. In the interest of computational efficiency, only relatively simple neuron and synapse objects are provided; however, users may implement any desired neuronal and synaptic dynamics thanks to the library's easily extensible and modular structure. BindsNET includes submodules for constructing SNNs, loading datasets and encoding them into spike trains, generic plotting functionality, and evaluating network performance.
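The sketch below shows how these submodules are intended to fit together: a small input-to-LIF network is built, a data vector is encoded into a Poisson spike train, and the network is simulated for a fixed duration. It is based on BindsNET's documented interface, but exact argument names have changed across releases (e.g., the keyword for input spikes has appeared as both `inputs` and `inpts`), so treat the details as assumptions rather than a definitive listing.

```python
import torch
from bindsnet.network import Network
from bindsnet.network.nodes import Input, LIFNodes
from bindsnet.network.topology import Connection
from bindsnet.encoding import poisson

# Build a two-layer network: 100 input nodes fully connected to 50 LIF neurons.
network = Network()
source = Input(n=100)
target = LIFNodes(n=50)
network.add_layer(source, name="X")
network.add_layer(target, name="Y")
network.add_connection(
    Connection(source=source, target=target, w=0.1 * torch.rand(100, 50)),
    source="X",
    target="Y",
)

# Encode an arbitrary vector of firing rates (in Hz; values here are made up)
# into a 250 ms Poisson spike train, then run the simulation.
rates = 120 * torch.rand(100)
spikes = poisson(rates, time=250)
network.run(inputs={"X": spikes}, time=250)  # older releases use the keyword 'inpts'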