Beyond Neuromorphics: Non-Cognitive Applications of SpiNNaker2
Christian Mayr (TU Dresden)
15:00‑15:30 (30 min)
15:30‑15:40 (10+5 min)
Online training of quantized weights on neuromorphic hardware with multiplexed gradient descent
Authors: Adam McCaughan, Cory Merkel, Bakhrom Oripov, Andrew Dienstfrey, Sae Woo Nam and Sonia Buckley.
Adam McCaughan (NIST)
15:45‑16:10 (25+5 min)
NEO: Neuron State Dependent Mechanisms for Efficient Continual Learning
Authors: Anurag Daram and Dhireesha Kudithipudi.
Continual learning is challenging for deep neural networks, mainly because of catastrophic forgetting, the tendency for accuracy on previously trained tasks to drop when new tasks are learned. Although several biologically inspired techniques have been proposed for mitigating catastrophic forgetting, they typically require additional memory and/or computational overhead. Here, we propose a novel regularization approach that combines neuronal activation-based importance measurement with neuron state-dependent learning mechanisms to alleviate catastrophic forgetting in both task-aware and task-agnostic scenarios. We introduce a neuronal state-dependent mechanism driven by neuronal activity traces and selective learning rules, with storage requirements for regularization parameters that grow asymptotically slower with network size, compared to schemes that calculate weight importance, whose storage grows quadratically. The proposed model, NEO, achieves performance comparable to other state-of-the-art regularization-based approaches to catastrophic forgetting, while operating with a reduced memory overhead.
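The storage argument in the abstract can be made concrete with a small sketch. The code below is a hypothetical illustration of the general idea, not the authors' NEO implementation: a per-neuron activity trace (O(n) storage) is used to gate weight updates into "important" neurons, in contrast to per-weight importance matrices (O(n²) storage). All names, decay constants, and thresholds here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: per-neuron activity traces as an importance signal.
# Storage is O(n) in the number of neurons, versus O(n^2) for schemes that
# keep a per-weight importance matrix.

def update_trace(trace, activations, decay=0.9):
    """Exponentially decayed running trace of neuron activations."""
    return decay * trace + (1.0 - decay) * np.abs(activations)

def gate_updates(grad, trace, threshold=0.5):
    """Suppress weight updates into neurons with high activity traces,
    protecting what those neurons learned on earlier tasks."""
    mask = (trace < threshold).astype(grad.dtype)  # 1 where learning is allowed
    return grad * mask[:, None]  # gate each neuron's row of incoming weights

rng = np.random.default_rng(0)
trace = np.zeros(4)
for _ in range(50):
    trace = update_trace(trace, rng.standard_normal(4))

grad = rng.standard_normal((4, 3))
gated = gate_updates(grad, trace)
```

Each element of `gated` is either the original gradient entry (neuron still plastic) or zero (neuron protected); only the length-n trace vector persists across tasks.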
Anurag Daram (UTSA)
16:15‑16:25 (10+5 min)
Impact of Noisy Input on Evolved Spiking Neural Networks for Neuromorphic Systems
Authors: Karan Patel and Catherine Schuman.
Karan Patel (University of Tennessee Knoxville)
16:30‑16:35 (5 min)
Spotlight: Intel Neuromorphic Deep Noise Suppression Challenge
16:35‑17:30 (55 min)
Open mic / discussions
End of the first day of NICE
17:30‑18:00 (30 min)
(self organised travel to San Antonio downtown)
18:00‑20:30 (150 min)
Welcome reception in San Antonio downtown
(Travel on your own)
Wednesday, 12 April 2023
NICE 2023 - day 2
08:00‑08:30 (30 min)
08:30‑09:15 (45+5 min)
Keynote: Versatility, Efficiency, and Resilience in Large-Scale Neuromorphic Intelligence at the Edge
An Introduction to a Simulator for Superconducting Optoelectronic Networks (Sim-SOENs)
This tutorial aims to impart a functional understanding of Sim-SOENs. Starting with the computational building blocks of SOEN neurons, we will cover the nuances and processing power of single dendrites, before building up to dendritic arbors within complex neuron structures. We will find it is straightforward to implement arbitrary neuron structures and even dendrite-based logic operations. Even at this single-neuron level, we will already demonstrate efficacy on basic computational tasks. From there we will scale to network simulations of many-neuron systems, again with demonstrative use cases. By the end of the tutorial, participants should be able to easily generate custom SOEN neuron structures and networks. These lessons will apply directly to research in the computational paradigm that is to be instantiated on the burgeoning hardware of SOENs. Format: Examples and instructions will be given in the form of Jupyter Notebook tutorials (already well into development). If it is conducive to the conference environment, these notebooks may be available for download and use in real time. If so, practice exercises can be derived for active learning.
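To give a flavor of the dendrite-based logic mentioned above, here is a toy sketch of a dendrite as a thresholded leaky integrator, which behaves as an AND gate when the threshold exceeds the response to either input alone. This is not Sim-SOENs' actual API or dynamics; the leak rate and threshold are illustrative assumptions.

```python
# Toy sketch: a dendrite modeled as a leaky integrator. With leak = 0.8,
# the steady-state signal is drive / (1 - leak) = 5 * drive, so one active
# input settles near 5 and two active inputs near 10; a threshold of 7
# therefore implements AND.

def dendrite_response(inputs, leak=0.8, steps=20):
    """Drive a leaky integrator with a constant summed input and return
    its (near steady-state) signal after `steps` updates."""
    s = 0.0
    drive = sum(inputs)
    for _ in range(steps):
        s = leak * s + drive
    return s

def dendritic_and(a, b, threshold=7.0):
    """Fires only when both inputs are active."""
    return dendrite_response([a, b]) >= threshold
```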
N2A -- An IDE for neural modeling
N2A is a tool for editing and simulating large-scale/complex neural models. These are written in a simple equation language with object-oriented features that support component creation and reuse. The tool compiles these models for various hardware targets ranging from neuromorphic devices to supercomputers. Format: The first hour will provide a general introduction to the integrated development environment (IDE) and cover basic use cases: model editing, running a simulation, sharing models via Git, and running parameter sweeps.
The second hour will cover the basic LIF class hierarchy, techniques for designing your own component set, and integration with Sandia's Fugu tool. Special Requirements: This will be a hands-on tutorial. N2A may be downloaded from https://github.com/frothga/n2a and run on your personal laptop.
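For orientation, the kind of leaky integrate-and-fire (LIF) model underlying the class hierarchy mentioned above can be sketched in a few lines. This is plain Python rather than N2A's equation language, and all parameter names and values are illustrative assumptions.

```python
import numpy as np

# Illustrative LIF neuron: Euler integration of dV/dt = (-V + I) / tau,
# with a threshold crossing triggering a spike and a reset.

def simulate_lif(current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF neuron driven by an input current array.
    Returns the membrane-potential trace and spike-time indices."""
    v, trace, spikes = v_reset, [], []
    for t, i in enumerate(current):
        v += dt * (-v + i) / tau
        if v >= v_thresh:      # threshold crossing: record spike, reset
            spikes.append(t)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# Constant suprathreshold drive produces regular spiking.
trace, spikes = simulate_lif(np.full(200, 1.5))
```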
A hands-on tutorial for online interactive use of the BrainScaleS neuromorphic compute system: from the first log-in via the EBRAINS Collaboratory to interactive emulation of small spiking neural networks. This hands-on tutorial is especially suitable for beginners (more advanced attendees are welcome as well). We are going to use the BrainScaleS tutorial notebooks for this event.
To use the BrainScaleS system during the tutorial (and independently of the tutorial for your own research, free of charge for evaluation), an EBRAINS account (also free of charge) is needed (get an EBRAINS account here).
More info on how to get started using BrainScaleS. Format: Introductory presentation, followed by interactive hands-on tutorials. Attendees can use a web browser on their own laptops to execute and change the provided tutorials and explore on their own. Attendees will be able to continue accessing the systems with a generous test quota after the event.
Fugu Introductory Tutorial
The tutorial will cover the basic design and practice of Fugu, a software package for composing spiking neural algorithms. We will begin with an introductory presentation on the motivation, design, and limitations of Fugu. Then, we will do two interactive deep-dive tutorials using Jupyter notebooks. The first will cover how to use Fugu with pre-existing components, which we call Bricks. The second will cover how to build a custom Brick to perform a particular algorithm; in this case, the algorithm we choose will be an 80-20 network. Format: Interactive
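As background for the custom-Brick exercise, an 80-20 network is a recurrent network with 80% excitatory and 20% inhibitory neurons. The sketch below builds such a connectivity matrix in plain NumPy rather than through Fugu's Brick API; the network size, connection probability, and weight scale are illustrative assumptions.

```python
import numpy as np

# Sketch of 80-20 connectivity: the first 80% of neurons project positive
# (excitatory) weights, the last 20% project negative (inhibitory) weights.

def build_80_20_weights(n=100, p_connect=0.1, seed=0):
    """Return a sparse random weight matrix (rows = presynaptic neurons)
    and the number of excitatory neurons."""
    rng = np.random.default_rng(seed)
    n_exc = int(0.8 * n)                   # first 80% excitatory
    signs = np.ones(n)
    signs[n_exc:] = -1.0                   # last 20% inhibitory
    mask = rng.random((n, n)) < p_connect  # sparse random connectivity
    np.fill_diagonal(mask, False)          # no self-connections
    weights = mask * rng.random((n, n)) * signs[:, None]
    return weights, n_exc

W, n_exc = build_80_20_weights()
```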