6th March 2020: We are sorry to announce that NICE 2020, originally scheduled for March 17-20, 2020, has been postponed. Please see here for the new date in March 2021.
NEUROTECH event: Future Application Directions for Neuromorphic Computing Technologies: agenda and registration (free, but mandatory). A half-day event with a special focus on potential applications of neuromorphic computing.
Travel info:
Getting to the venue:
the nearest tram stop to the meeting venue is "Heidelberg Bunsengymnasium" (marked in the map linked above) [online timetable](https://reiseauskunft.bahn.de//bin/query.exe/en?Z=Neuenheim+Bunsengymnasium,+Heidelberg), provided by German Railway. Here you can also buy tickets online
via rail from the train station directly attached to the airport, "Frankfurt Flughafen Fernbahnhof": online timetable by German Railway (tickets are also sold online via this website)
via airport shuttle service directly to the hotel. We have had good experience with TLS Heidelberg. A seat in a shared shuttle costs about 40 Euro per person per ride
Hotels:
These hotels are relatively close to the meeting venue (Kirchhoff-Institute for Physics; see the map above). Many more hotels are listed on online hotel booking sites (e.g. on booking.com)
Beyond Backprop: Different Approaches to Credit Assignment in Neural Nets
The backpropagation algorithm (backprop) has been the workhorse of neural-network learning for several decades, and its practical effectiveness is demonstrated by the recent successes of deep learning across a wide range of applications. Backprop uses chain-rule differentiation to compute the gradients consumed by state-of-the-art learning algorithms such as stochastic gradient descent (SGD) and its variants.
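As a concrete illustration of chain-rule gradient computation followed by an SGD update, here is a minimal sketch for a two-layer network. The data, shapes, and learning rate are purely illustrative, not taken from the paper:

```python
import numpy as np

# Minimal sketch: backprop via the chain rule for a 2-layer net,
# followed by one SGD step. All sizes are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))           # mini-batch of 8 inputs
y = rng.normal(size=(8, 1))           # regression targets
W1 = rng.normal(size=(4, 5)) * 0.1    # first-layer weights
W2 = rng.normal(size=(5, 1)) * 0.1    # second-layer weights
lr = 0.01                             # SGD learning rate

# Forward pass
h = np.tanh(X @ W1)                   # hidden activations
y_hat = h @ W2                        # network output
loss = 0.5 * np.mean((y_hat - y) ** 2)

# Backward pass: apply the chain rule layer by layer
g_out = (y_hat - y) / len(X)          # dL/dy_hat
g_W2 = h.T @ g_out                    # dL/dW2
g_h = g_out @ W2.T                    # gradient propagated to hidden layer
g_pre = g_h * (1 - h ** 2)            # tanh'(x) = 1 - tanh(x)^2
g_W1 = X.T @ g_pre                    # dL/dW1

# One SGD update
W1 -= lr * g_W1
W2 -= lr * g_W2
```

Note how each layer's gradient is computed from the one above it, which is exactly the sequential dependency that prevents weight updates from being parallelized across layers.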
However, backprop also has several drawbacks, including vanishing and exploding gradients, an inability to handle non-differentiable nonlinearities or to parallelize weight updates across layers, and biological implausibility. These limitations continue to motivate the exploration of alternative training algorithms, including several recently proposed auxiliary-variable methods that break the complex nested objective function into local subproblems. However, those techniques are mainly offline (batch), which limits their applicability to extremely large datasets, as well as to online, continual, or reinforcement learning.
The main contribution of our work is a novel online (stochastic/mini-batch) alternating minimization (AM) approach for training deep neural networks, together with the first theoretical convergence guarantees for AM in stochastic settings and promising empirical results on a variety of architectures and datasets.
NICE 2020, Tutorials day: NOTE: NICE will be POSTPONED!
The tutorial day can be booked as one of the registration options. On the tutorial day, hands-on interactive tutorials with several different neuromorphic compute systems will be offered:
Intel Loihi platform tutorial (lecture style; to follow along from your own laptop you need to engage with Intel's Neuromorphic Research Community beforehand; email inrc_interest@intel.com for more information).