Jubilee 2019

Alex Powell

Vanderbilt University

Title:

Training low-bit neural networks

Abstract:

We discuss the problem of training neural networks with low-bit weights. This is motivated by applications where neural networks are trained on memory-constrained platforms. Our approach is based on stochastic Markov gradient descent (SMGD) and utilizes only low-bit weight vectors at every stage of the training process. We prove theoretical error bounds for SMGD and also show that the approach performs well numerically. This is joint work with Jon Ashbrock.
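
To give a rough sense of the kind of update such a scheme might use, here is a minimal NumPy sketch, not the algorithm from the talk itself: each weight is kept on a fixed low-precision grid and, at every step, moves one grid point against its gradient with a probability scaled by the gradient's size, so the iterates stay low-bit throughout training. The function name, step size, and grid spacing below are illustrative assumptions.

# Illustrative sketch only (assumed details, not the authors' exact SMGD):
# weights live on the grid delta * Z, and each coordinate takes a random
# one-grid-point step against the gradient, so no full-precision weights
# are ever stored during training.
import numpy as np

def quantized_step(w, grad, lr, delta, rng):
    """One stochastic update that keeps w on the grid delta * Z."""
    # Probability of moving one grid point, capped at 1.
    prob = np.minimum(1.0, lr * np.abs(grad) / delta)
    # Bernoulli draw per coordinate: move or stay put.
    move = (rng.random(w.shape) < prob).astype(w.dtype)
    # Step one grid point in the negative gradient direction.
    return w - delta * np.sign(grad) * move

# Toy usage: minimize ||w - target||^2 while staying on a coarse grid.
rng = np.random.default_rng(0)
delta = 0.125                          # grid spacing (a few bits of precision)
w = np.zeros(5)
target = np.array([0.5, -0.25, 0.375, 0.0, -0.625])
for _ in range(200):
    grad = 2.0 * (w - target)          # gradient of the quadratic loss
    w = quantized_step(w, grad, lr=0.05, delta=delta, rng=rng)
print(w)                               # iterates remain multiples of delta

In expectation this reproduces an ordinary gradient step whenever the step size times the gradient is smaller than the grid spacing, which is the intuition behind keeping only quantized weights while still descending the loss.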

