Talk page
Title:
The Bootstrap Learning Algorithm
Speaker:
Abstract:
Neural networks are typically constructed and trained with variants of Stochastic Gradient Descent (SGD), augmented with adaptations that accelerate the gradient search and aid convergence. For certain classes of functions, these existing algorithms require a large number of observations to reach high accuracy. We pursue a different, non-curve-tracking technique with the potential for faster convergence. This talk introduces the idea of 'decoupling' hidden layers by bootstrapping and applying linear stochastic approximation. By working with resampled observations, the procedure converges quickly and needs fewer data points. The proposed bootstrap learning algorithm thus delivers fast, accurate estimates, approximating these function classes with a fraction of the observations required by traditional neural network training methods.
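The abstract does not spell out the construction, so the following is only a rough illustrative sketch, not the speaker's algorithm. It assumes one possible reading of "decoupling": the hidden layer is frozen at random weights, so only the linear output layer remains to be estimated, and that estimate is driven by linear stochastic approximation (a Robbins-Monro / LMS recursion) over bootstrap resamples of a small training set. All names, sizes, and step-size choices below are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression target: a smooth function of one variable.
def target(x):
    return np.sin(3 * x) + 0.5 * x

# Deliberately small training set: the abstract claims the method
# needs only a fraction of the usual number of observations.
n, width = 50, 200
X = rng.uniform(-1, 1, size=(n, 1))
y = target(X[:, 0]) + 0.05 * rng.standard_normal(n)

# "Decouple" the hidden layer (one hypothetical interpretation):
# freeze random first-layer weights, leaving a linear problem in the
# output weights.
W = rng.standard_normal((1, width))
b = rng.uniform(-1, 1, width)
H = np.tanh(X @ W + b)                # fixed hidden features

# Linear stochastic approximation on bootstrap resamples: each step
# draws one observation with replacement and takes a gradient step on
# the squared error of the linear output layer.
beta = np.zeros(width)
for t in range(1, 20001):
    i = rng.integers(n)               # bootstrap draw, with replacement
    h = H[i]
    r = y[i] - h @ beta               # residual at the resampled point
    beta += (1.0 / t**0.75) * r * h   # decaying Robbins-Monro step size

# Evaluate on a test grid.
Xt = np.linspace(-1, 1, 200)[:, None]
pred = np.tanh(Xt @ W + b) @ beta
print("test RMSE:", np.sqrt(np.mean((pred - target(Xt[:, 0]))**2)))

The decaying step size 1/t^0.75 is what makes the recursion a Robbins-Monro scheme; because the decoupled problem is linear in beta, each update is cheap and no backpropagation through the hidden layer is needed.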
Link:
Workshop: