Time: Tuesday, Nov 28, 2017, 2:00pm @ MTH1311

Speaker: Ethan Stump (ARL)

Title: Parsimonious Online Learning with Kernels and Connections with Deep Learning

Abstract: We consider classes of machine learning problems, particularly in reinforcement learning, where algorithms and representations that place a premium on sample efficiency are critical for practical success. Though they have fallen out of favor in the wake of advances in deep neural networks, classical techniques based on Reproducing Kernel Hilbert Spaces (RKHSs) are the quintessential nonparametric approach, but they present challenges in an online learning setting because their model complexity grows with the data. We study the application of kernel techniques to online problems in two ways. First, we present a simple technique, and an associated analysis, for controlling this growth through ongoing projections, in an approach that we call Parsimonious Online Learning with Kernels (POLK). Though these results are interesting, our naïve implementations do not adequately address complex input domains such as images. So, second, we approach this problem by revisiting a dormant research area on designing hierarchical Mercer kernels for vision tasks, and discuss some recent efforts to modernize these designs and build implementations with modern neural network tools. By making use of spectral kernel approximation theory, these architectures directly connect convolutional neural networks with Mercer kernels and give a glimpse of an alternate interpretation of the parameters and hyperparameters of deep neural networks.
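To give a flavor of the first idea, the sketch below shows an online kernel regressor in which each functional stochastic gradient step adds one kernel center, and the expansion is then projected back onto a sparse dictionary. All names here are illustrative, and the pruning rule (dropping centers with negligible weight) is only a crude stand-in for the projection POLK actually performs via destructive kernel orthogonal matching pursuit:

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

class SparseOnlineKernelRegressor:
    """POLK-flavored sketch for squared loss: functional SGD grows the
    kernel expansion by one center per sample; a projection step keeps
    the dictionary small. The real algorithm projects in RKHS norm via
    destructive KOMP; here we simply drop near-zero weights."""

    def __init__(self, bandwidth=1.0, step=0.5, eps=1e-3):
        self.bandwidth, self.step, self.eps = bandwidth, step, eps
        self.centers = np.empty((0, 1))  # dictionary points
        self.weights = np.empty(0)       # kernel expansion weights

    def predict(self, x):
        if self.weights.size == 0:
            return 0.0
        k = gaussian_kernel(np.atleast_2d(x), self.centers, self.bandwidth)
        return float(k @ self.weights)

    def update(self, x, y):
        x = np.atleast_2d(x)
        # Functional SGD for squared loss: f <- f - step*(f(x)-y)*k(x,.),
        # which appends x as a new center with weight -step * error.
        err = self.predict(x) - y
        self.centers = x if self.centers.size == 0 else np.vstack([self.centers, x])
        self.weights = np.append(self.weights, -self.step * err)
        # Crude projection: keep only centers with non-negligible weight.
        mask = np.abs(self.weights) >= self.eps
        if mask.any():
            self.centers, self.weights = self.centers[mask], self.weights[mask]
```

Without the pruning step, the dictionary would grow by one center per observed sample, which is exactly the complexity blow-up the abstract refers to.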
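The second idea rests on spectral kernel approximation. A standard example (not necessarily the construction used in the talk) is random Fourier features: by Bochner's theorem, a shift-invariant Mercer kernel is the expectation of cosine features with frequencies drawn from its spectral density, so the feature map is literally a one-layer network with fixed random weights and a cosine nonlinearity, hinting at the kernel-to-network connection:

```python
import numpy as np

def random_fourier_features(X, n_features=512, bandwidth=1.0, rng=None):
    """Random features whose inner products approximate the Gaussian
    kernel exp(-||x - y||^2 / (2 * bandwidth^2)). For this kernel,
    Bochner's theorem says the spectral density is Gaussian with
    standard deviation 1/bandwidth."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / bandwidth, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)          # random phases
    # A "layer" with fixed random weights W, bias b, cosine activation.
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

With enough features, `Z @ Z.T` converges to the exact Gaussian kernel matrix; making such weights learnable rather than sampled is one way to read network parameters through a kernel lens.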