Department of Statistics
Florida State University
"Online Learning with Model Selection"
499 DSL
Abstract:
Current online learning methods suffer from issues such as slower convergence rates and a limited ability to recover the support of the true features compared with their offline counterparts. In this talk, we present a novel framework for online learning based on running averages and introduce online versions of popular offline methods such as the Elastic Net, the Minimax Concave Penalty, and Feature Selection with Annealing. We prove the equivalence between our online methods and their offline counterparts and provide theoretical guarantees of true feature recovery and convergence for some of them. In contrast to existing online methods, the proposed methods can extract models with any desired sparsity level at any time. Numerical experiments indicate that our new methods enjoy higher accuracy of true feature recovery and faster convergence than standard online and offline algorithms. We also show how the running-averages framework can be used for model adaptation in the presence of varying-coefficient models. Finally, we present applications to large datasets, where the proposed framework again shows competitive results compared with popular online and offline algorithms.
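As a rough illustration of the running-averages idea (not the speaker's exact formulation), the sketch below maintains the averaged second-moment statistics (1/n)X^T X and (1/n)X^T y from streaming mini-batches and refits an estimator from those statistics alone. Ridge regression stands in here for the talk's penalized estimators (Elastic Net, MCP, FSA); the class name `RunningAverages` and the update rule are illustrative assumptions.

```python
import numpy as np

class RunningAverages:
    """Illustrative sketch: keep running averages of the sufficient
    statistics (1/n) X^T X and (1/n) X^T y so an offline-style
    estimator can be refit at any time without storing raw samples."""

    def __init__(self, p):
        self.n = 0                   # number of samples seen so far
        self.Sxx = np.zeros((p, p))  # running average of x x^T
        self.Sxy = np.zeros(p)       # running average of x * y

    def update(self, X, y):
        """Fold a new mini-batch (X: m x p, y: length m) into the averages."""
        m = X.shape[0]
        total = self.n + m
        self.Sxx = (self.n * self.Sxx + X.T @ X) / total
        self.Sxy = (self.n * self.Sxy + X.T @ y) / total
        self.n = total

    def ridge(self, lam):
        """Ridge estimate computed from the running averages alone;
        equivalent to offline ridge (with mean-squared-error loss)
        on all data seen so far."""
        p = self.Sxx.shape[0]
        return np.linalg.solve(self.Sxx + lam * np.eye(p), self.Sxy)

# Toy usage: stream mini-batches, then query a model at any time.
rng = np.random.default_rng(0)
ra = RunningAverages(p=5)
beta_true = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
for _ in range(100):
    X = rng.normal(size=(20, 5))
    y = X @ beta_true + 0.1 * rng.normal(size=20)
    ra.update(X, y)
print(ra.ridge(lam=1e-3))  # close to beta_true
```

Because the stored statistics are O(p^2) regardless of how many samples have streamed by, a model can be refit from them at any time, which is consistent with the abstract's claim that models of any desired sparsity level can be extracted on demand.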