An Introduction to Approximate Kernel Regression Methods
Nonlinear kernel regression models are often used in statistics and machine learning because they typically offer better predictive accuracy than linear models. Variable selection for kernel regression models is challenging partly because, unlike in the linear regression setting, there is no clear concept of an effect size for regression coefficients. In this talk, we present a novel framework that provides an effect size analog for each explanatory variable in Bayesian kernel regression models when the nonlinear kernel function is shift-invariant. We also propose a unified post-hoc approach for marginal variable selection, and for the detection of significant interactions, via distributional "centrality measures" based on Kullback-Leibler (KL) divergence.
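To make the idea of an "effect size analog" concrete, the following is a minimal sketch, not the speaker's actual method: it fits a kernel ridge regression with a shift-invariant (RBF) kernel as a stand-in for the Bayesian posterior mean, then projects the fitted nonlinear function back onto the design matrix to obtain one linear effect-size summary per variable. All function names, the bandwidth, and the regularization value here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: only the first of three variables drives the response.
n, p = 100, 3
X = rng.standard_normal((n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Shift-invariant (RBF) kernel: k(x, x') = exp(-||x - x'||^2 / (2 h^2)).
def rbf_kernel(A, B, h=1.0):
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * h ** 2))

# Kernel ridge regression fit (a frequentist stand-in for the
# posterior mean of a Bayesian kernel regression model).
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + 0.1 * np.eye(n), y)
f_hat = K @ alpha  # fitted nonlinear function values at the data

# Effect size analog: project the fitted function onto the column
# space of X, i.e. beta = pinv(X) @ f_hat, giving one linear
# summary coefficient per explanatory variable.
beta = np.linalg.pinv(X) @ f_hat
print(np.abs(beta))  # the first entry should dominate
```

In this toy example the first coordinate carries all the signal, so its projected coefficient dominates the other two, illustrating how a linear summary can rank variables even though the fitted model is nonlinear.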
Lorin Crawford is an Assistant Professor of Biostatistics and a member of the Center for Statistical Sciences (CSS) and the Center for Computational Molecular Biology at Brown. His research interests involve the development of novel and efficient statistical and computational methodologies to address complex problems in quantitative genetics, cancer pharmacology, molecular genomics, and radiogenomics (i.e., cancer imaging).