Filename: On_The_Value_of_Leave-One-Out_Cross-Validation
File size: 155KB
File format: PDF
Updated: 2016-11-18 07:07:52
Keywords: leave-one-out
A long-standing problem in classification is the determination of the regularization parameter. Nearly every classification algorithm uses a parameter (or set of parameters) to control classifier complexity. Cross-validation on the training set is usually performed to determine the regularization parameter(s). [1] proved a leave-one-out cross-validation (LOOCV) bound for a class of kernel classifiers. [2] extended the bound to Regularized Least Squares Classification (RLSC). We provide the (trivial) extension to multiclass. Our contribution is empirical: we evaluate the bound's usefulness as a selector for the regularization parameter for RLSC. We find that it works extremely poorly on the data set we experimented with (20 Newsgroups); the LOOCV bound consistently selects a regularization parameter that is too large.
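As background for the selection procedure the abstract describes, the sketch below shows how exact LOOCV can pick a regularization parameter for regularized least squares. It uses the standard closed-form leave-one-out identity for ridge-type estimators (LOO residual = in-sample residual divided by 1 − H_ii, where H is the hat matrix), not the bound from [1]/[2]; the function name and candidate grid are illustrative assumptions, not from the paper.

```python
import numpy as np

def loocv_select_lambda(X, y, lams):
    """Pick the regularization parameter for regularized least squares
    classification (labels in {-1, +1}) by exact leave-one-out error.

    Uses the identity: LOO residual_i = r_i / (1 - H_ii), where
    H = X (X^T X + lam I)^{-1} X^T is the ridge hat matrix, so no model
    is ever refit n times. Illustrative sketch, not the paper's bound.
    """
    n, d = X.shape
    best_lam, best_err = None, np.inf
    for lam in lams:
        # Hat matrix for ridge regression with this candidate lambda.
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
        r = y - H @ y                        # in-sample residuals
        loo_resid = r / (1.0 - np.diag(H))   # exact LOO residuals
        loo_pred = y - loo_resid             # held-out predictions f_{-i}(x_i)
        err = np.mean(np.sign(loo_pred) != y)  # LOO classification error
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam, best_err
```

A direct LOOCV implementation would refit once per held-out point (O(n) fits per candidate lambda); the hat-matrix identity gives the same errors from a single fit, which is why cheap proxies such as the LOOCV bound the paper evaluates are attractive for larger problems.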