Fast cross-validation

Yong Liu, Hailun Lin, Lizhong Ding, Weiping Wang, Shizhong Liao

Research output: Conference contribution (chapter in conference proceedings), peer-reviewed



Cross-validation (CV) is the most widely adopted approach to selecting the optimal model. However, CV is computationally expensive because it requires training the learner multiple times, which makes it impractical for large-scale model selection. In this paper, we present an approximate approach to CV for kernel methods, based on the theoretical notion of the Bouligand influence function (BIF) and the Nyström method. We first establish the relationship between the BIF and CV, and propose a method to approximate CV via a Taylor expansion of the BIF. We then provide a novel method for computing the BIF under a general distribution, and evaluate it for the empirical sample distribution. Finally, we use the Nyström method to accelerate the computation of the BIF matrix, yielding the final approximate CV criterion. The proposed approximate CV requires training only once and is suitable for a wide variety of kernel methods. Experimental results on numerous datasets show that our approximate CV has no statistical discrepancy from the original CV, yet significantly improves efficiency.
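As background for the Nyström acceleration step mentioned in the abstract, the sketch below shows the standard Nyström method for low-rank kernel matrix approximation: sample m landmark points and factor the full n×n kernel matrix as K ≈ LLᵀ at O(nm²) cost instead of O(n²). This is a generic illustration of the technique, not the authors' implementation; the function names, the RBF kernel choice, and all parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def nystrom_factor(X, m, gamma=0.5, seed=0):
    """Nystrom approximation of the full kernel matrix K = k(X, X).

    Samples m landmark rows and returns a factor L (n x m) with
    K ~= L @ L.T, avoiding ever forming or inverting the full n x n matrix
    beyond the n x m cross-kernel block.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)       # n x m cross-kernel block
    W = C[idx, :]                          # m x m kernel among landmarks
    # Pseudo-inverse square root of W via eigendecomposition (W is PSD).
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10                    # drop numerically zero directions
    W_inv_sqrt = vecs[:, keep] @ np.diag(vals[keep] ** -0.5) @ vecs[:, keep].T
    return C @ W_inv_sqrt                  # n x m factor L, so K ~= L @ L.T

# Usage: measure the relative approximation error against the exact matrix.
X = np.random.default_rng(1).normal(size=(200, 2))
K = rbf_kernel(X, X)
L = nystrom_factor(X, m=100)
rel_err = np.linalg.norm(K - L @ L.T) / np.linalg.norm(K)
```

Because the residual K − LLᵀ is the Schur complement of the landmark block, it is positive semidefinite, so the relative error above is always below 1 and shrinks as m grows.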

Original language: English (US)
Title of host publication: Proceedings of the 27th International Joint Conference on Artificial Intelligence, IJCAI 2018
Editors: Jerome Lang
Publisher: International Joint Conferences on Artificial Intelligence
Number of pages: 7
ISBN (Electronic): 9780999241127
State: Published - Jan 1 2018
Event: 27th International Joint Conference on Artificial Intelligence, IJCAI 2018 - Stockholm, Sweden
Duration: Jul 13 2018 - Jul 19 2018

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823


Conference: 27th International Joint Conference on Artificial Intelligence, IJCAI 2018

ASJC Scopus subject areas

  • Artificial Intelligence
