Treder, Matthias S. (ORCID: https://orcid.org/0000-0001-5955-2326) 2018. Cross-validation in high-dimensional spaces: a lifeline for least-squares models and multi-class LDA. [Online]. arXiv. Available at: http://arxiv.org/abs/1803.10016
Abstract
Least-squares models such as linear regression and Linear Discriminant Analysis (LDA) are amongst the most popular statistical learning techniques. However, since their computation time increases cubically with the number of features, they are inefficient in high-dimensional neuroimaging datasets. Fortunately, for k-fold cross-validation, an analytical approach has been developed that yields the exact cross-validated predictions in least-squares models without explicitly training the model. Its computation time grows with the number of test samples. Here, this approach is systematically investigated in the context of cross-validation and permutation testing. LDA is used as an exemplar, but the results hold for all other least-squares methods. Furthermore, a non-trivial extension to multi-class LDA is formally derived. The analytical approach is evaluated using complexity calculations, simulations, and permutation testing of an EEG/MEG dataset. Depending on the ratio between features and samples, the analytical approach is up to 10,000x faster than the standard approach (retraining the model on each training set). This allows for a fast cross-validation of least-squares models and multi-class LDA in high-dimensional data, with obvious applications in multi-dimensional datasets, Representational Similarity Analysis, and permutation testing.
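To illustrate the general principle behind such analytical cross-validation, the sketch below shows the well-known hat-matrix shortcut for leave-one-out predictions in a ridge-regularised least-squares model and checks it against explicit retraining. This is only a minimal illustration of the idea, not the paper's k-fold or multi-class LDA derivation; all variable names and the penalty `lam` are assumptions made for the example.

```python
import numpy as np

# Minimal sketch (not the paper's exact derivation): for a fixed-penalty
# least-squares model, leave-one-out predictions follow analytically from
# the hat matrix, with no retraining per fold.

rng = np.random.default_rng(0)
n, p, lam = 50, 200, 1.0                      # more features than samples
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

# Hat matrix H = X (X'X + lam*I)^{-1} X' of the regularised fit
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
y_fit = H @ y                                  # fitted values on the full data
h = np.diag(H)

# Analytical leave-one-out predictions: one linear solve, no retraining
y_loo_fast = (y_fit - h * y) / (1.0 - h)

# Standard approach: retrain the model on every training set
y_loo_naive = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    w = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    y_loo_naive[i] = X[i] @ w

print(np.allclose(y_loo_fast, y_loo_naive))    # True: identical predictions
```

The shortcut replaces n separate model fits with a single solve, which is where the large speed-ups reported in the abstract come from; the paper extends this idea to k-fold cross-validation, permutation testing, and multi-class LDA.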
| Item Type: | Website Content |
| --- | --- |
| Date Type: | Submission |
| Status: | Submitted |
| Schools: | Computer Science & Informatics |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Publisher: | arXiv |
| Last Modified: | 23 Nov 2024 00:15 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/115994 |