Shin, Seung Jun and Artemiou, Andreas (ORCID: https://orcid.org/0000-0002-7501-4090) 2017. Penalized principal logistic regression for sparse sufficient dimension reduction. Computational Statistics & Data Analysis 111, pp. 48-58. doi: 10.1016/j.csda.2016.12.003
Abstract
Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of the predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desired to achieve variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and extend it to a penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods.
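To make the idea concrete, below is a minimal sketch of the unpenalized PLR step, not the paper's estimator: it assumes standardized predictors, response slicing at interior quantiles, and uses scikit-learn's `LogisticRegression` as a stand-in for the paper's objective; the max-SCAD penalty that yields the sparse (penalized) version and its oracle property is omitted. The function name `principal_logistic_regression` and all defaults are illustrative assumptions.

```python
# Minimal sketch of (unpenalized) principal logistic regression for SDR.
# Assumptions, not the paper's exact estimator: predictors are standardized,
# slicing points are interior response quantiles, and scikit-learn's
# LogisticRegression stands in for the population-level logistic objective.
# The max-SCAD penalty used for sparse SDR is not implemented here.
import numpy as np
from sklearn.linear_model import LogisticRegression

def principal_logistic_regression(X, y, d=1, n_slices=5):
    """Estimate a d-dimensional basis of the central subspace (sketch)."""
    n, p = X.shape
    # Standardize predictors so directions are estimated on the z-scale.
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sigma

    # Slice the response at interior quantiles; each cut point defines one
    # binary problem I(y <= c) and contributes one coefficient vector.
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1])
    M = np.zeros((p, p))
    for c in cuts:
        labels = (y <= c).astype(int)
        clf = LogisticRegression(C=1.0, max_iter=1000).fit(Z, labels)
        beta = clf.coef_.ravel()
        M += np.outer(beta, beta)   # accumulate the working directions

    # Leading eigenvectors of M span the estimated central subspace.
    eigval, eigvec = np.linalg.eigh(M)
    B = eigvec[:, ::-1][:, :d]      # top-d directions on the z-scale
    return B / sigma[:, None]       # map back to the original predictor scale

# Usage: X is an (n, p) predictor matrix, y a continuous response;
# B = principal_logistic_regression(X, y, d=2) gives candidate SDR directions.
```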
| Item Type: | Article |
|---|---|
| Date Type: | Publication |
| Status: | Published |
| Schools: | Schools > Mathematics |
| Subjects: | Q Science > QA Mathematics |
| Uncontrolled Keywords: | Max-SCAD penalty; Principal logistic regression; Sparse sufficient dimension reduction; Sufficient dimension reduction |
| Publisher: | Elsevier |
| ISSN: | 0167-9473 |
| Date of First Compliant Deposit: | 9 December 2016 |
| Date of Acceptance: | 5 December 2016 |
| Last Modified: | 18 Jan 2025 22:45 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/96679 |
Citation Data
Cited 10 times in Scopus.