TITLE:
Revisiting Akaike’s Final Prediction Error and the Generalized Cross Validation Criteria in Regression from the Same Perspective: From Least Squares to Ridge Regression and Smoothing Splines
AUTHORS:
Jean Raphael Ndzinga Mvondo, Eugène-Patrice Ndong Nguéma
KEYWORDS:
Linear Model, Mean Squared Prediction Error, Final Prediction Error, Generalized Cross Validation, Least Squares, Ridge Regression
JOURNAL NAME:
Open Journal of Statistics, Vol.13 No.5, September 28, 2023
ABSTRACT: In regression, although both are aimed at estimating the Mean Squared Prediction Error (MSPE), Akaike’s Final Prediction Error (FPE) and the Generalized Cross Validation (GCV) selection criteria are usually derived from two quite different perspectives. Here, settling on the most commonly accepted definition of the MSPE as the expectation of the squared prediction error loss, we provide theoretical expressions for it that are valid for any linear model (LM) fitter, under both random and non-random designs. Specializing these MSPE expressions to each of those designs, we derive closed-form formulas of the MSPE for some of the most popular LM fitters: Ordinary Least Squares (OLS), with or without a full column rank design matrix; and Ordinary and Generalized Ridge regression, the latter embedding smoothing splines fitting. For each of these LM fitters, we then deduce a computable estimate of the MSPE which turns out to coincide with Akaike’s FPE. Using a slight variation, we similarly obtain a class of MSPE estimates coinciding with the classical GCV formula for those same LM fitters.
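To make the two criteria named in the abstract concrete, here is a minimal sketch of the classical GCV score for ridge regression and of Akaike’s FPE for full-rank OLS, in their standard textbook forms (the function names and this particular hat-matrix computation are illustrative assumptions, not taken from the paper itself):

```python
import numpy as np

def ridge_gcv(X, y, lam):
    """Classical GCV score for ridge regression with penalty lam.

    GCV(lam) = n * RSS / (n - tr(H))^2, where
    H = X (X'X + lam I)^{-1} X' is the ridge hat matrix.
    """
    n, p = X.shape
    # Ridge hat matrix; for lam = 0 this reduces to the OLS projection.
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    rss = resid @ resid
    edf = np.trace(H)  # effective degrees of freedom
    return n * rss / (n - edf) ** 2

def ols_fpe(X, y):
    """Akaike's FPE for OLS with a full column rank n x p design:
    FPE = (RSS / n) * (n + p) / (n - p)."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    return (rss / n) * (n + p) / (n - p)
```

In practice one would minimize `ridge_gcv` over a grid of `lam` values to select the penalty; at `lam = 0` the trace term equals `p`, recovering the familiar OLS special case of GCV.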