Technical Report 702

A Generalized Predictive Criterion for Model Selection

Mario Trottini and Fulvio Spezzaferri

Abstract:

Given a sample of $n$ i.i.d. observations generated from some unknown distribution $F$, assume that $F$ belongs to one of two parametric models, $M_1$ or $M_2$, and that the estimation of the density of a future observation is of interest. For this problem, San Martini and Spezzaferri (1984) proposed a predictive criterion based on the Kullback-Leibler divergence between the estimated density and the density of the future observation. In this paper we present a generalization of this criterion, both by using a general class of divergences and by relaxing the assumption that the true distribution $F$ belongs to $M_1$ or $M_2$.
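For reference, the $\alpha$-divergence family mentioned in the keywords admits several parametrizations; the display below records one standard form as a sketch (this particular parametrization is an assumption for illustration and is not necessarily the one adopted in the paper). It recovers the Kullback-Leibler divergence as a limiting case.

% Assumed parametrization of the alpha-divergence between densities f and g (for illustration only).
\[
  D_{\alpha}(f \,\|\, g) \;=\; \frac{1}{\alpha(1-\alpha)}
  \left( 1 - \int f(x)^{\alpha}\, g(x)^{1-\alpha}\, dx \right),
  \qquad \alpha \in (0,1),
\]
\[
  \lim_{\alpha \to 1} D_{\alpha}(f \,\|\, g)
  \;=\; \int f(x) \log \frac{f(x)}{g(x)}\, dx
  \;=\; \mathrm{KL}(f \,\|\, g).
\]

Under this parametrization the family is symmetric in the sense that the limit $\alpha \to 0$ yields $\mathrm{KL}(g \,\|\, f)$, so both orientations of the Kullback-Leibler divergence arise as boundary cases.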



Keywords: Model selection, Gaussian process, Loss function, $\alpha$-divergences.


