Showing 2 results for Kullback-Leibler Risk
Abdolreza Sayyareh, Volume 4, Issue 2 (3-2011)
Abstract
In this paper we establish that, for the Kullback-Leibler divergence, the relative error is superadditive. This shows that a mixture of k rival models gives a better upper bound on the Kullback-Leibler divergence for model selection. In fact, it is shown that the mixture yields a model which is either better than all of the rival models in the mixture, or at least better than the worst rival model in the mixture.
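For orientation, here is a hedged sketch of the standard convexity bound that underlies this kind of mixture argument; the rival densities f_1, ..., f_k and the weights pi_j are our own illustrative notation, not taken from the paper, and the inequality below is the textbook bound rather than the paper's sharper result.

```latex
\[
  D\!\left(h \,\middle\|\, \sum_{j=1}^{k}\pi_j f_j\right)
  \;\le\; \sum_{j=1}^{k}\pi_j\, D(h\,\|\,f_j)
  \;\le\; \max_{1\le j\le k} D(h\,\|\,f_j),
  \qquad \pi_j \ge 0,\ \ \sum_{j=1}^{k}\pi_j = 1,
\]
where $D(h\,\|\,f)=\int h(x)\log\frac{h(x)}{f(x)}\,dx$ is the Kullback--Leibler divergence.
\]
```

The first inequality is convexity of the divergence in its second argument, so a convex mixture of the rivals can never be worse than the worst rival model, which is consistent with the statement in the abstract.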
Ghobad Barmalzan, Abdolreza Sayyareh, Volume 4, Issue 2 (3-2011)
Abstract
Suppose we have a random sample of size n from a population with true density h(.). In general, h(.) is unknown and we use a model f as an approximation of this density function, basing our inference on f. Clearly, f must be close to the true density h for the inference about the population to be valid. Adopting a single model, chosen on the basis of a few observations, as an approximation or estimate of the true density h carries a great risk in model selection. For this reason, we choose k non-nested models and investigate which of them is closer to the true density. In this paper we address a central question of model selection: how can one obtain a collection of appropriate models for estimating the true density function h, based on the Kullback-Leibler risk?
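As a minimal illustration of the idea (not the authors' procedure), note that D(h || f) = E_h[log h] - E_h[log f], so ranking candidate models by their average log-likelihood on the sample ranks them by estimated Kullback-Leibler risk, since the E_h[log h] term is common to all candidates. The sketch below assumes a hypothetical sample and three illustrative non-nested candidates (normal, Laplace, Cauchy); none of these choices come from the paper.

```python
# Hedged sketch: compare non-nested candidate models by estimated KL risk,
# i.e. by their mean log-likelihood on the observed sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_t(df=5, size=500)  # sample from a (here known) true density h

# Three illustrative, non-nested candidate models with simple plug-in fits.
candidates = {
    "normal":  stats.norm(loc=x.mean(), scale=x.std(ddof=1)),
    "laplace": stats.laplace(loc=np.median(x),
                             scale=np.mean(np.abs(x - np.median(x)))),
    "cauchy":  stats.cauchy(loc=np.median(x)),
}

# Higher mean log-likelihood corresponds to smaller estimated KL divergence from h.
scores = {name: np.mean(f.logpdf(x)) for name, f in candidates.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} mean log-likelihood = {score:.4f}")
```

The model with the largest mean log-likelihood is the one closest to h in the Kullback-Leibler sense among the candidates considered; keeping several well-scoring candidates, rather than a single "best" one, reflects the collection-of-models viewpoint described in the abstract.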