Abdolreza Sayyareh, Volume 4, Issue 2 (3-2011)
Abstract
In this paper we establish that, for the Kullback-Leibler divergence, the relative error is superadditive. This shows that a mixture of k rival models gives a better upper bound on the Kullback-Leibler divergence for model selection. In fact, it is shown that the mixture yields a model that is either better than all of the rival models in the mixture, or at least better than the worst rival model in the mixture.
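The claim that a mixture is at least better than the worst rival follows from the convexity of the Kullback-Leibler divergence in its second argument. A minimal numerical sketch, using hypothetical discrete distributions p, q1, q2 chosen purely for illustration (not from the paper):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions
    return float(np.sum(p * np.log(p / q)))

# Hypothetical "true" distribution and two rival models (illustrative only)
p  = np.array([0.5, 0.3, 0.2])
q1 = np.array([0.6, 0.2, 0.2])
q2 = np.array([0.2, 0.5, 0.3])

# Equal-weight mixture of the rivals
mix = 0.5 * q1 + 0.5 * q2

kl1, kl2, kl_mix = kl(p, q1), kl(p, q2), kl(p, mix)

# Convexity of KL(p || .) guarantees the mixture is no worse than the
# weighted average of the rivals, hence no worse than the worst rival:
assert kl_mix <= 0.5 * kl1 + 0.5 * kl2
assert kl_mix <= max(kl1, kl2)
```

In this particular example the mixture in fact beats both rivals, illustrating the stronger of the two outcomes described in the abstract; in general only the weaker bound (better than the worst rival) is guaranteed by convexity.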