- Ohnishi Toshio (Faculty of Economics, Kyushu University)
- Yanagimoto Takemi (Department of Industrial and Systems Engineering, Chuo University)
Abstract
Two Bayesian prediction problems in the context of model averaging are investigated by adopting dual Kullback-Leibler divergence losses, the e-divergence and the m-divergence losses. The optimal predictors under the two losses are shown to satisfy interesting saddlepoint-type equalities: the optimal predictor under the e-divergence loss balances the log-likelihood ratio against the loss, while the optimal predictor under the m-divergence loss balances the Shannon entropy difference against the loss. These equalities also hold for the predictors maximizing the log-likelihood and the Shannon entropy under the e-divergence loss and the m-divergence loss, respectively, showing that moderately enlarging the log-likelihood and the Shannon entropy leads to the optimal predictors. For each divergence loss we derive, by minimizing a certain convex function, a predictor that is robust in the sense that its posterior risk is constant. The Legendre transformation induced by this convex function implies that an inherent duality underlies each Bayesian prediction problem.
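For orientation, here is a minimal sketch of the two dual losses, assuming the standard Kullback-Leibler form and the usual information-geometric e/m naming convention; the paper's own definitions may differ in detail. For densities $p$ and $q$,
\[
  D(p \,\|\, q) \;=\; \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,
\]
and for a predictive density $\delta$ targeting $p$, the e-divergence loss takes the form $D(\delta \,\|\, p)$ while the m-divergence loss takes the form $D(p \,\|\, \delta)$. The two losses differ only in the direction of the (asymmetric) divergence, which is why the resulting Bayesian prediction problems, and their optimal predictors, are genuinely distinct yet dual to each other.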
Journal
- JOURNAL OF THE JAPAN STATISTICAL SOCIETY 43 (1), 29-55, 2013
- Japan Statistical Society
Details

- CRID: 1390001205286530688
- NII Article ID: 10031185799
- NII Bibliographic ID: AA1105098X
- ISSN: 1348-6365, 1882-2754
- MRID: 3154717
- NDL Bibliographic ID: 024763350
- Text Language Code: en
- Data Sources: JaLC, NDL, Crossref, CiNii Articles, KAKEN
- Abstract License Flag: Disallowed