What finite-additivity can add to decision theory
Mark J. Schervish¹ · Teddy Seidenfeld² · Rafael B. Stern³ · Joseph B. Kadane¹

Accepted: 9 August 2019
© Springer-Verlag GmbH Germany, part of Springer Nature 2019
Abstract
We examine general decision problems with loss functions that are bounded below. We allow the loss function to assume the value ∞. No other assumptions are made about the action space, the types of data available, the types of non-randomized decision rules allowed, or the parameter space. By allowing prior distributions and the randomizations in randomized rules to be finitely-additive, we prove very general complete class and minimax theorems. Specifically, under the sole assumption that the loss function is bounded below, we show that every decision problem has a minimal complete class and that all admissible rules are Bayes rules. We also show that every decision problem has a minimax rule and a least-favorable distribution, and that every minimax rule is Bayes with respect to the least-favorable distribution. Some special care is required to deal properly with infinite-valued risk functions and integrals taking infinite values.

Keywords Admissible rule · Bayes rule · Complete class · Least-favorable distribution · Minimax rule

Mathematics Subject Classification Primary 62C07; Secondary 62C20
Electronic supplementary material The online version of this article (https://doi.org/10.1007/s10260-019-00486-6) contains supplementary material, which is available to authorized users.
Mark J. Schervish (corresponding author): [email protected]
Teddy Seidenfeld: [email protected]
Rafael B. Stern: [email protected]
Joseph B. Kadane: [email protected]
1 Department of Statistics, Carnegie Mellon University, Pittsburgh, PA 15213, USA
2 Departments of Philosophy and Statistics, Carnegie Mellon University, Pittsburgh, PA 15213, USA
3 Statistics Department, Federal University of São Carlos, São Carlos 13565-905, Brazil
1 Introduction

1.1 Motivation

The following example, adapted from Example 3 of Schervish et al. (2009), is a case in which countably-additive randomized rules do not contain a minimal complete class. It involves a discontinuous version of squared-error loss in which a penalty is added if the prediction and the event being predicted are on opposite sides of a critical cutoff.

Example 1 A decision maker is going to offer predictions for an event B and its complement. The parameter space is Θ = {B, B^C}, while the action space is A = [0, 1]^2, the set of pairs of probability predictions. The decision maker suffers the sum of two losses (one for each prediction), each of which equals the usual squared-error loss (the square of the difference between the indicator of the event and the corresponding prediction) plus a penalty of 0.5 if the prediction is on the opposite side of 1/2 from the indicator of the event. In symbols, the loss function is

L(\theta, (a_1, a_2)) = (I_B - a_1)^2 + (I_{B^C} - a_2)^2 + \frac{1}{2} \begin{cases} I_{[0,1/2]}(a_1) + I_{(1/2,1]}(a_2) & \text{if } \theta = B, \\ I_{(1/2,1]}(a_1) + I_{[0,1/2]}(a_2) & \text{if } \theta = B^C. \end{cases}

To keep matters simple, we ass
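As a concrete illustration, the loss in Example 1 can be sketched in a few lines of Python. This is an illustrative sketch, not code from the paper; the function name `loss` and the string encoding of θ are our own assumptions. Each squared-error term is paired with a 0.5 penalty whenever the corresponding prediction lies on the opposite side of 1/2 from the indicator of the event.

```python
def loss(theta, a1, a2):
    """Loss of Example 1 (illustrative sketch).

    theta: "B" or "Bc", the true state.
    a1: probability prediction for B; a2: prediction for B^C.
    Returns squared-error loss on both predictions plus a 0.5 penalty
    for each prediction on the opposite side of 1/2 from its indicator.
    """
    i_b = 1.0 if theta == "B" else 0.0   # indicator of B
    i_bc = 1.0 - i_b                     # indicator of B^C
    squared_error = (i_b - a1) ** 2 + (i_bc - a2) ** 2
    if theta == "B":
        # Indicator of B is 1, so a1 <= 1/2 is the "wrong side";
        # indicator of B^C is 0, so a2 > 1/2 is the "wrong side".
        penalty = 0.5 * (int(a1 <= 0.5) + int(a2 > 0.5))
    else:
        penalty = 0.5 * (int(a1 > 0.5) + int(a2 <= 0.5))
    return squared_error + penalty

# Predicting (0.9, 0.1) when theta = B avoids both penalties (loss ≈ 0.02),
# while (0.5, 0.5) puts a1 on the wrong side of the cutoff, so the 0.5
# penalty makes the loss jump to 1.0 despite a1 moving only slightly.
print(loss("B", 0.9, 0.1))
print(loss("B", 0.5, 0.5))
```

The discontinuity at a1 = 1/2 is exactly what the example exploits: an arbitrarily small change in the prediction can change the loss by 0.5.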