More Realism Please: Introduction to Multiparameter Models


Real-world problems nearly always require statistical models with more than one unknown quantity. However, usually only one, or a few, parameters or predictions are of substantive interest. Our analysis of mercury concentrations in fish tissue provides a simple, but nevertheless typical, example. We may be primarily interested in the population mean of log mercury concentration, but of course we don't really know the value of the population variance σ². Therefore, in a realistic model, we must treat σ² as an unknown parameter along with μ.

Frequentists often refer to unknown parameters that are not of substantive interest as "nuisance parameters." Bayesian statistics provides a sound mathematical framework for handling them and for appropriately quantifying the uncertainty about the parameters of interest that is induced by our lack of knowledge about the other unknown model parameters. Bayesian analysis seeks the posterior marginal distribution of the parameter or parameters of interest: the distribution of those parameters conditional only on the observed data (not on any other unknown parameters). In the example of the normal model, the posterior marginal density of μ is p(μ | y). The general Bayesian approach is to obtain the joint posterior distribution of all unknown quantities in the model and then to integrate out the one(s) in which we are not interested.

As our example, let's reconsider Bayesian analysis of sample data drawn from a normal population, this time realistically admitting that we know neither the population mean nor the population variance. In this case, we will need to specify a joint prior on both of the unknown parameters.
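To make "integrating out" a nuisance parameter concrete, here is a minimal simulation sketch (the book's own examples use R and OpenBUGS; this is an illustrative Python version, and the data values are invented). For concreteness it assumes the standard noninformative prior p(μ, σ²) ∝ 1/σ², under which the joint posterior factors into well-known conjugate pieces. With Monte Carlo draws from the joint posterior p(μ, σ² | y), marginalizing over σ² amounts to simply discarding the σ² coordinate of each draw: what remains are samples from p(μ | y).

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative "observed" log-mercury concentrations (invented data).
y = rng.normal(loc=-0.5, scale=0.4, size=25)
n, ybar = len(y), y.mean()
s2 = y.var(ddof=1)  # sample variance

# Under the noninformative prior p(mu, sigma^2) proportional to 1/sigma^2,
# standard conjugate results give the joint posterior in two stages:
#   sigma^2 | y      ~ Scaled-Inv-chi^2(n - 1, s^2)
#   mu | sigma^2, y  ~ Normal(ybar, sigma^2 / n)
draws = 100_000
sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=draws)
mu = rng.normal(ybar, np.sqrt(sigma2 / n))

# "Integrating out" sigma^2 = keeping only the mu column of the joint draws.
# These mu values are samples from the marginal posterior p(mu | y).
print("posterior mean of mu:", mu.mean())
print("95% credible interval:", np.quantile(mu, [0.025, 0.975]))
```

Note that no explicit integral is ever computed: ignoring a coordinate of joint posterior samples is exactly Monte Carlo marginalization, which is why sampling-based methods handle nuisance parameters so naturally.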

M.K. Cowles, Applied Bayesian Statistics: With R and OpenBUGS Examples, Springer Texts in Statistics 98, DOI 10.1007/978-1-4614-5696-4_7, © Springer Science+Business Media New York 2013


7.1 Conventional Noninformative Prior for a Normal Likelihood with Both Mean and Variance Unknown

Suppose we have no prior information, or that we want our analysis to depend only on the current data. We need to construct a noninformative joint prior density for μ and σ². The standard noninformative prior in this case arises by considering μ and σ² a priori independent. A priori independence may be a reasonable assumption here. It means that if we had prior knowledge about the center of the population distribution (μ), that wouldn't tell us anything about the spread of the population distribution (represented by σ²), and conversely, prior information about the spread wouldn't tell us anything about the center. Recall that, if two random variables are independent, then their joint density is simply the product of their individual marginal densities. Thus, the standard noninformative prior that we are seeking is simply the product of the standard noninformative priors for μ when σ² is assumed known and for σ² when μ is assumed known:

p(μ, σ²) = p(μ) p(σ²) ∝ 1 × (1/σ²) = 1/σ²
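A useful consequence of this prior, which we can check numerically, is that the resulting marginal posterior of μ has a known closed form: (μ − ȳ)/(s/√n) follows a Student-t distribution with n − 1 degrees of freedom. The sketch below (Python rather than the book's R/OpenBUGS; data and sample size are invented) samples the joint posterior in two stages and then compares the standardized μ draws to the t distribution with a Kolmogorov–Smirnov statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative data from a normal population (invented values).
y = rng.normal(loc=-0.5, scale=0.4, size=20)
n, ybar = len(y), y.mean()
s = y.std(ddof=1)  # sample standard deviation

# Joint posterior under p(mu, sigma^2) proportional to 1/sigma^2:
#   sigma^2 | y      ~ Scaled-Inv-chi^2(n - 1, s^2)
#   mu | sigma^2, y  ~ Normal(ybar, sigma^2 / n)
sigma2 = (n - 1) * s**2 / rng.chisquare(n - 1, size=200_000)
mu = rng.normal(ybar, np.sqrt(sigma2 / n))

# Standardized mu draws should match a t distribution with n - 1 df;
# a small KS statistic indicates close agreement.
t_draws = (mu - ybar) / (s / np.sqrt(n))
ks = stats.kstest(t_draws, stats.t(df=n - 1).cdf)
print("KS statistic:", round(ks.statistic, 4))
```

The heavier-than-normal tails of the t marginal are exactly the extra uncertainty about μ induced by not knowing σ², which is the point of treating σ² as an unknown parameter rather than plugging in an estimate.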