ENTROPY PRINCIPLE
C.E. Shannon's seminal discovery [7] (1948) of his entropy measure in connection with communication theory has found useful applications in several other probabilistic systems. E.T. Jaynes further extended its scope by discovering the maximum entropy principle (MaxEnt) [1] (1957), which is inherent in the process of optimizing the entropy measure when some incomplete information is given about a system in the form of moment constraints. MaxEnt has, over the past four decades, given rise to an interdisciplinary methodology for the formulation and solution of a large class of probabilistic systems. Furthermore, MaxEnt's natural kinship with Bayesian methods of analysis has further bolstered its importance as a viable tool for statistical inference.

Entropy and Uncertainty.

The word entropy first originated in the discipline of thermodynamics, but Shannon entropy has a much broader meaning since it deals with the more pervasive concept of information. The word entropy itself has now crept into common usage to mean the transformation of a quantity, or phenomenon, from order to disorder. This implies an irreversible rise in uncertainty. In fact, the word uncertainty would have been more unambiguous as to its intended meaning in the context of information theory, but for historical reasons the usage of the word entropy has come to stay in the literature. Uncertainty arises both in probabilistic phenomena, such as the tossing of a coin, and, equally well, in deterministic phenomena where we know
that the outcome is not a chance event, but we are merely fuzzy about the possibility of the specific outcome. What is germane to our study of MaxEnt is only probabilistic uncertainty. The concept of probability used in this context is what is generally known as the subjective interpretation, as distinct from the objective interpretation based on the frequency of outcomes of an event. The subjective notion of probability considers a probability distribution as representing a state of knowledge, and hence it is observer dependent. The underlying basis for an initial probability assignment is given by Laplace's principle of insufficient reason. According to this principle, if in an experiment with n possible outcomes we have no information except that each probability p_i ≥ 0 and ∑_{i=1}^{n} p_i = 1, then the most unbiased choice is the uniform distribution (1/n, ..., 1/n). Laplace's principle underscores the choice of maximum uncertainty based on logical reasoning alone.

Why Choose Maximum Uncertainty?

We shall now consider the example of a die in order to highlight the importance of maximum uncertainty, as a preamble to our later discussion of MaxEnt. When the only information available is that the die has six faces, the uniform probability distribution (1/6, ..., 1/6), satisfying the natural constraint

    ∑_{i=1}^{6} p_i = 1,    p_1 ≥ 0, ..., p_6 ≥ 0,    (1)

represents the maximum uncertainty.
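The claim that the uniform distribution carries maximum uncertainty can be checked numerically with Shannon's entropy measure H(p) = -∑ p_i log p_i. The following is a minimal sketch (the function name and the particular biased distribution are illustrative choices, not from the original text):

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), with 0*log(0) taken as 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Among all distributions on six outcomes, the uniform one attains the
# maximum value log2(6); any departure from uniformity lowers the entropy.
uniform = [1 / 6] * 6
biased = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))  # log2(6) ≈ 2.585 bits
print(shannon_entropy(biased))   # strictly smaller
```

Any concentration of probability mass on some outcomes, at the expense of others, strictly decreases H, which is why the uniform distribution is the unique maximizer under constraint (1) alone.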
If, in addition, we are also given the mean number of points on the die, that is, if

    ∑_{i=1}^{6} i p_i = m

for some prescribed value m, then the most unbiased distribution is, in general, no longer uniform and must be determined by maximizing the entropy subject to this additional moment constraint.
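The die problem with a prescribed mean can be solved numerically. The standard Lagrangian treatment of MaxEnt yields a solution of exponential form, p_i ∝ exp(λ i); the sketch below (function name, search interval, and iteration count are my own choices) bisects on λ until the implied mean matches the given value:

```python
import math

def maxent_die(m, iters=200):
    """Maximum-entropy distribution over die faces 1..6 whose mean is m.

    The MaxEnt solution has the form p_i ∝ exp(lam * i); we bisect on
    lam until the mean of the resulting distribution equals m.
    """
    faces = range(1, 7)

    def mean_of(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # mean_of is strictly increasing in lam
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) < m:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# With m = 3.5 the answer is the uniform distribution; with m = 4.5
# (Jaynes' well-known example) mass shifts monotonically toward the
# higher faces.
p = maxent_die(4.5)
```

For m = 3.5 the method recovers (1/6, ..., 1/6), consistent with Laplace's principle; for any other feasible mean in (1, 6) it produces the least-biased distribution compatible with that single moment constraint.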