OUTLOOK
Perception and the metaphysics of information

Jérémie Jozefowiez
© The Psychonomic Society, Inc. 2019
Summary Sims (Science, 360, 652-656, 2018) aimed to apply tools from information theory (rate-distortion theory) to perception. Here I provide an overview of rate-distortion theory and the way Sims applies it to psychological issues before briefly discussing the implications of such applications.

Keywords Perception · Information theory · Rate-distortion theory

Though cognitivists view the mind as an information-processing device, applications of information theory to psychological issues are extremely rare. The target of this Outlook article (Sims, 2018) attempts to prove the relevance of information theory.
Rate-distortion theory and its application to psychology: A primer

Fundamental concepts

Sims' approach (Sims, 2016, is a better introduction) relies on rate-distortion theory, which mixes information theory with decision theory. Imagine a source S emitting a signal x ∈ X.¹ Let P(x) be the probability that a signal whose value is x is emitted. −log[P(x)] measures how surprising x is: it tends toward 0 when P(x) tends toward 1 (highly probable events are not surprising) and toward +∞ when P(x) tends toward 0 (highly improbable events are extremely surprising). The entropy H(S) of P(x) measures how surprising the signal is on average:

$$H(S) = -\sum_{x \in X} P(x)\,\log[P(x)]$$

Suppose that you know that a signal has been emitted but you do not know its value. On average, your level of surprise when informed of it should be H(S). Now suppose you are informed that, in response to the unknown signal, a channel has emitted an output y ∈ Y. Let H(S|O) be the entropy of the source conditional upon the fact that you have been informed of the channel output:

$$H(S|O) = -\sum_{y \in Y} P(y) \sum_{x \in X} P(x|y)\,\log[P(x|y)]$$
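To make these definitions concrete, here is a minimal Python sketch (not from the article; the joint distribution P(x, y) is invented for illustration) that computes H(S) and H(S|O):

```python
import numpy as np

# Hypothetical joint distribution P(x, y) over a 3-value source X
# and a 3-value channel output Y (rows: x, columns: y).
# These numbers are illustrative only; they are not from the article.
P_xy = np.array([
    [0.20, 0.05, 0.00],
    [0.05, 0.25, 0.05],
    [0.00, 0.05, 0.35],
])

P_x = P_xy.sum(axis=1)   # marginal P(x)
P_y = P_xy.sum(axis=0)   # marginal P(y)

def entropy(p):
    """H = -sum p log2(p), skipping zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(S): how surprising the signal is on average.
H_S = entropy(P_x)

# H(S|O) = -sum_y P(y) sum_x P(x|y) log P(x|y)
H_S_given_O = 0.0
for j, py in enumerate(P_y):
    if py > 0:
        p_x_given_y = P_xy[:, j] / py   # P(x|y) for this output y
        H_S_given_O += py * entropy(p_x_given_y)

print(f"H(S)   = {H_S:.3f} bits")
print(f"H(S|O) = {H_S_given_O:.3f} bits")  # smaller: knowing y reduces uncertainty
```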
If there is any relation between the value of the signal and the output of the channel, knowing y should reduce your uncertainty regarding x. If so, when informed of the value of x, you should not be as surprised as if you had not been informed of the channel output: H(x) > H(x|y). The average amount of information the channel output brings you about x is

$$I(x; y) = H(x) - H(x|y)$$
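The two limiting cases described next can be checked numerically. This sketch (again with invented numbers) computes I(x; y) = H(x) − H(x|y) for an independent channel and for a noiseless one:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(P_xy):
    """I(x; y) = H(x) - H(x|y), computed from a joint distribution."""
    P_x, P_y = P_xy.sum(axis=1), P_xy.sum(axis=0)
    H_x = entropy(P_x)
    H_x_given_y = sum(
        py * entropy(P_xy[:, j] / py) for j, py in enumerate(P_y) if py > 0
    )
    return H_x - H_x_given_y

p_x = np.array([0.5, 0.3, 0.2])

# Independent channel: P(x, y) = P(x)P(y), so I(x; y) = 0.
independent = np.outer(p_x, p_x)
print(mutual_information(independent))                # ~0 bits

# Noiseless channel: each x maps to a unique y, so I(x; y) = H(x).
noiseless = np.diag(p_x)
print(mutual_information(noiseless), entropy(p_x))    # the two values match
```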
If x and y are independent of each other, H(x) = H(x|y), and I(x; y) = 0. The best-case scenario is when only one value of y corresponds to each possible value of x: knowing y is as good as knowing x. In that case, H(x|y) = 0 and so I(x; y) = H(x). There is an upper limit C to the amount of information a channel can transmit. Hence, a channel with an information capacity C will not be able to perfectly transmit information about a source whose entropy H(x) is higher than C. These are the basic concepts of information theory. What rate-distortion theory adds is the idea of a cost function, L(x, y), expressing the cost incurred when the channel emits response y while the source has emitted signal x. The average cost E(L) is
$$E(L) = \sum_{x \in X} \sum_{y \in Y} P(x, y)\,L(x, y)$$

¹ For convenience, I will assume that X and Y are discrete sets, but the framework also applies to continuous variables.
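As a worked illustration of the cost function (with invented numbers, and assuming the standard expectation over the joint distribution), the sketch below computes E(L) for a Hamming-style cost that charges 1 whenever the channel output differs from the signal:

```python
import numpy as np

# Illustrative joint distribution P(x, y) (not from the article): a noisy
# channel that usually, but not always, reproduces the source signal.
P_xy = np.array([
    [0.35, 0.05, 0.00],
    [0.05, 0.25, 0.05],
    [0.00, 0.05, 0.20],
])

# A simple cost function L(x, y): 0 when the channel output matches the
# signal, 1 otherwise (Hamming distortion, one common choice).
L = 1.0 - np.eye(3)

# E(L) = sum over x and y of P(x, y) * L(x, y)
expected_cost = np.sum(P_xy * L)
print(f"E(L) = {expected_cost:.2f}")   # here, the channel's error probability
```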