Tuesday, August 14, 2018


"Uncertainty has a technical meaning based on information theory (Shannon & Weaver, 1949). Information theory isn't a theory with a specific content or subject matter (Frick, 1959; Garner, 1962). It's probably better described as a perspective, a way of thinking about the nature and structure of information. As noted by Berlyne (1965): "A certain degree of uncertainty is said to exist when (1) any number of alternative events can occur, (2) there is no knowing in advance which will occur at a particular time, and (3) each alternative occurs with a specifiable relative frequency or probability". Information theory specifies uncertainty as U = −Σᵢ pᵢ log₂ pᵢ, in which pᵢ is the probability that event i will occur (Attneave, 1959). According to this formula, uncertainty has some interesting properties. First, uncertainty increases as the number of alternatives increases, all else equal, because uncertainty is a sum across alternatives. For example, an election with five candidates is more uncertain than an election with two candidates. Second, uncertainty increases as the alternative events become equally probable. An election is more uncertain when all five candidates have an equal chance of winning and less uncertain when one candidate is the clear favorite. Combining these two properties, we see that uncertainty approaches a psychological maximum when a large number of alternatives are equally likely."

Paul Silvia, Exploring the Psychology of Interest
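The two properties Silvia describes can be checked directly with Shannon's entropy formula. Below is a minimal sketch in Python (the `entropy` function and the election probabilities are illustrative assumptions, not from the book) comparing the two-candidate race, the five-way toss-up, and the race with a clear favorite:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: U = -sum(p_i * log2(p_i)).
    Zero-probability events contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Property 1: more alternatives means more uncertainty, all else equal.
two_candidates = entropy([0.5, 0.5])    # 1 bit
five_candidates = entropy([0.2] * 5)    # log2(5) ~ 2.32 bits

# Property 2: uncertainty peaks when alternatives are equally probable.
# A clear favorite lowers the entropy even with five candidates.
clear_favorite = entropy([0.8, 0.05, 0.05, 0.05, 0.05])  # ~1.12 bits
```

With equal probabilities the formula reduces to log₂(n), so uncertainty grows with the number of alternatives n; any departure from equiprobability pulls the sum below that maximum, matching the "clear favorite" intuition.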
