The upper limit of this ability

Pondering the capacity of our memory carries with it the risk of being enslaved to the common *metaphor of memory as a static storehouse (Roediger 1980). This misconception should be avoided at the outset. Furthermore, in the case of the nervous system, even the definition itself evokes cardinal issues: What is the meaning of 'store' (definition 1)? Are *internal representations stored as such, reactivated, or reconstructed anew each time they are *retrieved?1 If memory is reconstructed, then the capacity of the system should involve the ability to decompress and recreate information; however, something must eventually be stored, as, clearly, the brain does not reconstruct memories from the void. And as if all this were not enough, it is likely that different memory systems encode information in different ways, possess different capacities, and exploit those capacities to different extents. Having said all this, it is still of interest to wonder whether, in terms of capacity (definition 3), our brain is any match for a notebook computer.

The data are still scarce. The Swiss-German physiologist Haller, who in the eighteenth century performed the first documented experiments on the timing of psychic processes, reached the conclusion that a third of a second is sufficient time for the production of one idea. Hence, assuming only eight mentally useful hours per day (!), in 50 years a person has a chance to collect up to 1,577,880,000 traces (Burnham 1889). More recent (yet not necessarily less controversial) estimates of how much information we perceive during an average lifetime yield the very wide range of ~10¹³-10¹⁷ bits (reviewed in Dudai 1997a; 'bit' is the basic unit in information theory; see *system). In considering the information that becomes available to the brain, we must take into account not only the information that is obtained from the external world, but also the information that is generated endogenously by the brain (*a priori, *internal representation, *stimulus). We do not yet have the basis to estimate the magnitude of the contribution of this type of information to the potential representational pool of the brain. *Modelling of artificial 'neuronal' networks of the estimated size of the human brain yields an upper representational capacity of ~10¹³ (Palm 1982) to ~10¹⁵ bits (Amit 1989). There have also been attempts to estimate the representational capacity of parts of the brain, such as the *cortex (Gochin et al. 1994; Rolls et al. 1997). The conclusion was that the available representational capacity is probably more than required to subserve our actual mental and behavioural repertoire.
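Haller's total is easy to verify with a few lines of arithmetic; the sketch below (Python, used here purely as a calculator) assumes the 365.25-day year that the published figure implies.

```python
# Back-of-the-envelope check of Haller's estimate as reported by
# Burnham (1889): one idea every third of a second (3 per second),
# 8 mentally useful hours per day, over 50 years.
IDEAS_PER_SECOND = 3
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365.25   # the value implied by the published total
YEARS = 50

useful_seconds = YEARS * DAYS_PER_YEAR * HOURS_PER_DAY * 3600
traces = IDEAS_PER_SECOND * useful_seconds
print(f"{traces:,.0f}")  # prints 1,577,880,000
```

The arithmetic reproduces Burnham's figure exactly, which confirms that the round 365.25-day year was used.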

But how much of this information could be stored in our memory over time? Some agreement exists only on the maximal capacity of short-term, or better, *working memory (*phase). The discussion digresses here from the bits of the formal models to vague, almost impressionistic units. The most popular estimate is that our working memory can hold only seven, plus or minus two, chunks of information at any one time. This estimate stems from experiments in psychology (Jacobs 1887; Miller 1956) and from observations in anthropology (Wallace 1961; Berlin 1992).2 Despite the catchy title of Miller's classic article, seven-plus-or-minus-two is not a sacred number. There are lower estimates as well (down to only three separate registers; Broadbent 1975). Miller's idea was not to determine a precise value, but rather to point out that the brain is an information-processing system of limited capacity, which has evolved to recode information into chunks in order to deal with it efficiently (Baddeley 1994; Shiffrin and Nosofsky 1994). Attempts have been made to estimate the size of a chunk in terms of digits, syllables, words, and patterns (Simon 1974). Some individuals develop a remarkable *skill for chunking, and by combining it with efficient *retrieval from long-term stores, can handle huge amounts of information simultaneously (e.g. more than a 10-fold increase in the normal digit span; Chase and Ericsson 1982).

It has been estimated by Simon (1974), on the basis of the contemporary psychological literature, that 5-10 s are needed to transfer a chunk from short- into long-term stores. When it comes to both the maximal and the actual capacity of the latter, the issue of magnitude becomes even more elusive. In what units should long-term memory be measured? Which 'chunks' should be used to estimate the size of, say, an *episodic scene or a motor skill? Furthermore, how can one compare the capacity of different long-term memory systems? A variety of experimental methods have been deployed, ranging from introspection (Galton 1879), via controlled recall of personal experience (Wagenaar 1986), to measurement of *real-life capabilities such as picture *recognition, language, or the feats of *mnemonists (Table 1). There are no definite answers, only estimates expressed in ad hoc, somewhat fuzzy units. A conservative estimate is that a normal human long-term memory retains ~10⁵-10⁶ items, where an item means a word, a fact, an autobiographical episode, or what might intuitively be

Table 1 Estimates of the actual capacity of selected human long-term memory stores

Store                                        Size              Reference
Words in language (mother tongue)            25,000-50,000     Nagy and Anderson (1984)
Pictures recognized                          > 10,000          Standing (1973)
Game patterns by a chess master              10,000-100,000    Chase and Simon (1973)
Facts by mnemonists                          100,000           Yates (1966)
Core personal episodes                       Thousands         Dudai (1997a)
Items in expert databases in                 500-2,000         Levi-Strauss (1966);
  orally-reliant societies                                       Berlin (1992)

called a unit of memory, but formally is very unsatisfactory indeed (Dudai 1997).
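To get a feel for what Simon's rate of one chunk per 5-10 s would imply, the rough calculation below combines it with an assumed lifespan of 70 years and 16 waking hours per day; both figures are illustrative assumptions, and sustained attention is of course never this continuous. Even so, the resulting ceiling, on the order of 10⁸ chunks, comfortably exceeds the conservative 10⁵-10⁶ item estimate.

```python
# Crude upper bound on lifetime long-term acquisition implied by
# Simon's (1974) transfer rate of one chunk per 5-10 s. The lifespan
# and waking-hours figures are illustrative assumptions only.
SECONDS_PER_CHUNK = (5, 10)    # Simon's estimated range
WAKING_HOURS_PER_DAY = 16      # assumption
YEARS = 70                     # assumption

waking_seconds = YEARS * 365.25 * WAKING_HOURS_PER_DAY * 3600
bounds = [waking_seconds / s for s in SECONDS_PER_CHUNK]
print([f"{b:.1e}" for b in bounds])  # ['2.9e+08', '1.5e+08'] chunks
```

The three-orders-of-magnitude gap between this ceiling and the conservative estimate is itself informative: whatever limits long-term memory, it is evidently not raw transfer time.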

The capacity of brains and memory systems is no doubt of interest, but it would do no harm to scrutinize the assumptions that underlie this interest. One assumption, which is definitely wrong as a *generalization, is that the bigger, the better. The capacity of a memory system is the outcome of the interplay among multiple drives and elements. These include the functions that the memory system is supposed to accomplish; the mechanistic constraints imposed by the biological machinery; the feasibility of *algorithms; the energy resources that are required to *develop and operate the system; and, finally, the current stage in the evolution of the system. Here is but one concrete example: is it phylogenetically advantageous for the system of *declarative, autobiographical memory to have a large capacity? Not necessarily (see *false memory).

It would be naive to expect real advances in the estimation of memory capacity before two developments materialize. First, we must decipher the codes of internal representations, in order to be better equipped to estimate the requirements for representational and computational space in the brain. Second, we must gain a much better understanding of the processes and mechanisms of *persistence, *forgetting, relearning in *extinction, and, particularly, retrieval of memory. Retrieval that tolerates liberal reconstruction of internal representations, and is heavily dependent on online information, is expected to place different demands on capacity than retrieval that involves faithful reactivation of fine-grained stored information. The issue of capacity is hence intimately associated with some of the most profound *enigmas of memory research.

Selected associations: Episodic memory, Internal representation, Persistence, Working memory

1This issue is further discussed in *persistence.
2By the way, the working-memory capacity of the chimpanzee is not much smaller: >5 items, the same as preschool children (Kawai and Matsuzawa 2000).
