Selecting a method and following it is like choosing a road and embarking on a journey (this is indeed the origin of the word: meta-, Greek for beyond, after, or along with, and hodos, way or journey). The road may guide the traveller to the goal, but at the same time it imposes *a priori limits on the terrain travelled and on the expected outcome. 'Methods' are fundamental to human cognition and action (definition 1; e.g. Newell and Simon 1972). Here we shall confine ourselves to scientific methods (included in definition 2, which is an elaboration of definition 1), and particularly to methods used in the experimental sciences. Discussion of the methodology of science at large far exceeds the scope of this treatment (Nagel 1979; Bechtel 1988). We shall therefore identify only a few basic notions that are helpful in the discussion, classification, and evaluation of research.
Despite the fact that experimental science is wonderfully rich and heterogeneous, the number of elementary types of methods, or better 'metamethods', is modest indeed (Dudai 1993). The mythological muses of art and intellectual pursuit, daughters of Zeus and Mnemosyne (*mnemonics), were nine in number (Hesiod 8C bc); the practical muses of experimental science, daughters of logic, are fewer, but still as inspiring as the mythological ones. The core experimental methods are observation, intervention, and simulation. To these one should add complementary analytical methods, mostly comparison and correlation. The data that all these methods generate are further analysed and manipulated in various ways to yield heuristic hypotheses, conceptual frameworks, attitudes, and *paradigms, which temporarily govern mainstream knowledge in a given domain of science. If influential, these hypotheses, concepts, and paradigms come to sustain and cultivate recurrent rounds of observation, intervention, modelling, etc. A scientific career, a smashing discovery, or an entire scientific discipline may start at any point along this never-ending methodological ritual. The above description is surely simplistic from the point of view of the history and philosophy of science, but it does encapsulate the essence of scientific practice and *culture. The 'metamethods' are shared by different scientific disciplines. In addition, distinct disciplines have their own sets of domain-specific methods, which transform the general methods into specific procedures and techniques.
So let's encounter the metamethods, one by one. First come the core experimental methods.
1. Observation. This is the most fundamental of all the experimental methods, clearly preceding modern science. 'Observation' sometimes connotes non-controlled as opposed to controlled experimental situations (e.g. Freedman et al. 1998). This is not the intention here. Careful observation in the course of controlled experiments can yield highly valuable and sometimes ground-breaking discoveries. Unfortunately, the good old practice of observation, which requires ample patience, openness, and experience, tends nowadays to be neglected by too many hyperactive investigators in their rush for tenure (*scoopophobia). Heuristic classifications, longer-lived *taxonomies, and signal hypotheses may emanate from smart observations (Hodgkin et al. 1977; for a notable example, see Darwin 1871). However, to contribute usefully to a scientific discipline, observations must usually be followed by the additional methods of intervention and simulation, to test hypotheses and generate *models. (On a special type of 'observation', 'introspection', which occupied a prominent position in the methodology of the early days of memory research, see *behaviourism.)
2. Intervention. This is a very popular type of research method, adored in *reductive research programmes. The aim of interventional methods is to infer function from dysfunction or hyperfunction. Intervention is hence the *classic generic type of scientific experiment, involving active interference with nature to see what will happen. In the brain sciences, the approaches used include interference with perception or with behavioural *performance; perturbation of metabolic cascades (e.g. *intracellular signal transduction cascades) using drugs or mutations; perturbation of the electrical activity of neurons and neuronal circuits; and anatomical lesions. Concrete examples are to be found in many entries in this book, e.g. *amnesia, *consolidation, and *neurogenetics, to cite merely a few. For selected methodological issues related to the use and misuse of interventional studies, see Bechtel 1982; Glassman 1978.
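The interventional logic of inferring function from dysfunction can be sketched in a deliberately toy form. The following Python fragment is purely illustrative (the 'circuit', its weights, and its numbers are hypothetical, drawn from no study): silencing one pathway of a miniature weighted-sum network and measuring the resulting deficit hints at that pathway's functional contribution.

```python
# Toy illustration (hypothetical): infer function from dysfunction.
# A minimal "circuit" sums weighted inputs; "lesioning" one pathway
# and observing the output deficit suggests its functional role.

def circuit_output(inputs, weights, lesioned=None):
    """Weighted-sum response; a lesioned pathway contributes nothing."""
    return sum(w * x for i, (x, w) in enumerate(zip(inputs, weights))
               if i != lesioned)

inputs = [1.0, 1.0, 1.0]          # activity on three afferent pathways
weights = [0.5, 0.25, 0.125]      # hypothetical connection strengths

intact = circuit_output(inputs, weights)            # 0.875
contributions = {i: intact - circuit_output(inputs, weights, lesioned=i)
                 for i in range(len(inputs))}
# The largest deficit points to the pathway with the largest
# functional contribution -- in this toy model, pathway 0.
print(contributions)  # prints {0: 0.5, 1: 0.25, 2: 0.125}
```

The caveat raised by the references above applies even to this caricature: a deficit shows that the lesioned element was necessary for the intact output, not that it is where the function 'resides'.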
3. Simulation. This type of method attempts to imitate or simulate natural phenomena, processes, or candidate mechanisms, in order to verify assumptions concerning structure and function, test models, predict performance, and generate new hypotheses. Simulation experiments come in two flavours, experimental and theoretical. In the experimental approach, phenomena, processes, or mechanisms are imitated or simulated by discrete experimental manipulations in situ. Examples include substitution of a *percept by electrical activity in the brain (Loucas 1936; Romo et al. 2000); of a conditioned stimulus (*classical conditioning) by *long-term potentiation (LTP; Skelton et al. 1985); of other types of learned input by identified molecular agents (Acosta-Urquidi et al. 1984; Kaba et al. 1994); and of *consolidation by the activation of *CREB (Yin et al. 1995). Some of these manipulations could also be considered 'interventions' (see 2 above), but their aim is specifically to imitate or simulate candidate biological processes and mechanisms of learning, in order to prove their postulated physiological role. The theoretical use of this approach includes various types of simulations aimed at imitating as well as testing the natural phenomenon or parts of it (*algorithm, *model). In the near future we should expect to see more and more attempts to mimic or simulate living organisms, including their learning and memory capabilities, by electronic devices, creating 'in silico' in addition to 'in vivo' *systems (Normile 1999; *enigma). Simulation experiments could also involve 'thought experiments' ('Gedanken experiments'); these are controlled speculations, in which the entire 'experimental' manipulation is carried out in the imagination of the experimenter rather than on the bench or in the field (Sorensen 1992). Fruitful thought experiments, however, may require a more robust theoretical infrastructure than is currently available in the neuroscience of memory.
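A minimal example of the theoretical flavour of simulation is the delta-rule update in the spirit of Rescorla-Wagner models of conditioning, which aims to reproduce the negatively accelerated growth of associative strength over trials. The sketch below uses hypothetical parameter values throughout; it is a caricature of how a simulated curve can be compared against the qualitative shape of real acquisition data.

```python
# A minimal theoretical simulation (hypothetical parameters): a
# Rescorla-Wagner-style delta rule for the growth of associative
# strength V over conditioning trials.

def simulate_acquisition(trials, alpha=0.3, lam=1.0):
    """Return associative strength V after each trial.

    alpha: learning-rate parameter (hypothetical value)
    lam:   asymptote supported by the reinforcer
    """
    v, history = 0.0, []
    for _ in range(trials):
        v += alpha * (lam - v)   # update proportional to prediction error
        history.append(v)
    return history

curve = simulate_acquisition(10)
# The simulated curve rises monotonically and approaches the asymptote
# lam from below -- the qualitative shape such models aim to reproduce.
```

Whether such a simulation 'tests' the natural phenomenon depends, of course, on how tightly its assumptions are anchored in the biology, which is precisely the issue raised under *model.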
All the above experimental methods are complemented and augmented by several analytical methods (for a *classic treatment, see Mill 1884). Two types are comparison and correlation (additional ones, related to hypothesizing, exceed the scope of this brief pragmatic discussion).
4. Comparison. Comparing sets of observations of the same variable under different conditions (e.g. memory as a function of *synaptic activity, drug treatment, age, or *context) can illuminate the workings of processes and mechanisms in the system. This requires selection of appropriate variables, units of quantification, experimental design (including controls), and statistics (e.g. Martin and Bateson 1993; Freedman et al. 1998; Kerlinger and Lee 2000). Quantification, which is essential not only for scientific comparison but also for the other methods, deserves a special comment. The introduction of quantifiable variables is usually taken to mark the transformation of a field of interest into a scientific discipline. Sometimes the ingenuity of the forefathers of a scientific discipline lies not in identifying the important questions, but rather in identifying or devising the variables that can be quantified and used to address these questions. For example, Ebbinghaus (1885) was the first to make it possible to quantify reproducibly the *capacity and stability of human memory, by introducing retention of nonsense syllables as a measured quantity (see also Jacobs 1887; for the first quantitative measures of animal memory, see Boakes 1984; Gorfein and Hoffman 1987).1
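In the Ebbinghaus spirit, a quantified comparison reduces to measured scores, group summaries, and some index of the difference relative to the spread. The fragment below is a sketch only: the retention scores, group labels, and the crude effect-size index are all hypothetical, and real inference would of course require the design, controls, and statistics cited above.

```python
# Sketch of a quantified comparison (all data hypothetical): retention
# scores (fraction of nonsense syllables recalled) under two conditions.
import statistics

control = [0.62, 0.58, 0.71, 0.65, 0.60]   # hypothetical scores
treated = [0.48, 0.52, 0.45, 0.55, 0.50]   # hypothetical scores

mean_c = statistics.mean(control)
mean_t = statistics.mean(treated)

# A crude standardized index: difference in means relative to the
# overall spread (not a substitute for a proper statistical test).
pooled_sd = statistics.stdev(control + treated)
effect = (mean_c - mean_t) / pooled_sd
```

The point of the sketch is not the arithmetic but the methodological move: once retention is a measured quantity, conditions become comparable at all.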
5. Correlation. Here a natural or manipulated phenomenon is correlated, in time or space, with other phenomena at the same or another *level of analysis, in order to identify links among phenomena. In an important subtype of correlative experiments, performed as part of reductive memory research programmes, behavioural phenomena are correlated with neuronal *plasticity. Selected examples are the correlation of *fear conditioning with *amygdalar *LTP (Rogan et al. 1997), or the correlation of learning with neurogenesis in the *hippocampus (Gould et al. 1999a). In such cases, the aim is to pinpoint cross-level mechanistic interdependency. On the general problematics of attempts to conclude causality from correlations, see Irzik (1996); see also the pitfalls of post-hoc argumentation in *criteria.

It is easy to notice the affinity of the above 'metamethods' to the *criteria used to assess the contribution of experimental data to the resolution of a given research problem. Observations correlate phenomena, identify similarity, and hint at necessity; interventions identify necessity; and simulations yield information on similarity, usefulness, sufficiency, and even exclusiveness. In general, whereas the methods provide us with the knowledge, the criteria tell us about the relevance of this knowledge to the question posed.
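The cross-level correlational subtype described under 5 above can be sketched with the Pearson product-moment coefficient, computed here from scratch over entirely hypothetical paired measurements (a behavioural score and a physiological one). The closing comment restates the caveat from the text: a correlation identifies a link, not a cause.

```python
# Correlational sketch (hypothetical data): a behavioural measure paired
# with a physiological one, e.g. learning score vs. degree of potentiation.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

learning = [1.0, 2.0, 3.0, 4.0, 5.0]       # hypothetical scores
potentiation = [0.9, 2.1, 2.9, 4.2, 4.9]   # hypothetical measures

r = pearson_r(learning, potentiation)
# A high r identifies a link across levels of analysis, but by itself
# it cannot establish that either phenomenon causes the other.
```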
The field of memory research is equipped with its own special repertoire of methods. These are exemplified in *classical conditioning, *cue revaluation, *delay task, *fear conditioning, *habituation, *instrumental conditioning, *LTP, *maze, *priming, *real-life memory, *sensitization, *transfer, and *working memory. But brain research in general is in a special situation. It is a truly multidisciplinary enterprise: the more it advances, the quicker it is to incorporate knowledge and methods from a great variety of other disciplines. These range from molecular, cellular, and *developmental biology, via physiology and anatomy, clinical neurology and *neuroimaging, psychology, and ethology, to computational science and information theory (Dudai 1989; Martin and Bateson 1993; Baddeley 1997; Manning and Dawkins 1998; Zigmond et al. 1999; Kandel et al. 2000). Not surprisingly, a recent textbook in the neurosciences is authored by no less than 150 experts in different fields and subfields, methods, and techniques, while still leaving some important topics and issues untouched (Zigmond et al. 1999).
Selected associations: *Control, *Criterion, *Reduction, *Simple system
1. Ebbinghaus' method was very efficient and influential, but not without opponents. For a revolt against the use of nonsense material to test the faculties of human learning and memory, see Bartlett (1932; *classic), and *real-life memory.