Cognitive Psychology and Cognitive Neuroscience/Present and Future of Research
"It's hard to make predictions - especially about the future." Robert Storm Petersen
- 1 Introduction / Until now
- 2 Today's approaches
- 3 Future Research
- 4 Conclusion
- 5 References
- 6 Links
Introduction / Until now
Developing out of the information-processing approach, present-day cognitive psychology differs from classical psychological approaches both in the methods used and in its interdisciplinary connections to other sciences. Apart from rejecting introspection as a valid method for analysing mental phenomena, cognitive psychology has introduced further, mainly computer-based, techniques that were not available to classical psychology.
By using brain-imaging techniques such as fMRI, cognitive psychology can analyse the relation between the physiology of the brain and mental processes. In the future, cognitive psychology will rely on computer-related methods even more than it already does, profiting from improvements in information technology. fMRI scans, for example, still suffer from many possible sources of error; as these are resolved, the technique will become more powerful and precise. In addition, the computational approach can be combined with the classical behavioural approach, in which a participant's mental states are inferred from the behaviour they show.
Cognitive psychology does not merely borrow methods developed by other sciences; it also collaborates with related disciplines such as artificial intelligence, neuroscience, linguistics and the philosophy of mind. The advantage is clear: different perspectives on a topic make it possible to confirm results from one field or to open up new approaches to the study of the mind. Modern studies in cognitive psychology increasingly criticise the classical information-processing approach, which leaves room for other approaches to gain importance. For example, the classical approach has been modified into a parallel information-processing approach, which is thought to be closer to the actual functioning of the brain.
The current use of brain imaging
How are the known brain-imaging methods used, and what kind of information can be derived from them?
fMRI is a non-invasive imaging method that pictures active structures of the brain at a high spatial resolution. The participant lies in a scanner tube while the brain is imaged; structures that become active while the participant performs a task can then be recognised in the recordings.
When parts of the brain are active, their metabolism is stimulated as well. Blood, which plays an important role in metabolic transport, flows to the active nerve cells. The haemoglobin in the red blood cells carries oxygen (oxyhaemoglobin) on the way to the active region, which needs the oxygen in order to work. On consumption, the haemoglobin "delivers" its oxygen (deoxyhaemoglobin). This leads to local changes in the relative concentrations of oxyhaemoglobin and deoxyhaemoglobin, as well as in local blood volume and blood flow. Oxygenated haemoglobin is diamagnetic (it tends to be pushed out of a magnetic field), whereas deoxygenated haemoglobin is paramagnetic (the opposite: it tends to be drawn into a magnetic field). The magnetic resonance signal of blood therefore differs slightly depending on its level of oxygenation.
By detecting these magnetic properties, the fMRI scanner can determine alterations in blood flow and blood volume and construct a picture of the brain and its activated parts. While the participant performs a task, the researcher can thus infer which brain regions are involved. The data are indirect, however, since what is measured is the metabolism rather than the neuronal activity itself. As a consequence of this principle, the method also has a low temporal resolution.
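The indirect, sluggish nature of this signal can be illustrated with a small simulation: the measured BOLD signal is approximately the neural time course convolved with a haemodynamic response function. The Python sketch below uses an invented gamma-shaped response and an invented two-second task block; it illustrates the principle only and is not a calibrated model.

```python
import math

def hrf(t):
    """Simplified gamma-shaped haemodynamic response function.
    The shape (peak around 5 s) is illustrative, not a fitted value."""
    if t < 0:
        return 0.0
    return (t ** 5) * math.exp(-t) / math.gamma(6)

# Neural activity: a 2 s task block starting at t = 2 s (1 Hz sampling).
neural = [1.0 if 2 <= t < 4 else 0.0 for t in range(30)]

# The measured BOLD signal is (approximately) the convolution of the
# neural time course with the HRF -- hence the low temporal resolution:
bold = [sum(neural[k] * hrf(t - k) for k in range(t + 1)) for t in range(30)]

peak = max(range(30), key=lambda t: bold[t])
print(peak)  # the BOLD peak lags the stimulus onset by several seconds
```

Because the haemodynamic response peaks only seconds after the neural event and smears it out in time, two events occurring close together are barely distinguishable in the BOLD trace, which is exactly the low temporal resolution described above.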
The electroencephalogram (EEG) is another non-invasive brain-imaging method. Electrical signals from the human brain are recorded while the participant performs a task; what is measured is the summed electrical activity of the neuronal cells.
The electrical activity is measured by attaching electrodes to the scalp, in most cases mounted on a cap that the participant wears. Installing the cap correctly is time-consuming, but it is crucial for the outcome that every electrode is in the right place: to ensure that the signals sum, the electrodes have to be installed in a precise geometric, parallel configuration. This technique is applied to measure event-related potentials (ERPs), potential changes that are temporally correlated with an emotional, sensory, cognitive or motor event. In such an experiment a certain event is repeated again and again, and the typical ERP can then be extracted by averaging. The method is not only time-consuming; many disrupting factors also complicate the measurement. Moreover, EEG has a very high temporal resolution but a very low spatial resolution: it is hardly possible to measure activity in deeper brain regions or to detect the source of the activity from the recordings alone.
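The trial-averaging logic behind ERP extraction can be sketched in a few lines of Python. The component latency, amplitude and noise level below are invented for illustration: averaging many time-locked trials cancels zero-mean noise, while the event-locked potential, identical in every trial, survives.

```python
import random

random.seed(1)

def trial(n_samples=100):
    """One simulated EEG trial: a small ERP 'component' buried in noise.
    The 10 microvolt peak at sample 30 is an invented illustration."""
    return [(10.0 if i == 30 else 0.0) + random.gauss(0, 20)
            for i in range(n_samples)]

trials = [trial() for _ in range(500)]

# Average across time-locked trials: the noise (mean 0) cancels out,
# the event-related potential remains.
erp = [sum(t[i] for t in trials) / len(trials) for i in range(100)]

latency = max(range(100), key=lambda i: abs(erp[i]))
print(latency)  # the component emerges at its true latency, sample 30
```

A single trial is dominated by noise (standard deviation twice the signal amplitude here), yet after 500 averaged repetitions the component stands out clearly; this is why ERP experiments must repeat the same event so many times.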
Cognitive Science

Cognitive science is a multidisciplinary science. It comprises areas of cognitive psychology, linguistics, neuroscience, artificial intelligence, cognitive anthropology, computer science and philosophy, and it concentrates on studying the intelligent behaviour of humans, including perception, learning, memory, thought and language. Research in the cognitive sciences is based on naturalistic research methods such as cognitive neuropsychology, introspection, psychological experimentation, mathematical modelling and philosophical argumentation.
In the early days of the cognitive sciences the most common method was introspection, in which the test subjects evaluated their own cognitive processes. In these experiments the researchers used experienced subjects, because the subjects had to analyse and report their own thinking. Problems can occur when the results are interpreted and a subject gives different reports of the same action. Obviously a clear separation is needed between the matters that can be studied by introspection and those for which this method is not adequate.
Computational modelling in cognitive science treats the mind as a machine. This approach seeks to express theoretical ideas through computational models that generate behaviour similar to that of humans. Mathematical modelling is often based on flow charts, and the quality of the model is crucial for ensuring that its inputs and results correspond to those of the system being modelled.
Nowadays researchers in the cognitive sciences often use theoretical and computational models. "This does not exclude their primary method of experimentation with human participants. In cognitive sciences it is also important to bring the theories and the experimenting together. Because it comprises so many fields of science it is important to bring together the most appropriate methods from all these fields. The psychological experiments should be interpreted through a theory that expresses mental representations and procedures. The most productive and revealing way to perform research in cognitive sciences is to combine different approaches and methods together. This ensures overall picture from the research area and it comprises the viewpoints of all the different fields." (Thagard, Cognitive Science)

Nevertheless, cognitive science has not yet managed to bring the different areas together. It is criticised for not having established itself as a science of its own: rather few scientists really describe themselves as cognitive scientists. Furthermore, the basic metaphor of the brain functioning like a computer is challenged, as are the distinctions between such models and nature (cf. Eysenck & Keane, Cognitive Psychology, pp. 519-520). This, of course, leaves a lot of work for the future. Cognitive science has to work on better models that explain natural processes and that are reliably able to make predictions, and these models have to combine multiple mental phenomena. In addition, a general "methodology for relating a computational model's behaviour to human behaviour" has to be worked out, whereby the strength of such models can be increased. Apart from that, cognitive science needs to establish an identity with prominent researchers who avow themselves to the field. And finally its biggest goal, the creation of a general unifying theory of human cognition (see Theory Part), has to be reached (cf. ibid, p. 520).
Experimental Cognitive Psychology
Psychological experimentation studies mental functions indirectly, by inference: the researcher observes visible actions and draws conclusions from these observations. Such studies are performed to find causal relations and the factors influencing behaviour. Variables are changed one at a time and the effect of each change is observed. The benefit of experimental research is that the manipulated factors can be altered in nearly any way the researcher wants, which is what finally makes it possible to establish causal relations.
As the classical approach within cognitive psychology, experimental studies have been the basis for the development of numerous modern approaches in the contemporary field. Its empirical methods have been developed and verified over time, and the results they yielded were a foundation for many of the advances contributed to psychology.
Given the established character of experimental cognitive psychology, one might think that methodological changes are rather negligible. Recent years, however, have brought a discussion of whether the results of experimental cognitive psychology remain valid in the "real world" at all. A major objection is that the artificial environment of an experiment may cause certain facts and coherences to be unintentionally ignored, since for reasons of clarity numerous factors are suppressed (cf. Eysenck & Keane, Cognitive Psychology, pp. 514-515). Research on attention is a possible example: since the attention of the participant is mainly governed by the experimenter's instructions, its focus is essentially predetermined. Therefore "relatively little is known of the factors that normally influence the focus of attention" (ibid, p. 514).

Furthermore, it turns out to be problematic that mental phenomena are often examined in isolation. In trying to make the experimental setup as concise as possible (in order to get clearly interpretable results), one decouples the aspect at issue from adjacent and interacting mental processes. As a result, the findings may be valid only in the idealised experimental setting and not in "real life", where multiple mental phenomena interact with each other and numerous outer stimuli influence mental processes. The validity gained by such studies can then only be characterised as internal validity (the results hold under the special circumstances created by the experimenter) but not as external validity (the results remain valid under changed, more realistic circumstances) (cf. ibid, p. 514). These objections have led to experiments designed to refer more closely to "real life".
In line with such experiments, "real-world" phenomena like absent-mindedness, everyday memory or reading gain importance. Nevertheless, the discussion continues as to whether such experiments really deliver new information about mental processes, and whether these 'everyday phenomenon' studies become broadly accepted greatly depends on the results that current experiments deliver.
Another issue concerning experimental setups in cognitive psychology is the way individual differences are handled. In general, the results of an experiment are generated by an analysis of variance, so that effects due to individual differences are averaged out and not taken into further consideration. Such a procedure seems highly questionable, especially in the light of an investigation by Bowers (1973), which showed that over 30% of the variance in such studies is due to individual differences or their interaction with the current situation (cf. ibid, p. 515). One challenge for future experimental cognitive psychology is therefore the analysis of individual differences and finding ways to include knowledge about such differences in general studies.
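A toy numerical example (with invented reaction-time data) shows how averaging across participants can hide systematic individual differences:

```python
# Hypothetical reaction-time changes (ms) under some manipulation: half
# the participants speed up, half slow down. All numbers are invented.
fast_group = [-40, -35, -45, -38, -42]   # benefit from the manipulation
slow_group = [+41, +36, +44, +39, +40]   # are hurt by it

everyone = fast_group + slow_group
grand_mean = sum(everyone) / len(everyone)
print(round(grand_mean, 1))  # close to 0: the average suggests "no effect"

# Yet the variance reveals that something systematic is going on:
variance = sum((x - grand_mean) ** 2 for x in everyone) / len(everyone)
print(variance > 1000)  # large spread, driven by individual differences
```

An analysis that reports only the grand mean would conclude the manipulation does nothing, even though it affects every single participant strongly; this is the kind of information lost when individual differences are averaged out.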
Cognitive Neuroscience

Another approach towards a better understanding of human cognition is cognitive neuroscience, which lies at the interface between traditional cognitive psychology and the brain sciences. Its approach is characterised by attempts to derive cognitive-level theories from various types of information, such as the computational properties of neural circuits, patterns of behavioural damage resulting from brain injury, or measurements of brain activity during the execution of cognitive tasks (cf. www.psy.cmu.edu). Cognitive neuroscience helps us understand how the human brain supports thought, perception, affect, action, social processes and other aspects of cognition and behaviour, including how such processes develop and change in the brain over time (cf. www.nsf.gov).
Cognitive neuroscience has emerged in the last decade as an intensely active and influential discipline, forged from interactions among the cognitive sciences, neurology, neuroimaging, physiology, neuroscience, psychiatry and other fields. New methods for the non-invasive functional neuroimaging of subjects performing psychological tasks have been of particular importance for this discipline. They include positron emission tomography (PET), functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), optical imaging (near-infrared spectroscopy, NIRS), anatomical MRI and diffusion tensor imaging (DTI). The findings of cognitive neuroscience are directed towards enabling a basic scientific understanding of a broad range of issues involving the brain, cognition and behaviour (cf. www.nsf.gov).
Cognitive neuroscience has become a very important approach to understanding human cognition, since its results can clarify functional brain organisation, such as the operations performed by a particular brain area and the system of distributed, discrete neural areas supporting a specific cognitive representation. These findings can also reveal the effect of individual differences (including even genetic variation) on brain organisation (cf. www.psy.cmu.edu, www.nsf.gov). A further strength is that cognitive neuroscience provides ways to "obtain detailed information about the brain structures involved in different kinds of cognitive processing" (Eysenck & Keane, Cognitive Psychology, p. 521). Techniques such as MRI and CAT scans have proved of particular value when used on patients to discover which brain areas are damaged; before these non-invasive methods were developed, the localisation of "brain damage could only be established by post mortem examination" (ibid). Knowing which brain areas are related to which cognitive processes yields a clearer view of the brain's regional organisation and hence, in the end, a better understanding of human cognition. Cognitive neuroscience also serves as a tool to demonstrate the reality of theoretical distinctions. For example, many theorists have argued that implicit memory can be divided into perceptual and conceptual implicit memory; support for that view has come from PET studies showing that perceptual and conceptual priming tasks affect different areas of the brain (cf. ibid, pp. 521-522). However, cognitive neuroscience cannot stand alone and answer all questions about human cognition; it has limitations concerning data collection and data validity. In most neuroimaging studies, data are collected from several individuals and then averaged.
Some concern has arisen about such averaging because of the existence of significant individual differences. Raichle (1998) replied that differences between individual brains should be appreciated, but that general organising principles emerge which transcend these differences; a broadly accepted solution to the problem has nevertheless yet to be found (cf. ibid, p. 522).
Cognitive Neuropsychology

Cognitive neuropsychology maps the connection between brain functions and cognitive behaviour. Patients with brain damage have been the most important source of research in neuropsychology, which also examines dissociations ("forgetting"), double dissociations and associations (connections between two things formed by cognition). Neuropsychology uses technological research methods to create images of the functioning brain. There are many different techniques for scanning the brain; the most common ones are EEG (electroencephalography), MRI and fMRI (functional magnetic resonance imaging) and PET (positron emission tomography).
Cognitive neuropsychology has become very popular since it delivers good evidence: theories developed for normal individuals can be tested on patients with brain damage, and new theories have been established on the basis of neuropsychological experiments. Nevertheless, certain limitations of the approach as it stands today cannot be ignored. First of all, people with the same mental disability often do not have the same lesion (cf. ibid, pp. 516-517). In such cases researchers have to be careful with their interpretation: in general it can only be concluded that all the areas injured in these patients could play a role in the mental phenomenon, not which part is really decisive. For that reason, future experiments in this area tend to use a rather small number of people with very similar lesions, or to compare the results of groups with similar syndromes but different lesions. In addition, the situation often turns out to be the reverse: some patients have very similar lesions but show rather different behaviour (cf. ibid, p. 517). One probable reason is that the patients differ in age and lifestyle (cf. Banich, Neuropsychology, p. 55). With better technologies it will become possible to distinguish the cases in which the different personalities really make the difference from those in which the lesions are not entirely equal, and the individual brain structures that may cause the different reactions to lesions will become a focus of research. Another problem for cognitive neuropsychology is that suitable patients are rare: those who are interesting for such research have lesions from accidents or injuries suffered during war, and the lesions themselves differ in kind.
Often multiple brain regions are damaged, which makes it very hard to determine which of them is responsible for the phenomenon examined. The dependence on chance as to whether suitable patients are available will remain in the future, so predictions concerning this aspect of the research are not very reliable. Apart from that, it is not yet possible to localise some mental processes, such as creative thought or organisational planning, in the brain (cf. Eysenck & Keane, Cognitive Psychology, p. 517). A possible outcome of this research is that those activities rely on parallel processing, which would support the modification of the information-processing theory discussed later on. If it turns out that many mental processes depend on such parallel processing, however, this would be a major drawback for cognitive neuropsychology, since its core assumption is the modularisation of the brain and the corresponding phenomena. In this context the risks of overestimation and underestimation have to be mentioned. Underestimation occurs because researchers often identify only the most important brain region for a mental task, while other regions related to it are ignored; this could turn out to be fundamental if parallel processing really is crucial to many mental activities. Overestimation occurs when fibres that merely pass through the damaged brain region are lesioned too: the researcher then concludes that this region plays an important role in the phenomenon under analysis, even though the information only passed through it (cf. ibid). Modern technologies and experiments have to be developed here in order to provide valid and precise results.
A unified theory of cognitive science would bring together all the vantage points one can take towards the brain and mind. If a theory could be formed which incorporates all the discoveries of the disciplines mentioned above, a full understanding would be within reach.
ACT-R (Adaptive Control of Thought-Rational) is a cognitive architecture that provides tools for modelling human cognition. It consists mainly of five components: perceptual-motor modules, declarative memory, procedural memory, chunks and buffers. The declarative memory stores facts in "knowledge units", the chunks. These are transmitted through the modules' respective buffers, each of which contains one chunk at a time. The procedural memory is the only component without a buffer of its own, but it can access the contents of the other buffers, for example those of the perceptual-motor modules, which are the interface with the (simulated) outer world. Production is accomplished by predefined rules written in LISP. The main figure behind ACT-R is John R. Anderson, who attributes the inspiration to Allen Newell.
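A drastically simplified sketch of this control cycle can be given in Python (real ACT-R models are written in LISP); all names, slot labels and the toy addition task below are illustrative, not actual ACT-R syntax:

```python
# Declarative memory holds chunks: typed bundles of slot-value pairs.
declarative_memory = [
    {"type": "addition-fact", "addend1": 2, "addend2": 3, "sum": 5},
    {"type": "addition-fact", "addend1": 3, "addend2": 4, "sum": 7},
]

# Buffers each hold at most one chunk at a time.
buffers = {"goal": {"task": "add", "addend1": 2, "addend2": 3, "answer": None},
           "retrieval": None}

def retrieve(pattern):
    """Declarative module: return the first chunk matching the request."""
    for chunk in declarative_memory:
        if all(chunk.get(k) == v for k, v in pattern.items()):
            return chunk
    return None

# One "production rule": if the goal needs an answer, request a matching
# fact from declarative memory and copy its sum into the goal buffer.
goal = buffers["goal"]
if goal["task"] == "add" and goal["answer"] is None:
    buffers["retrieval"] = retrieve({"type": "addition-fact",
                                     "addend1": goal["addend1"],
                                     "addend2": goal["addend2"]})
    goal["answer"] = buffers["retrieval"]["sum"]

print(goal["answer"])  # 5
```

The point of the sketch is the division of labour: procedural knowledge (the rule) never inspects declarative memory directly, only the buffers, which is the architectural constraint the paragraph above describes.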
SOAR (State, Operator And Result) is another cognitive architecture. It makes it possible to model complex human capabilities, with the goal of creating an agent with human-like behaviour. Its working principles are the following: problem solving is a search through a problem space; permanent knowledge is represented by production rules in the production memory; temporary knowledge is represented by objects in the working memory; new goals are created only when a dead end is reached; and the learning mechanism is chunking. If SOAR encounters an impasse and is unable to resolve it with the usual technique, it uses "weaker" strategies to circumvent the dead end. In case one of these attempts leads to success, the respective route is saved as a new rule, a chunk, preventing the impasse from occurring again. SOAR was created by John Laird, Allen Newell and Paul Rosenbloom.
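Chunking can be thought of as caching the outcome of deliberate subgoal search so that it later fires directly as a rule. The following Python sketch of this idea is loose and illustrative; the toy problem space ("reach 10 by choosing +1 or +2") and all names are invented, not SOAR syntax:

```python
production_memory = {}   # learned rules ("chunks"): state -> operator

def search_subgoal(state):
    """Fallback problem-space search, used only when no rule matches."""
    return "+2" if 10 - state >= 2 else "+1"

lookups = {"rule": 0, "search": 0}

def decide(state):
    if state in production_memory:          # a learned chunk fires directly
        lookups["rule"] += 1
        return production_memory[state]
    lookups["search"] += 1                  # impasse: fall back to search
    op = search_subgoal(state)
    production_memory[state] = op           # chunking: save the solution
    return op

for _ in range(2):                          # solve the same problem twice
    state = 0
    while state < 10:
        state += int(decide(state))

print(lookups)  # the second run is answered entirely by learned chunks
```

On the first run every decision requires search; on the second run every decision is answered by a stored chunk, mirroring how chunking prevents an impasse from recurring.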
There are two types of neural networks: biological and artificial.
A biological neural network consists of neurons which are physically or functionally connected with each other. Since each neuron can connect to many other neurons, the number of possible connections is extremely large. The connections between neurons are called synapses; signalling across them is either electrical or chemical, via various neurotransmitters which in turn induce electrical signals.
Artificial neural networks are divided according to their goals: artificial intelligence on the one hand and cognitive modelling on the other. Cognitive-modelling networks try to simulate biological neural networks, such as the brain, in order to gain a better understanding of them. Until now the complexity of the brain and similar structures has prevented a complete model from being devised, so cognitive modelling focuses on smaller parts such as specific brain regions. Networks in artificial intelligence are used to solve distinct problems. Though their goals differ, the methods applied are very similar. An artificial neural network consists of artificial neurons (nodes) connected by mathematical functions; these functions can be composed of other functions, which in turn can be composed of yet other functions, and so on. The actual work is done by following the connections according to their weights. Weights are properties of the connections that determine how strongly a specific route influences the outcome; the program can change them, thus optimising the main function. In this way it becomes possible to solve problems for which it is impossible to write a function "by hand".
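The idea of learning by adjusting connection weights can be illustrated with the smallest possible artificial network: a single perceptron trained on the logical AND function, a classic textbook example, sketched here in pure Python:

```python
# A minimal artificial neural "network": one perceptron learning the
# logical AND function by adjusting its connection weights.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]   # connection weights
b = 0.0          # bias
rate = 0.1       # learning rate

def predict(x):
    """Fire (output 1) if the weighted sum of inputs exceeds the threshold."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                      # a few passes over the data
    for x, target in data:
        error = target - predict(x)      # learning = changing the weights
        w[0] += rate * error * x[0]
        w[1] += rate * error * x[1]
        b += rate * error

print([predict(x) for x, _ in data])  # [0, 0, 0, 1] -- AND has been learned
```

The program is never told what AND means; the correct behaviour emerges purely from repeated weight adjustments, which is the sense in which such networks solve problems no one wrote a function for "by hand".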
Brain imaging and activity measurement
As described in sections 2.1 and 2.2, the brain-imaging methods have complementary disadvantages: fMRI has a low temporal resolution, EEG a low spatial resolution. An interdisciplinary attempt is to combine both methods in order to reach a high spatial and a high temporal resolution at once. This technique (simultaneous EEG measurement in the fMRI scanner) is used, for instance, in studying children with extratemporal epilepsy, where it is important to assign the temporal progress of a seizure to the region in which it has its roots. In December 2006 a conference in Munich discussed another application of this combination of methods: the study of Alzheimer's disease. It could become possible to recognise the disease very early, which could lead to new therapies that reduce the speed and extent of cell death. Brain-imaging methods are not only useful in medical contexts; other disciplines could benefit from them and derive new conclusions. For social psychologists, for instance, brain imaging is interesting: experiments with psychopathic personalities are only one possibility for exploring human behaviour. For literary scholars there could be a possibility to study stylistic devices and their effect on humans while reading a poem. Another aim of future research is to synchronise the direction of gaze with the stimulus that triggered the change of direction, a complex project that needs data from eye-tracking experiments as well as from fMRI studies.
Making unifying theories more unifying
Since the mind is a single system, it should be possible to explain it as such, without having to take a different perspective for every approach (neurological, psychological, computational). Having such a theory would enable us to understand our brain far more thoroughly than we do now, and might eventually lead to everyday applications. But so far there is no working unified theory of cognition that fulfils the requirements stated by Allen Newell in his book Unified Theories of Cognition. According to Newell, a UTC has to explain:
- how intelligent organisms respond flexibly to the environment;
- how they exhibit goal-directed behaviour and choose goals rationally (also in response to interrupts: see the previous point);
- how they use symbols;
- how they learn from experience.
Even Newell's own implementation, SOAR, does not reach these goals.
The following abstracts summarise a few recent findings.
>>Unintentional language switching
Kho, K.H., Duffau, H., Gatignol, P., Leijten, F.S.S., Ramsey, N.F., van Rijen, P.C. & Rutten, G-J.M. (2007), Utrecht. Abstract:
We present two bilingual patients without language disorders in whom involuntary language switching was induced. The first patient switched from Dutch to English during a left-sided amobarbital Wada-test. Functional magnetic resonance imaging yielded a predominantly left-sided language distribution similar for both languages. The second patient switched from French to Chinese during intraoperative electrocortical stimulation of the left inferior frontal gyrus. We conclude that the observed language switching in both cases was not likely the result of a selective inhibition of one language, but the result of a temporary disruption of brain areas that are involved in language switching. These data complement the few lesion studies on (involuntary or unintentional) language switching, and add to the functional neuroimaging studies of switching, monitoring, and controlling the language in use.
>>Bilateral eye movements and memory
Parker, A. & Dagnall, N. (2007), Manchester Metropolitan University. Abstract:
One hundred and two participants listened to 150 words, organised into ten themes (e.g. types of vehicle), read by a male voice. Next, 34 of these participants moved their eyes left and right in time with a horizontal target for thirty seconds (saccadic eye movements); 34 participants moved their eyes up and down in time with a vertical target; the remaining participants stared straight ahead, focussed on a stationary target. After the eye movements, all the participants listened to a mixture of words: 40 they'd heard before, 40 completely unrelated new words, and 10 words that were new but which matched one of the original themes. In each case the participants had to say which words they'd heard before, and which were new. The participants who'd performed sideways eye movements performed better in all respects than the others: they correctly recognised more of the old words as old, and more of the new words as new. Crucially, they were fooled less often by the new words whose meaning matched one of the original themes - that is, they correctly recognised more of them as new. This is important because mistakenly identifying one of these 'lures' as an old word is taken as a laboratory measure of false memory. The performance of the participants who moved their eyes vertically, or who stared ahead, did not differ from each other. Episodic memory improvement induced by bilateral eye movements is hypothesized to reflect enhanced interhemispheric interaction, which is associated with superior episodic memory (S. D. Christman & R. E. Propper, 2001). Implications for neuropsychological mechanisms underlying eye movement desensitization and reprocessing (F. Shapiro, 1989, 2001), a therapeutic technique for posttraumatic stress disorder, are discussed.
>>Is the job satisfaction-job performance relationship spurious? A meta-analytic examination
Nathan A. Bowling (Department of Psychology, Wright State University). Abstract:
The job satisfaction–job performance relationship has attracted much attention throughout the history of industrial and organizational psychology. Many researchers and most lay people believe that a causal relationship exists between satisfaction and performance. In the current study, however, analyses using meta-analytic data suggested that the satisfaction–performance relationship is largely spurious. More specifically, the satisfaction–performance relationship was partially eliminated after controlling for either general personality traits (e.g., Five Factor Model traits and core self-evaluations) or for work locus of control and was almost completely eliminated after controlling for organization-based self-esteem. The practical and theoretical implications of these findings are discussed.
>>Mirror-touch synesthesia is linked with empathy
Michael J Banissy & Jamie Ward (Department of Psychology, University College London)
Abstract:
Watching another person being touched activates a similar neural circuit to actual touch and, for some people with 'mirror-touch' synesthesia, can produce a felt tactile sensation on their own body. In this study, we provide evidence for the existence of this type of synesthesia and show that it correlates with heightened empathic ability. This is consistent with the notion that we empathize with others through a process of simulation.
Where are the limitations of research? Can we rely on our intuitive idea of our mind? What impact could a complete understanding of the brain have on everyday life?
Brain activity as a false friend
In several experiments the outcome is ambiguous, which hinders a direct interpretation of the data. In experiments with psychopathic personalities, researchers had to weaken their thesis that persons with missing activity in the frontal lobe are predetermined to become violent psychopaths and unethical murderers. Missing activity in the frontal lobe leads to a dysregulation of the threshold for emotional, impulsive or violent actions. But this can also be an advantage, for example for fire fighters or police officers, who have to withstand strong pressure and who need a higher threshold. Missing frontal-lobe activity is therefore not a sufficient criterion for a psychopathic personality.
Today's work in the field of cognitive psychology gives several hints as to what future work in this area may look like. In practical applications, improvements will probably be driven mainly by the limitations one faces today; here in particular the newer subfields of cognitive psychology will develop quickly. What such changes will look like depends heavily on the character of future developments in technology: improvements in cognitive neuropsychology and cognitive neuroscience especially depend on advances in imaging techniques. The theoretical framework of the field will be influenced by such developments as well. The parallel-processing theory may be modified further according to new insights from computer science, and thereby, or eventually through the acceptance of one of the already existing overarching theories, the theoretical basis of current research could be reunified. Whether it takes another thirty years to fulfil Newell's dream of such a theory, or whether it will happen rather quickly, is still open. As a rather young science, cognitive psychology is still subject to elementary changes; all its practical and theoretical domains are steadily modified. Whether the trends mentioned in this chapter are dead ends or will cause a revolution of the field is hard to predict.
Anderson, John R. & Lebiere, Christian: The Atomic Components of Thought. Lawrence Erlbaum Associates, 1998.
Banich, Marie T.: Neuropsychology - The Neural Bases of Mental Function. Houghton Mifflin Company, 1997.
Goldstein, E. Bruce: Cognitive Psychology. Wadsworth, 2004.
Lyon, G. Reid & Rumsey, Judith M.: Neuroimaging. A Window to the Neurological Foundations of Learning and Behaviour in Children. Baltimore, 1996.
Eysenck, M. W. & Keane, M. T.: Cognitive Psychology - A Student's Handbook. Psychology Press Ltd, 2000.
Thagard, Paul: "Cognitive Science", in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy, 2004.