COGENT Publications

This page contains abstracts of, and pointers to, a number of COGENT-related publications. First and foremost, there is the LEA text on cognitive modelling that uses COGENT.
In addition to the book, there are a number of relevant papers, grouped here under four headings:

Papers that describe COGENT

COGENT: A visual design environment for cognitive modeling

Abstract: COGENT is a design environment for modeling cognitive processes and systems. It permits psychologists to construct and test information-processing models of cognition based on traditional box/arrow diagrams. COGENT provides a graphical model editor together with a set of standard types of cognitive module based on familiar theoretical constructs from psychological theory. Models are constructed by selecting appropriate box types, connecting them with appropriate communication links, and configuring the various boxes according to the requirements of the investigator. Once a model has been constructed it may be executed to examine and analyze its behavior.

Availability: PDF Preprint.

Full reference: Cooper, R. & Fox, J. (1998). COGENT: A visual design environment for cognitive modeling. Behavior Research Methods, Instruments & Computers, 30, 553-564.
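
Since COGENT is a graphical environment, the following Python fragment is purely illustrative: it mimics the box/arrow idea described in the abstract above, with named boxes (a buffer and a process) joined by communication links along which items are passed when the model is run. The class names and the example model are hypothetical and are not part of COGENT.

# Illustrative sketch only: a toy box/arrow model in the spirit of the
# abstract above. This is not COGENT's API; all names are hypothetical.

class Buffer:
    """A storage box that simply holds whatever items are sent to it."""
    def __init__(self, name):
        self.name = name
        self.items = []

    def receive(self, item):
        self.items.append(item)


class Process:
    """A processing box that transforms incoming items and forwards the
    results along its outgoing links (the arrows of the diagram)."""
    def __init__(self, name, rule):
        self.name = name
        self.rule = rule      # function applied to each incoming item
        self.links = []       # outgoing connections to other boxes

    def connect(self, box):
        self.links.append(box)

    def receive(self, item):
        result = self.rule(item)
        for box in self.links:
            box.receive(result)


# Assemble a minimal two-box model: a process that doubles its input
# and sends the result on to a buffer, then run it on one item.
working_memory = Buffer("Working Memory")
doubler = Process("Doubler", rule=lambda x: 2 * x)
doubler.connect(working_memory)
doubler.receive(21)
print(working_memory.items)   # -> [42]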


COGENT: An environment for the development of cognitive models

Abstract: COGENT is a modelling environment which aims to improve the methodology of computational modelling by providing an integrated approach to model development, description, and evaluation. The environment is explicitly designed to allow psychologists with a range of computational expertise (from very little to a great deal) to develop their own computational models. It achieves these aims by providing the user with a graphical programming language based on the familiar box/arrow notation of informal cognitive psychology. This language draws on parallels between the psychological concept of functional modularity and the computational concept of object-orientation to provide a sound modelling tool in which models may be evaluated through methodologically rigorous computational experiments. COGENT has been used by a number of researchers to develop models in the domains of memory, reasoning, decision making, problem solving, the performance of complex tasks, and concept combination. This chapter presents an overview of COGENT, drawing upon two models to illustrate the system. The first, a production system model of multi-column addition, demonstrates many of the fundamental features of the system. The second, a model of Allen inferencing, uses the new "analogue buffer" features of the system to reimplement the metrical algorithm described by Berendt (1996).

Availability: PDF.

Full reference: Cooper, R., Yule, P., Fox, J. & Sutton, D. (1998): COGENT: An environment for the development of cognitive models. In Schmid, U., Krems, J. F., & Wysotzki, F. (eds.) A Cognitive Science Approach to Reasoning, Learning and Discovery, Pabst Science Publishers, Lengerich, Germany. pp. 55-82.


Papers that describe or discuss COGENT models

Cue selection and category learning: A systematic comparison of three theories

Abstract: We evaluate three approaches to tasks involving categorisation with probabilistic cues (the Bayesian approach, the associationist approach, and the hypothesis testing approach) by comparing the behaviour of three classes of cognitive model with that of human participants on a simulated medical diagnosis task. The task yields dependent measures relating to both categorisation accuracy and cue selection. A systematic exploration of the effects of processing biases within the models reveals that all three approaches are able to account for the effects in the human data, provided that appropriate performance factors and processing biases are incorporated. The discussion focuses on the methodology used to evaluate the approaches and on the role of performance factors and processing biases within the various models.

Availability: PDF.

Full reference: Cooper, R., Yule, P. & Fox, J. (2003): Cue selection and category learning: A systematic comparison of three theories. Cognitive Science Quarterly, 3, 143-182.

Comparative modelling of learning in a decision making task

Abstract: In this paper we compare the behaviour of three competing accounts of decision making under uncertainty (a Bayesian account, an associationist account, and a hypothesis testing account) with subject behaviour in a medical diagnosis task. The task requires that subjects first learn a set of symptom-disease associations. Later, subjects are required to form diagnoses based on limited symptom information. The competing theoretical accounts are embodied in three computational models, each with a single parameter governing the learning rate. Subjects' diagnostic accuracy was used to calibrate the learning rates of the models. The resulting parameter-free models were then used to predict subjects' behaviour in the second part of the diagnosis task. Little correlation was found between subject behaviour and the Bayesian model. The correlation was higher in the case of the associationist network model, but the hypothesis testing account proved to provide the most adequate account of the data.

Availability: PDF.

Full reference: Cooper, R. & Yule, P. (1999): Comparative modelling of learning in a decision making task. In Hahn, M. and Stoness, S. C. (eds.) Proceedings of the 21st Annual Conference of the Cognitive Science Society. Vancouver, BC, Canada. pp. 120-125.
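
The methodology sketched in the abstract, in which each model has a single learning-rate parameter calibrated against subjects' diagnostic accuracy before the model is used to generate predictions, can be illustrated with a small, purely hypothetical example. The Python sketch below uses an invented toy associative learner and invented data; it is not the authors' model and is included only to make the calibrate-then-predict logic concrete.

# Hypothetical illustration of a calibrate-then-predict methodology:
# fit a single learning-rate parameter so a toy model's training accuracy
# matches the observed human accuracy, then freeze it for prediction.
# This is NOT the authors' model; names, data and numbers are invented.

def run_learner(learning_rate, trials):
    """Toy associative learner: one weight per cue-disease pair,
    strengthened by a delta rule on every trial. Returns the proportion
    of trials on which the association was already above threshold."""
    weights = {}
    correct = 0
    for pair in trials:
        if weights.get(pair, 0.0) > 0.5:      # association strong enough?
            correct += 1
        w = weights.get(pair, 0.0)
        weights[pair] = w + learning_rate * (1.0 - w)   # delta-rule update
    return correct / len(trials)

# Invented training data and an invented observed human accuracy.
trials = [("fever", "flu"), ("rash", "measles")] * 20
observed_accuracy = 0.80

# Calibration: pick the learning rate whose simulated accuracy is closest
# to the human accuracy; the resulting model is then parameter-free.
candidates = [i / 100 for i in range(1, 100)]
best_rate = min(candidates,
                key=lambda r: abs(run_learner(r, trials) - observed_accuracy))
print("calibrated learning rate:", best_rate)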


Development of problem solving strategies: Strategy shift in the Tower of London task

Abstract: Problem solving performance of 3-4 year olds and 5-6 year olds was tested on two types of Tower of London problem: problems with determinate subgoal ordering and problems with ambiguous subgoal ordering. Children performed more poorly on problems with ambiguous subgoal ordering, as expected from previous research (e.g., Klahr & Robinson, 1981; Klahr, 1985). In addition, however, it was found that the difference in performance on the two types of problem was significantly greater for older children, suggesting that their problem solving strategies are more sensitive to a problem's subgoal structure. We report a series of computational models which explore strategies which the children might employ. Younger children's performance is best modelled by a hill-climbing approach with a single move look-ahead. In contrast, older children appear to employ a rudimentary form of Means-Ends Analysis.

Availability: PDF.

Full reference: Cooper, R. & Waldau, R. (1999): Development of problem solving strategies: Strategy shift in the Tower of London task. Submitted to Developmental Science.
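
As a rough illustration of what "hill climbing with a single move look-ahead" amounts to, the Python sketch below solves a simplified Tower of London problem by always taking the legal move whose immediate result best matches the goal configuration. The puzzle encoding and the scoring heuristic are assumptions made for this sketch; it is not the model reported in the paper, and on problems with ambiguous subgoal ordering such a greedy strategy can stall, which is the contrast the abstract draws with Means-Ends Analysis.

# Illustrative sketch only: greedy hill climbing with a one-move
# look-ahead on a simplified Tower of London. Encoding and heuristic
# are invented for this example.

CAPACITY = (3, 2, 1)     # classic Tower of London peg sizes

def legal_moves(state):
    """All (source, target) peg pairs that legally move one top ball."""
    return [(s, t) for s in range(3) for t in range(3)
            if s != t and state[s] and len(state[t]) < CAPACITY[t]]

def apply_move(state, move):
    source, target = move
    pegs = [list(p) for p in state]
    pegs[target].append(pegs[source].pop())   # move the top ball
    return tuple(tuple(p) for p in pegs)

def score(state, goal):
    """Count balls that already sit on their goal peg at the goal height."""
    return sum(1 for p in range(3)
               for i, ball in enumerate(state[p])
               if i < len(goal[p]) and goal[p][i] == ball)

def hill_climb(state, goal, max_moves=20):
    """Greedy solver: always take the move whose immediate result scores
    best against the goal; no deeper look-ahead is used."""
    for _ in range(max_moves):
        if state == goal:
            break
        state = max((apply_move(state, m) for m in legal_moves(state)),
                    key=lambda s: score(s, goal))
    return state

start = (("green", "red"), ("blue",), ())   # first item is the bottom ball
goal = (("green",), ("red",), ("blue",))
print(hill_climb(start, goal) == goal)      # -> True for this easy problem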


Modeling the training effects of kinaesthetic acuity measurement in children

Availability: Unavailable

Full reference: Sims, K. & Morton, J. (1998): Modeling the training effects of kinaesthetic acuity measurement in children. Journal of Child Psychology and Psychiatry and Allied Disciplines, 39, 731-746.


Normative and Information Processing Accounts of Decision Making

Abstract: The field of Judgement and Decision Making has for some time been dominated by normative theories which attempt to explain behaviour in mathematical terms. We argue that such approaches provide little insight into the cognitive processes which govern human decision making. The dominance of normative theories cannot be accounted for by the intractability of processing models. In support of this view, we present a processing account of performance on a simulated medical diagnosis task. The performance of the model, which includes learning, is compared with that of a normative (Bayesian) model, and with subject performance on the task. Although there are some caveats, the processing model is found to provide a more adequate account of subject performance than the Bayesian model.

Availability: PDF.

Full reference: Yule, P., Cooper, R. & Fox, J. (1998): Normative and Information Processing Accounts of Decision Making. In Gernsbacher, M. A. and Derry, S. J. (eds.) Proceedings of the 20th Annual Conference of the Cognitive Science Society. Madison, WI, USA. pp. 1176-1181.


Cognitive processing and knowledge representation in decision making under uncertainty

Abstract: This article is a contribution to the current debate on the role of cognitive theory in our understanding of human judgement and decision making under uncertainty. We argue, with Busemeyer et al. and others, that the theoretical and methodological traditions of the Judgement and Decision Making community and mainstream cognitive science are divergent to an undesirable extent, and that the exploitation of established concepts of information processing theories and knowledge representation would considerably strengthen the field. The paper revisits and extends an earlier study, "Making decisions under the influence of memory" (Fox, 1980), in order to explore how these proposals might be applied in practice. A central technique developed by cognitive scientists is that of computational modelling; the paper makes extensive use of a new modelling tool, COGENT, to show how cognitive theory can significantly illuminate the mental processes involved in complex real-world decisions.

Availability: Unavailable

Full reference: Fox, J. & Cooper, R. (1997): Cognitive processing and knowledge representation in decision making under uncertainty. In Scholz, R. W. & Zimmer, A. C. (eds.), Qualitative Aspects of Decision Making, Pabst Science Publishers, Lengerich, Germany. pp. 83-106.


Learning to make decisions under uncertainty: The contribution of qualitative reasoning

Abstract: The majority of work in the field of human judgement and decision making under uncertainty is based on the use and development of algebraic approaches, in which judgement is modelled in terms of mathematical choice functions. Such approaches provide no account of the mental processes underlying decision making. In this paper we explore a cognitive model (implemented within COGENT) of decision making developed in order to account for subject performance on a simulated medical diagnosis task. Our primary concern is with learning, and empirical results on human learning in the modelled task are also reported. Learning in the computational model shares many qualitative features with the human data. The results provide further support for cognitive (i.e., non-algebraic) approaches to decision making under uncertainty.

Availability: PDF.

Full reference: Cooper, R. & Fox, J. (1997): Learning to make decisions under uncertainty: The contribution of qualitative reasoning. In Shafto, M. G. and Langley, P. (eds.) Proceedings of the 19th Annual Conference of the Cognitive Science Society. Palo Alto, CA, USA. pp. 125-130.


Perseverative subgoaling in production system models of problem solving

Abstract: Perseverative subgoaling, the repeated successful solution of subgoals, is a common feature of much problem solving, and its pervasive nature suggests that it is an emergent property of a problem solving architecture. This paper presents a set of minimal requirements on a production system architecture for problem solving which will allow perseverative subgoaling whilst guaranteeing the possibility of recovery from such situations. The fundamental claim is that perseverative subgoaling arises during problem solving when the results of subgoals are forgotten before they can be used. This prompts further attempts at the offending subgoals. In order for such attempts to be effective, however, the production system must satisfy three requirements concerning working memory structure, production structure, and memory decay. The minimal requirements are embodied in a model (developed within the COGENT modelling software) which is explored with respect to the task of multicolumn addition. The inter-relationship between memory decay and task difficulty within this task (measured in terms of the number of columns) is discussed.

Availability: PDF.

Full reference: Cooper, R. (1996). Perseverative subgoaling in production system models of problem solving. In Cottrell, G. W. (ed.), Proceedings of the 18th Annual Conference of the Cognitive Science Society. San Diego, CA, USA. pp. 397-402.
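
A minimal sketch, under invented assumptions, of the mechanism described in the abstract: a subgoal's result is written to a working memory that decays probabilistically, and whenever the result has been lost before the parent goal uses it, the subgoal is simply attempted again. The Python code below illustrates that claim only; it is not the COGENT model reported in the paper.

# Illustrative sketch only: perseverative subgoaling emerges when subgoal
# results decay from working memory before the parent goal can use them,
# so the subgoal is re-attempted. Names and parameters are invented.
import random

random.seed(1)
DECAY_PROBABILITY = 0.4        # chance per cycle that an item is forgotten

working_memory = set()
attempts = 0

def cycle():
    """One production cycle: decay working memory, then fire one rule."""
    global attempts
    # Decay: any item may be forgotten before it is used.
    for item in list(working_memory):
        if random.random() < DECAY_PROBABILITY:
            working_memory.discard(item)
    # Rule 1: if the subgoal's result is available, complete the task.
    if "column-sum" in working_memory:
        working_memory.add("answer-written")
        return
    # Rule 2: otherwise (re-)attempt the subgoal, however many times it takes.
    attempts += 1
    working_memory.add("column-sum")

while "answer-written" not in working_memory:
    cycle()

print("subgoal attempts before the result was used:", attempts)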


The iteration of concept combination in Sense Generation

Abstract: We report work in progress on the computational modelling of a theory of concepts and concept combination. The sense generation approach to concepts provides a perspicuous way of treating a range of recalcitrant concept combinations: privative combinations (e.g., fake gun, stone lion, apparent friend). We argue that a proper treatment of concept combination must respect important syntactic constraints on the combination process, the simplest being the priority of the syntactic modifier over the head in case of conflicts. We present a model of privative concept combinations based on the sense generation approach. The model was developed using COGENT, an object-oriented modelling environment designed to simplify and clarify the implementation process by minimising the "distance" between the box/arrow "language" of psychological theorising and the theory's implementation. In addition to simple privatives (i.e., ones with a single modifier, like fake gun) the model also handles iterated, or complex, privative combinations (i.e., ones with more than one modifier, like fake stone lion), and reflects their associated modification ambiguities. We suggest that the success of this model reflects both the utility of COGENT as a modelling framework and the adequacy of sense generation as a theory of concept combination.

Availability: PDF.

Full reference: Cooper, R. & Franks, B. (1996): The iteration of concept combination in Sense Generation. In Cottrell, G. W. (ed.), Proceedings of the 18th Annual Conference of the Cognitive Science Society. San Diego, CA, USA. pp. 523-528.
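
Purely to make the "priority of the modifier over the head" constraint mentioned in the abstract concrete, the Python sketch below merges feature sets so that a modifier's features override the head's in case of conflict. The feature names and the representation are invented for this illustration; it is not the sense generation model itself and only shows one reading of the iterated case.

# Illustrative sketch only: modifier features override head features in
# conflicts. Not the sense generation model; features are invented.

def combine(head, *modifiers):
    """Return the head's features updated by each modifier in turn, so
    modifiers win any conflict with the head."""
    sense = dict(head)
    for modifier in modifiers:
        sense.update(modifier)
    return sense

lion = {"animate": True, "made-of": "flesh", "lion-shaped": True}
stone = {"animate": False, "made-of": "stone"}
fake = {"genuine": False}

# "stone lion", then "fake stone lion" on the fake-[stone lion] reading.
print(combine(lion, stone))
print(combine(lion, stone, fake))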


Developmental Constraints on a Theory of Memory

Availability: Unavailable

Full reference: Barreau, S. (1996): Developmental Constraints on a Theory of Memory. Ph.D. thesis, Department of Psychology, University College, London, UK.


A Headed Records Simulation of the Event Memory of 4 Year Olds

Availability: Unavailable

Full reference: Miller, G. (1996): A Headed Records Simulation of the Event Memory of 4 Year Olds. Master's thesis, Department of Computer Science, University College, London, UK.


Memory for, and the organization of, future intentions

Abstract: How do we retrieve intentions that have to be realised some considerable period of time after their formation? We present a theoretical account of the processes and knowledge structures that appear to be necessary for the retrieval of intentions designated for realisation during a particular day. The theoretical constructs we propose are informed primarily by the findings of a series of studies on naturally-occurring intentions (Ellis, 1988a; 1988b; Ellis & Nimmo-Smith, 1983) and differ from those commonly used in related research on memory and on artificial intelligence planning processes. Daily Routines, for example, are posited to represent the temporal relations between activities that are performed on a regular basis within this temporal period. We suggest that the retrieval of delayed intentions at an appropriate moment for action (when-realisation) requires extrinsic planning processes directed toward accommodating a future intention into a probable action-context in which it can be realised. We argue that, for intentions to be realised for a particular day, this action-context is provided primarily by the Daily Routine for that day. Revisions to such a routine, produced by extrinsic planning processes, result in the formation of a Day Plan. The implications of these proposals for current research on planning are considered.

Availability: Unavailable

Full reference: Ellis, J., Shallice, T. & Cooper, R. (1996): Memory for, and the organization of, future intentions. Unpublished.


Historical and/or Background Papers

A systematic methodology for cognitive modelling

Abstract: The development and testing of computational models of cognition is typically ad hoc: few generally agreed methodological principles guide the process. Consequently computational models often conflate empirically justified mechanisms with pragmatic implementation details, and essential theoretical aspects of theories are frequently hard to identify. We argue that attempts to construct cognitive theories would be considerably assisted by the availability of appropriate languages for specifying cognitive models. Such languages, we contend, should: 1) be syntactically clear and succinct; 2) be operationally well-defined; 3) be executable; and 4) explicitly support the division between theory and implementation detail. In support of our arguments we introduce Sceptic, an executable specification language which goes some way towards satisfying these requirements. Sceptic has been successfully used to implement a number of cognitive models including Soar, and details of the Sceptic specification of Soar are included in a technical appendix. The simplicity of Sceptic Soar permits the essentials of the underlying cognitive theory to be seen, and aids investigation of alternative theoretical assumptions. We demonstrate this by reporting three computational experiments involving modifications to the functioning of working memory within Soar. Although our focus is on Soar, the thrust of the work is more concerned with general methodological issues in cognitive modelling.

Availability: Unavailable

Full reference: Cooper, R., Fox, J., Farringdon, J., & Shallice, T. (1996): A systematic methodology for cognitive modelling. Artificial Intelligence, 85, 3-44.


Towards an Object-Oriented language for cognitive modeling

Abstract: This paper describes work towards an object-oriented language for cognitive modeling. Existing modeling languages (such as C, LISP and Prolog) tend to be far removed from the techniques employed by psychologists in developing their theories. In addition, they encourage the confusion of implementation detail necessary for computational completeness with theoretically motivated aspects. The language described here (OOS) has been designed so as to facilitate this theory/implementation separation, while at the same time simplifying the modeling process for computationally non-sophisticated users by providing a set of classes of basic "cognitive" objects. The object classes are tailored to the implementation of functionally modular cognitive models in the box/arrow style. The language is described (in terms of its execution model and its basic classes) before a sketch is given of a simple production system which has been implemented within the language. We conclude with a discussion of on-going work aimed at extending the coverage of the language and further simplifying the modeling process.

Availability: PDF.

Full reference: Cooper, R. (1995). Towards an Object-Oriented language for cognitive modeling. In Moore, J. D. & Lehman, J. F. (eds.) Proceedings of the 17th Annual Conference of the Cognitive Science Society, Pittsburgh, PA, USA. pp. 556-561.


Sceptic Version 4 User Manual

Availability: PDF.

Full reference: Cooper, R. & Farringdon, J. (1993): Sceptic Version 4 User Manual. Tech. Report UCL-PSY-ADREM-TR6, Department of Psychology, University College London.

Also available: Sceptic Version 4 source code


Sceptic User Manual: Version 3.0

Availability: PDF.

Full reference: Hajnal, S., Fox, J., & Krause, P. (1989): Sceptic User Manual: Version 3.0. Advanced Computation Laboratory, Imperial Cancer Research Fund, London, UK.


Other papers that refer to COGENT

Is there a place for semantic similarity in the analogical mapping process?

Abstract: Ramscar & Pain (1996) argued that the analogical process cannot be easily distinguished from the categorisation process at a cognitive level. In light of the absence of any distinction between analogy and categorisation, we have argued that analogy is supervenient upon an important part of the classification process, and that as such "analogical" models are capable of illuminating some categorisation tasks, for instance, the way in which structural systematicity can determine not only analogical judgements, but also category decisions. Our scepticism regarding the cognitive distinction between these two processes has implications for both analogy and categorisation research: in this paper we consider two leading analogical theories, Gentner's Structure Mapping Theory and Holyoak's Multi-Constraint Theory, and argue that results from our use of analogical modeling techniques in categorisation tasks offer some important insights into exactly which elements should be included in a theory of analogical mapping.

Availability: Unavailable.

Full reference: Ramscar, M., Pain, H., & Cooper, R. (1997): Is there a place for semantic similarity in the analogical mapping process? In Shafto, M. G. and Langley, P. (eds.) Proceedings of the 19th Annual Conference of the Cognitive Science Society. Palo Alto, CA, USA. pp. 632-637.


© The COGENT Group, 1999