Competence models, performance factors & computational experiments

Richard Cooper

Recently, COGENT has been used to investigate the effects of working memory decay on a simple production system. I shall summarise the results of this work before turning to a number of questions raised by the approach. In particular, I shall argue that the basic production system (without memory decay) should be viewed as a competence model, with memory decay being a putative performance factor. The question then arises as to what other performance factors (including those arising from neurological damage) might intervene to yield non-optimal behaviour in such a system. I will present some preliminary (and perhaps not surprising) results from an investigation of one such performance factor: imperfect rule firing.
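
The two performance factors mentioned above can be made concrete with a toy sketch. The following is purely illustrative (it is not COGENT code, and the class names, parameters, and example rules are invented for the illustration): a minimal production system in which working-memory elements decay probabilistically on each cycle, and rules fire imperfectly with some probability below one.

```python
import random

# Hypothetical illustration (not COGENT): a toy production system with
# two performance factors layered on top of the competence model:
#   - working-memory decay (decay_prob per element per cycle)
#   - imperfect rule firing (firing_prob per matched rule)

class WorkingMemory:
    def __init__(self, decay_prob=0.0):
        self.decay_prob = decay_prob   # per-cycle chance an element is lost
        self.elements = set()

    def add(self, element):
        self.elements.add(element)

    def decay(self, rng):
        # each element independently survives with probability 1 - decay_prob
        self.elements = {e for e in self.elements
                         if rng.random() >= self.decay_prob}

def run(rules, wm, cycles, firing_prob=1.0, seed=0):
    """Run the system; firing_prob < 1.0 models imperfect rule firing."""
    rng = random.Random(seed)
    for _ in range(cycles):
        for condition, action in rules:
            # a matched rule fires only with probability firing_prob
            if condition <= wm.elements and rng.random() < firing_prob:
                wm.add(action)
        wm.decay(rng)
    return wm.elements

# A two-step rule chain: with no decay and perfect firing, the goal is reached.
rules = [({"a"}, "b"), ({"b"}, "goal")]
wm = WorkingMemory(decay_prob=0.0)
wm.add("a")
print(sorted(run(rules, wm, cycles=3, firing_prob=1.0)))  # → ['a', 'b', 'goal']
```

With `decay_prob = 0` and `firing_prob = 1` the system behaves as the competence model; raising either parameter degrades performance, which is the sense in which decay and imperfect firing are performance factors rather than part of the competence model.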

In the second half of the talk I will turn to issues relating specifically to the use of COGENT in this work. Though COGENT made some things very easy, other things that should also have been easy were not: I could not use the memory decay parameter to control memory decay, because other constraints also affected the decay; and I could not run a series of blocks with different parameter values without changing those values by hand after every block. One conclusion (also noted by John Fox in his work) is that we need a scripting language which allows box parameters to be set between trials and/or blocks. I shall argue that this language should be grounded in standard experimental design, illustrating my argument with reference to a range of concepts that the scripting language should support.
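
The kind of scripting support argued for here can be sketched in outline. The sketch below is an assumption about what such a language might provide, not an actual COGENT facility: a fully crossed factorial design (factors and levels, in the style of standard experimental design) from which blocks are generated, with box parameters set automatically between blocks rather than by hand. The factor names and `run_block` placeholder are invented for the illustration.

```python
from itertools import product

# Hypothetical sketch (not a COGENT scripting language): generate one block
# per cell of a fully crossed factorial design, setting parameters between
# blocks automatically instead of by hand.

# Factors and levels, named as in standard experimental design.
design = {
    "decay_prob":  [0.0, 0.1, 0.2],   # working-memory decay (illustrative values)
    "firing_prob": [1.0, 0.9],        # imperfect rule firing
}

def run_block(params, trials):
    """Placeholder for running one block of trials with fixed box parameters."""
    return {"params": params, "trials": trials}

def run_experiment(design, trials_per_block=20):
    factors = sorted(design)
    results = []
    # one block per cell of the crossed design
    for levels in product(*(design[f] for f in factors)):
        params = dict(zip(factors, levels))
        results.append(run_block(params, trials_per_block))
    return results

results = run_experiment(design)
print(len(results))  # 3 decay levels x 2 firing levels = 6 blocks
```

Concepts a real scripting language would need beyond this sketch include randomisation of block order, within- versus between-subject factors, and per-trial (not just per-block) parameter changes.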

COGENT: Real science... Workshop 1 Cognitive models in IR