
Society of Mind

15.10 Losing Track

Whenever we solve complicated problems, we get into situations in which our agencies must keep account of many processes at once. In computer programs, the many subjobs often seem to pile up like the blocks of a tower. Indeed, computer programmers often use the word stack to describe such situations. But I doubt that untrained human minds use anything so methodical; in fact, we simply aren't very good at dealing with the kinds of situations that need such memory-stacks. This could be why we get confused when hearing sentences like this:

This is the malt that the rat that the cat that the dog worried killed ate.

The very same words can be rearranged to make an equivalent sentence anyone can understand:

This is the dog that worried the cat that killed the rat that ate the malt.

The first sentence is hard to understand because so many verb processes interrupt one another that when the end of the sentence comes, three similar processes are still active — but they have lost track of what roles should be assigned to all the remaining nouns, namely, the rat, cat, and malt. Why do visual processes so rarely encounter similar difficulties? One reason is that our visual-systems can support more simultaneously operating processes than our language-systems can, and this reduces the need for any process to interrupt another one. A second reason is that the vision-agencies can choose for themselves the sequence in which they attend to details, whereas language-agencies are controlled by the person who is speaking.
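To make the stack image concrete, here is a minimal sketch in Python (my illustration, not part of the original text; the clause boundaries are hand-marked rather than parsed). It counts how many interrupted clause-processes are pending at once in each sentence:

    def peak_depth(events):
        """Return the greatest number of clause frames pending at once."""
        depth = peak = 0
        for e in events:
            depth += 1 if e == "open" else -1
            peak = max(peak, depth)
        return peak

    # "This is the malt that the rat that the cat that the dog worried
    # killed ate."  All three relative clauses open before any verb
    # arrives, so three unfinished processes are stacked up at the end.
    nested = ["open", "open", "open", "close", "close", "close"]

    # "This is the dog that worried the cat that killed the rat that
    # ate the malt."  Each verb closes its clause before the next opens.
    chained = ["open", "close", "open", "close", "open", "close"]

    print(peak_depth(nested))   # 3 -- three processes pending at once
    print(peak_depth(chained))  # 1 -- never more than one pending

On this toy account, the first sentence demands a stack three frames deep, while the rearranged one never needs more than a single pending frame, which matches the felt difference in difficulty.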

It takes each person many years to learn to use those memory-systems well. Younger children certainly cannot keep track as well as adults. It's generally of little use to ask a pair of two-year-olds to play together or to take turns at using a toy. We consider them to be too self-centered and impatient for that. Surely much of their undisciplined impulsiveness comes from desires that are less regulated than our own. But that impatience could also stem from insecurity about memory: the child may fear that what it wants will slip from mind while other thoughts are entertained. In other words, the child who is asked to take turns might fear that by the time its turn arrives, it may not want the object anymore.

When people ask, "Could a machine ever be conscious?" I'm often tempted to ask back, "Could a person ever be conscious?" I mean this as a serious reply, because we seem so ill-equipped to understand ourselves. Long before we became concerned with understanding how we work, our evolution had already constrained the architecture of our brains. However, we can design our new machines as we wish, and provide them with better ways to keep and examine records of their own activities — and this means that machines are potentially capable of far more consciousness than we are. To be sure, simply providing machines with such information would not automatically enable them to use it to promote their own development, and until we can design more sensible machines, such knowledge might only help them find more ways to fail: the easier to change themselves, the easier to wreck themselves — until they learn to train themselves. Fortunately, we can leave this problem to the designers of the future, who surely would not build such things unless they found good reasons to.