Why do so many people think emotion is harder to explain than intellect? They're always saying things like this:
"I understand, in principle, how a computer might solve problems by reasoning. But I can't imagine how a computer could have emotions, or comprehend them. That doesn't seem at all the sort of thing machines could ever do."
We often think of anger as nonrational. But in our Challenger scenario, the way that Work employs Anger to subdue Sleep seems no less rational than using a stick to reach for something beyond one's grasp. Anger is merely an implement that Work can use to solve one of its problems. The only complication is that Work cannot arouse Anger directly; however, it discovers a way to do this indirectly, by turning on the fantasy of Professor Challenger. No matter that this leads to states of mind that people call emotional. To Work it's merely one more way to do what it's assigned to do. We're always using images and fantasies in ordinary thought. We use imagination to solve a geometry problem, plan a walk to some familiar place, or choose what to eat for dinner: in each, we must envision things that aren't actually there. The use of fantasies, emotional or not, is indispensable for every complicated problem-solving process. We always have to deal with nonexistent scenes, because only when a mind can change the ways things appear to be can it really start to think of how to change the ways things are.
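The indirect arousal described above can be caricatured in a few lines of code. This is only a toy sketch, not anything from the text: the agent names follow the scenario, but the mechanism (a flag-flipping "fantasy" routine standing in for the Challenger image) is an invented illustration.

```python
# Toy sketch of the Work -> fantasy -> Anger -> Sleep chain.
# All structure here is illustrative, not a model from the text.

class Agent:
    """A minimal agency: just a name and an on/off activation state."""
    def __init__(self, name):
        self.name = name
        self.active = False

class Mind:
    def __init__(self):
        self.work = Agent("Work")
        self.sleep = Agent("Sleep")
        self.anger = Agent("Anger")

    def imagine_challenger(self):
        # The fantasy is the indirect route: Work cannot switch Anger
        # on directly, but entertaining the insulting image can.
        self.anger.active = True

    def stay_awake(self):
        self.sleep.active = True       # Sleep threatens to take over.
        self.imagine_challenger()      # Work turns on the fantasy...
        if self.anger.active:          # ...and aroused Anger subdues Sleep.
            self.sleep.active = False
        return self.sleep.active

mind = Mind()
print(mind.stay_awake())  # False: Sleep has been subdued
```

The point of the sketch is only that nothing in the chain is mysterious: each agent is a dumb switch, and the "emotional" episode is just one switch being thrown by way of another.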
In any case, our culture wrongly teaches us that thoughts and feelings lie in almost separate worlds. In fact, they're always intertwined. In the next few sections we'll propose to regard emotions not as separate from thoughts in general, but as varieties or types of thoughts, each based on a different brain-machine that specializes in some particular domain of thought. In infancy, these protospecialists have little to do with one another, but later they grow together as they learn to exploit one another, albeit without understanding one another, the way Work exploits Anger to stop Sleep.
Another reason we consider emotion to be more mysterious and powerful than reason is that we wrongly credit it with many things that reason does. We're all so insensitive to the complexity of ordinary thinking that we take the marvels of our common sense for granted. Then, whenever anyone does something outstanding, instead of trying to understand the process of thought that did the real work, we attribute that virtue to whichever superficial emotional signs we can easily discern, like motivation, passion, inspiration, or sensibility.
In any case, no matter how neutral and rational a goal may seem, it will eventually conflict with other goals if it persists for long enough. No long-term project can be carried out without some defense against competing interests, and this is likely to produce what we call emotional reactions to the conflicts that come about among our most insistent goals. The question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without any emotions. I suspect that once we give machines the ability to alter their own abilities, we'll have to provide them with all sorts of complex checks and balances. It is probably no accident that the term "machinelike" has come to have two opposite connotations. One means completely unconcerned, unfeeling, and emotionless, devoid of any interest. The other means being implacably committed to some single cause. Thus each suggests not only inhumanity, but also some stupidity. Too much commitment leads to doing only one thing; too little concern produces aimless wandering.