What Does “Her” Want?

The answer is nothing. And that’s the problem.

The movie “Her” features an operating system named Samantha with a lovely voice and a near-human ability to talk and laugh and even love. We all know that no such system exists now, but given how fast technology is racing ahead these days, it’s easy to imagine that a Samantha-like system is just around the corner. It isn’t. And the reason is not technical, it’s conceptual. It has to do with a very old philosophical argument and a confused notion we have today about the relationship between ideas and wanting, reason and passion.

The argument has two parts. First I show that reason, understood as pure logic, cannot get us from ideas, from facts about the world, to wanting. Wants don’t follow logically from facts. Second, I contend that we know hardly anything today about wanting, about what it is, but the little we do know suggests that modern machines aren’t doing anything like it. They just have the wrong sort of architecture.

A computer is a logic machine. I set my laptop to work on a word search, typing the word “curmudgeon” into the search box. The software routine then marches through my documents, checking the words, one by one. Every time it finds “curmudgeon,” it places the file name on a growing list on my screen. The way we usually understand this, a search term is a datum, a fact. And from this fact, my laptop deduces a new fact, the list of files. The simple logic it uses is this: “If the file contains the word curmudgeon, then add its name to the list.” Given the input, the “if-then” logic dictates the output. Modern machines are excellent logicians.
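
(For readers who like to see the machinery, here is a toy sketch in Python of that if-then routine. The folder name and the helper’s name are just stand-ins I invented, and real search software is fancier, but the logic is the same.)

    from pathlib import Path

    def files_containing(term, folder):
        # March through the documents, checking each one
        matches = []
        for path in Path(folder).glob("*.txt"):
            text = path.read_text(errors="ignore")
            if term in text:                  # if the file contains the word...
                matches.append(path.name)     # ...then add its name to the list
        return matches

    print(files_containing("curmudgeon", "Documents"))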

But they don’t “want” anything. My laptop has no preferences, no motivations, no desires, no cares, no passions. It does not want to find files. It feels no inclination toward files with curmudgeon, no aversion to files without the word. It has no impulses or urges to act of any kind. It lacks what have been called “affective states.” Now there are affect simulators, machines that convincingly fake affect for our amusement or benefit. But I’m talking about genuine wanting, preferring, caring. There is not a machine in the world that can answer the question, “What would you like to do this afternoon?” and mean it.

A Deep Disconnect

The reason has to do with a deep disconnect between facts and logic on the one hand and wants on the other, a disconnect pointed out in the 18th century by David Hume. Facts are ideas and logic is a relationship among ideas. But wanting is neither an idea nor an idea-relationship. It is a physical state in a brain. And no physical state follows logically from any facts. From the fact that rain makes things wet when they are not sheltered, the fact that it is raining, and the further fact that I am outside unsheltered, another fact follows logically, the fact that I will get wet. But actual physical wetness does not follow logically from these facts. It takes actual water, not a chain of logic, to send me running for soup and dry clothes. Likewise, the brain state corresponding to caring does not follow from a chain of logic. It does not follow that I care if I get wet.

To make this vivid, imagine you are alone in a room with a perfectly logical being. The being is reasonably knowledgeable and can speak, but even before you begin to talk to it, a fire breaks out in a trash can. You leap from your chair and head for the fire extinguisher sensibly mounted on the wall nearby.

“What are you doing?” the perfectly logical being asks.

There is no real urgency yet. The fire is contained by the trash can. So as you speed-read the operating instructions on the extinguisher, you take a moment to answer the perfectly logical being. “I’m putting out the fire,” you say.

“Why?” the perfectly logical being asks evenly, no trace of concern in its voice.

“Because the room will fill with smoke, of course,” you answer, a little impatiently.

“So what?” the being intones.

“If the room fills with smoke, we won’t be able to breathe.” You are answering only half attentively now. The extinguisher instructions are in six languages, which is confusing.

“So what?” the being repeats.

“If we can’t breathe, we’ll die!” Now you’re upset. You’ve found instructions you can read, but it’s not clear what they are saying. There is a pin somewhere, and you need to pull it, but where is it? Who writes these things? The room is starting to fill with smoke. “I don’t know about you, but I want to stay alive!”

“Why?”

“Because! I just do!” you explode, now panicking and furious with the perfectly logical being. But then you have an idea. Handing the fire extinguisher to the perfectly logical being, you bark at it, “Here. Figure out how this works. Put the fire out. Do it now!” The being calmly complies. The fire is out, and shortly the smoke clears.

What happened here? The perfectly logical being had or acquired (from you) all of the knowledge it needed, including facts about fire and smoke and what it takes to survive. But despite being awash in facts, no want followed logically from them, not even a mild preference for survival. Again, the reason is that facts are ideas and wants are fundamentally different. A want is an urge, a drive. It can be about a fact. (You want to stop the fire.) But a want is not itself a fact. A want can drive behavior. (A desire to survive drives you to the extinguisher.) But it is not itself a behavior. A want can follow a fact in time, perhaps even be caused by the physical brain state that corresponds to a fact. (That brain state somehow causes in your brain a desire to stay alive.) But following in time is not the same as following logically. And causation is not logic. Thunder follows lightning as a matter of physics, not logic. And wanting to survive follows a fire because of the way your brain is structured, not from logic.

The bottom line is that no perfectly logical being cares about anything. A perfectly logical Mr. Spock of the old Star Trek series and Mr. Data of the later one are impossible beings. They would never get up in the morning, much less go about their business on a starship. And now I have to admit that your conversation with the perfectly logical being is also impossible. A perfectly logical being would not ask any questions. It would not even be curious.

So if logic won’t get us from facts to wants, is there another way that computers could want? They are logic machines but they are also physical devices. Might they, as physical devices, be programmed in some non-logical way to want? Maybe. But the architecture of modern machines seems all wrong. Computer action is a series of switches. Inside the chip, an input flips a switch, which flips the next switch, which flips the next, and so on, eventually producing an output. It’s like dominos lined up and standing on end next to each other in a long chain. Knock over the first and they all go down, one by one. Each event is caused by and causes another, in a rigid sequence chosen by the programmers. The output can seem faintly human. You call your bank, punch in your account number, and hear: “For account balance, press 1. For last three transactions, press 2. For what’s best in an uncertain world, press 3.” The machine speaks. But it doesn’t mean anything by it. It doesn’t intend. It doesn’t want. It’s just dominos falling.
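
(Another toy sketch, if it helps: a Python version of that phone menu. The scripted replies are invented, but notice that each keypress simply knocks over the branch the programmers lined up for it.)

    def phone_menu(keypress):
        # Each input flips exactly the switch the programmers wired to it.
        if keypress == "1":
            return "Your account balance is $42.17."
        elif keypress == "2":
            return "Here are your last three transactions."
        elif keypress == "3":
            return "For what's best in an uncertain world, please hold."
        return "Sorry, I didn't catch that."

    print(phone_menu("1"))   # the dominos fall, one by one; nothing here intends anything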

What is wanting? In psychology and neuroscience, the study of the strong passions, the emotions, has enjoyed a renaissance in recent decades. We’ve learned a great deal, especially from fMRIs, about the causes of emotions, which brain areas produce them, and so on. But we know much less about wanting, preferring, caring. We know something about what makes me prefer this car model to that one. But we don’t know what a preference is, what sort of phenomenon it is, not in the sense that we know what a waterfall or a star is.

With science offering little help, we turn to introspection and metaphor. Wanting is like a mountain lake draining to the ocean, a body of potential perched on high, seeking by any means available a route “downhill,” toward the thing I want. Wanting is a goal not yet reached, a need unsatisfied. Or wanting is a vacuum demanding to be filled. It is a seeking after fulfillment. It is a drive, often imprecisely specified, a drive not to do exactly X but something that achieves a result like X. Or maybe wanting is a kind of action, action not yet specified. Dominos falling is action too, but domino action is lawlike, specific. It is an actual occurrence. It is realized. Wanting is uncertain, general, yet-unrealized.

Whatever wanting is, the falling-dominos style of causation that modern machines deploy so powerfully seems a poor basis for generating it, for creating states that are by their nature poorly specified and general in their effects. My guess is that if we want to build a machine that wants, prefers, cares, we will need a new kind of machine architecture.

Reason and Decision Making

A common notion is that certain decisions are the outcome of a war between head and heart, between reason and passion. I passionately want to buy that whole cheesecake in the bakery window and to eat it all now. But, we say, reason tells me that it will make me sick. Reason wins, and I walk away. Or it doesn’t, and I don’t. But whatever I do, the forces at work seem clear, reason and passion.

As we’ve seen, however, that must be wrong. Reason in the sense of logic may tell me that if the cheesecake has 100 grams of fat and if 100 grams of fat makes one sick, then eating the whole cheesecake will make me sick. But this “if-then” cannot by itself oppose my desire to eat the whole cheesecake. Also needed is a want, a desire to not be sick, or to not gain weight, or to lead a long and healthy life. These wants are passions. They are not strong passions, of the sort that raise my heart rate, but they are still affective states. They are what Hume called “calm passions.” And logic by itself cannot produce them. Logic is a device for calculating, for moving from facts to other facts. It is a tool. When Hume famously wrote that “Reason is, and ought only to be the slave of the passions,” he meant that reason in the sense of logic is a tool of the passions in the sense of wants and preferences. Reason is a screwdriver. And a screwdriver cannot disagree with the user. There can be no opposition between reason and passion.
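
(One last toy sketch, assuming nothing beyond the cheesecake facts above: the little program derives the new fact without trouble, but deriving it produces no preference for avoiding sickness.)

    # Logic moves from facts to new facts...
    facts = {"the cheesecake has 100 grams of fat",
             "100 grams of fat makes one sick"}
    if ("the cheesecake has 100 grams of fat" in facts
            and "100 grams of fat makes one sick" in facts):
        facts.add("eating the whole cheesecake will make me sick")
    # ...but the new fact just sits in the set. Unless something in the
    # system already wants to avoid being sick, nothing opposes the urge
    # to eat the cheesecake.
    print(facts)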

There are other senses of “reason,” of course. There is reason as scenario building and judgment. I receive my outrageously high cable bill and consider the possibility of vandalizing the company office in retribution for their monopoly pricing. But then in my head I play out the consequences, and in many scenarios it ends badly. I see myself led away in handcuffs, appearing in court, suffering disapproving looks from friends and family. These imagined outcomes then somehow trigger a judgment, a negative affective reaction. I dislike these consequences! I sigh, and write my congressman instead.

All of this happens quietly. Nothing registers on my face. To an observer, I am just staring at the cable bill, the picture of a sober, thoughtful scientist. And indeed this is our standard vision of reasoned decision making. The reasoner pauses, meditates, considers, evaluates. But however coldly logical it all looks, calmly passionate judgments are being made. Now calling these judgments passionate is not to disparage the process. Indeed, this is the high standard to which we all rightly aspire. We do not want to be governed by the strongest passion of the moment. We want to make decisions in which scenarios are built and the results quietly evaluated by all of our passions, especially the calm ones.

So reason in the sense of scenario building and judgment clearly can oppose passion. But let us be clear: the calm passions buried within judgment are still passions. And the battle is always between passion and passion.

We Can’t, but Suppose We Could

Suppose we could build a machine with real wants. What would we do with it? One possibility is that we could send it to do our dirty work, dangerous jobs in awful environments. But that’s not nice. A particular profile of wants, preferences, and cares is what defines us as individuals, at least as much as our accumulated experiences and personality quirks. Indeed, it is a broad profile of wants, cares, and passions that defines our human nature. So a machine with a similar profile would be one of us, a kind of person. And people should not be treated that way.

Love is another possibility. But the movie “Her” convinced me that I would not want to fall in love with a real Samantha, a machine with real wants, cares, and passions. Love relationships with my own species are complicated enough. Maybe simple companionship would be enough. I imagine myself basking in the sun of a weekend morning with my wanting, caring machine. I am telling it about my hard week and listening sympathetically to its story of troubles. Eventually, as we wind down, recovering together from the trials of passionate existence, we fall silent. We are staring absently off into the distance, basking in each other’s presence. Some time passes. I don’t know how much. But at some point, and for no particular reason, I turn to my wanting, caring machine, and ask: “What would you like to do this afternoon?”
