I'm not sure we're living in an age of disruption, or just an age that badly wants to think itself disruptive, but either way there's been a lot of rethinking going on over the past decade or so. The biggest upheavals have come in industries in which managers have always made decisions more or less by gut instinct: political campaigns, health care, military campaigns, professional sports. The obvious cause of the turmoil is the availability of ever-cheaper computing power: People looking for an edge in any business can now gather and analyze all sorts of previously unobtainable or unanalyzable data. The less obvious cause is an idea: that the data might trump the expertise of managers. People (even experts) and industries (even old ones) can make big, systematic mistakes. You don't set out to find better ways to value professional baseball players if you believe that the market already knows everything there is to know about their value.

There's now a fairly long list of intellectuals responsible for the spread of this subversive idea. Somewhere near the top of it is the economist Richard Thaler, who has just published an odd and interesting professional memoir, "Misbehaving." It's odd because it's funnier and more personal than books by professors tend to be. It's interesting because it tells the story not just of Thaler's career but also of the field of behavioral economics -- the study of actual human beings rather than the rational optimizers of classical economic theory. 

For a surprisingly long time behavioral economics wasn't much more than a bunch of weird observations made by Richard Thaler, more or less to himself. What he calls his "first heretical thoughts" occurred in graduate school, while writing his thesis. He'd set out to determine how to value a human life -- so that, say, the government might decide how much to spend on some life-saving highway improvement. It sounds like a question without a clear answer but, as Thaler points out, people answer it clearly, if implicitly, every day, when they accept money for a greater chance of dying on the job. "Suppose I could get data on the death rates of various occupations, including dangerous ones like mining, logging and skyscraper window washing, and safer ones like farming, shop keeping and low rise window washing," recalls Thaler. "The risky jobs should pay more than the less risky ones: otherwise why would anyone do them?" Using wage data and an actuarial table of mortality rates in those jobs, he was able to work out what people needed to be paid to risk their lives. (The current implied value of an American life is $7 million.) Only he didn't stop there. He got distracted by a funny idea.
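The back-of-the-envelope logic behind that calculation can be sketched in a few lines. The wages and risk figures below are invented for illustration -- they are not Thaler's data -- but the arithmetic is the same: divide the wage premium a risky job pays by the extra death risk it carries.

```python
# Illustrative sketch of the implied "value of a statistical life":
# if risky jobs pay a wage premium in proportion to their extra death risk,
# the ratio of premium to risk is what the market implicitly pays per life.
# All numbers are made up for illustration; they are not Thaler's data.

safe_wage = 40_000                 # annual wage, low-risk job (hypothetical)
risky_wage = 47_000                # annual wage, comparable high-risk job (hypothetical)
extra_deaths_per_year = 1 / 1_000  # added annual fatality risk in the risky job

wage_premium = risky_wage - safe_wage                      # $7,000 per year
implied_value_of_life = wage_premium / extra_deaths_per_year

print(f"Implied value of a statistical life: ${implied_value_of_life:,.0f}")
# With these made-up numbers: $7,000,000 -- the same order as the figure cited above.
```

The point is that nobody ever states a price for a life; the price falls out of thousands of individual wage bargains.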

This willingness to allow oneself to be distracted from one's assigned task would later turn out to be a chief characteristic of behavioral economists, along with a bunch of other traits not normally found in economists, though often found in children: a sense of wonder, a tendency to ask embarrassing questions, and a mistrust of grown-ups' ideas about what's worth spending time thinking about and what is not. They're the sort of people whose day is made when they discover that health club members are most likely to hit the gym the day after they have received their monthly bill, or that racetrack gamblers are a lot more likely to bet on the longshot in the last race of the day than in the first.

At any rate, in addition to calculating the market's price for a human life, Thaler got distracted by how much fun he might have if he asked actual human beings how much they needed to be paid to run the risk of dying. He began with his own students, telling them to imagine that by attending his lecture, they had exposed themselves to a rare fatal disease. There was a 1 in 1,000 chance they had caught it. There was a single dose of the antidote: How much would they be willing to pay for it?

Then he asked them the same question, in a different way: How much would they demand to be paid to attend a lecture in which there was a 1 in 1,000 chance of contracting a rare fatal disease, for which there was no antidote?

The questions were practically identical, but the answers people gave to them were -- and are -- wildly different. People would say they would pay two grand for the antidote, for instance, but would need to be paid half a million dollars to expose themselves to the virus. "Economic theory is not alone in saying that the answers should be identical," writes Thaler. "Logical consistency demands it. … To an economist, these findings are somewhere between puzzling and preposterous. I showed them to [his thesis adviser] and he told me to stop wasting my time and get back to work on my thesis."
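Why theory says the two answers should match can itself be put in a few lines of arithmetic. Taking the $7 million value of a life cited earlier as a benchmark (the "rational price" below is a derived illustration, not a figure from the book), a 1-in-1,000 risk should be worth the same amount whichever way the question is framed:

```python
# A rational agent values a 1-in-1,000 death risk identically whether
# framed as buying the antidote (willingness to pay) or as being paid
# to take the risk (willingness to accept).
risk = 1 / 1_000
value_of_life = 7_000_000        # the $7 million figure cited earlier

rational_price = risk * value_of_life   # same number under either framing
print(rational_price)                   # 7000.0

# What Thaler's students actually reported (figures from the anecdote above):
reported_wtp = 2_000     # would pay for the antidote
reported_wta = 500_000   # would demand to be exposed
print(reported_wta / reported_wtp)      # 250.0 -- a 250-fold gap, not parity
```

The 250-fold gap between the two reported answers, against a theoretical ratio of exactly 1, is what made the finding "somewhere between puzzling and preposterous" to economists.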

Instead, Thaler began to keep a list of things that people did that made a mockery of economic models of rational choice. There was the guy who planned to go to the football game, changed his mind when he saw it was snowing, and then, when he realized he had already bought the ticket, changed his mind again. There was the other guy who refused to pay $10 to have someone mow his lawn but wouldn't accept $20 to mow his neighbor's. There was the woman who drove 10 minutes to a store in order to save $10 on a $45 clock radio but wouldn't drive the same amount of time to save $10 on a $495 television. There were the people Thaler invited over to dinner, to whom he offered, before dinner, a giant bowl of nuts. They ate so many nuts they had no appetite for the far more appealing meal. The next time they came to dinner Thaler didn't offer nuts -- and his guests were happier.

And so on. People who read Thaler's list might well just shrug and say, "There isn't anything here that any good used-car salesman doesn't know." That's the point: It's obvious to anyone who pays any attention at all to himself or his fellow human beings that we are not maximizers, or optimizers, or logical, or even all that sensible. In the early 1970s, when Thaler was a student, his professors didn't argue that human beings were perfectly rational. They argued that human irrationality didn't matter, for the purposes of economic theory, because it wasn't systematic. It could be treated as self-cancelling noise.

Enter Amos Tversky and Daniel Kahneman, psychologists at the Hebrew University in Jerusalem. Together, in the late 1960s, they had set out to confirm their suspicion that the weird self-defeating stuff that people do isn't random and inexplicable but fundamental to human nature. More to the point, human beings were not just occasionally irrational, but systematically irrational. They had predictable biases -- for instance, they were inclined to draw radical conclusions from tiny amounts of information. Their preferences were unstable. When faced with a choice between two things, they responded not to the things themselves but to descriptions of those things. Perhaps most significantly, people responded very differently when a choice was framed as a loss than when it was framed as a gain. Tell a person that he has a 95 percent chance of surviving some medical procedure and he is far more likely to submit to it than if you tell him he has a 5 percent chance of dying.