
This was a formative period in the lives of Daniel Kahneman and Amos Tversky: the birth, in a sense, of these two psychology warriors.

Daniel knew that his relationship with Amos would never be understood by anyone else. They had taught a seminar together at Hebrew University, and Daniel thought it was a disaster: the warmth Amos had in private simply vanished in front of a group. "We were each doing our own thing, totally out of sync," Daniel said. They interrupted each other; they argued. Nobody ever saw them actually *working* together, and nobody really understood what their relationship was. It was something like a romantic relationship with the romance removed. They were simply closer to each other than they were to anyone else. Their wives noticed it first. "They're closer than a married couple," Barbara Tversky said. She thought each was drawn to the other's intelligence: "It was like a magnetic thing." Daniel sensed that his own wife was bothered by it all, while Amos would tell Barbara how understanding and gracious she was being. "When I was with Amos, I felt something I'd never felt with anyone else," Daniel said. "You can fall in love with a person, or an object, but I was fascinated by him. That was the relationship. It was pretty incredible, really."

Yet it was Amos who worked hardest to keep that closeness alive. Daniel, by his own account, was the one always pulling back, keeping his distance, afraid of how he would cope if he ever lost Amos.

Then everything changed.

Egypt and Syria attacked Israel at four in the morning, California time, on Yom Kippur, the holiest day in Judaism. A small Israeli force of about five hundred men was overrun by the Egyptian army at the Suez Canal. On the Golan Heights, a hundred and seventy-seven Israeli tanks were attacked by some two thousand Syrian tanks. Amos and Daniel, still in the United States trying to become experts in decision analysis, heard the news and went straight to the airport to catch the first plane to Paris, where Daniel's sister worked at the Israeli consulate. Getting into Israel in wartime was difficult: the Israeli planes were packed with pilots and commanders replacing those killed in the first attacks. That was simply how things were then; any Israeli who could fight went to fight. Egypt's president, Anwar Sadat, knew it, and announced that he would shoot down any commercial plane that tried to land in Israel. Daniel and Amos waited in Paris while Daniel's sister pulled strings to get them on a flight. In the meantime they bought canvas boots, lighter than the leather ones the Israeli army issued.

Barbara Tversky was taking her oldest son to an emergency clinic in Jerusalem. He and his brother had been competing to see who could balance a cucumber on his nose, and he had won. On the drive home, people surrounded her car, shouting at her to get off the road. The country was in a panic. Fighter jets roared low over Jerusalem, a signal for all reservists to report for duty. Hebrew University shut down once again. The quiet around Amos's house gave way to army trucks rumbling through the night. The city went dark: streetlights were switched off, and people taped over the brake lights of their cars. The stars had never seemed brighter, or the situation more worrying. Barbara realized that the Israeli government was hiding something. This war felt different; Israel itself might not survive. She didn't know where Amos was or what he would do, and there was nothing she could do about it. International calls were expensive, so they had been writing letters. Plenty of people were in the same position: Israelis abroad coming home to fight, only to learn that someone in their family had been killed.

Barbara went to the library to research stress and how people coped with it, and turned what she found into a news article. A few days later, around ten at night, with the children asleep and the curtains drawn so no light could escape, she was working in her study when she heard footsteps running up the stairs, and then Amos was simply there, in the dark. He and Daniel had come back on a plane reserved for soldiers; it had landed in Tel Aviv in total darkness, without even lights on the wings. Amos went up to the attic for his uniform. He was a captain now. It still fit. He left at five in the morning.

Amos and Daniel were both sent to the psychology unit, which had grown considerably since Daniel redesigned its talent-selection system in the 1950s. A psychologist named James Lester, studying Israeli military psychology for the U.S. Navy, had written a report describing the unit. Lester was puzzled that the country with the strictest driver's test in the world also had among the highest accident rates, and he was struck by how much trust the Israeli army placed in its psychologists. The failure rate in officer training ran 15 to 20 percent, he wrote, and the army believed so firmly in psychological research that it asked the selection department to identify, in the very first week of training, the candidates who were unlikely to make it.

According to Lester, the head of the army's psychology unit, a man named Benny Shalit, was unusually intense. Shalit had pushed to raise the unit's standing in the army, and he largely got his way. He asked to sew an insignia of his own design onto his uniform: an olive branch and a sword, topped by an eye "symbolizing evaluation, vision, or something like that," as Lester put it. Shalit wanted to turn the psychology unit into a combat unit, and he proposed schemes that even his own psychologists found absurd, such as hypnotizing Arabs and sending them to assassinate their leaders. Daniella Gordon, who served in the unit, remembered that he actually hypnotized one Arab man: "They took him to the Jordanian border, and he just ran away."

A rumor circulated that Shalit held the personality-test results of every senior figure in the Israeli army, taken when they had first enlisted, and that he had let it be known he wouldn't hesitate to release them. Whatever the reason, Benny Shalit was remarkably good at getting his way in the Israeli army. One request that was actually granted: placing psychologists inside combat units, where they could advise commanders directly. Lester told his superiors in the U.S. Navy that field psychologists could advise on all sorts of things. One noticed that infantrymen stopping for drinks in hot weather were opening the bottles with ammunition cartridges, damaging the cartridges; he suggested issuing bottle openers. Shalit's psychologists also recommended removing superfluous scopes from machine guns and changing the way machine-gun squads worked together to improve accuracy. The psychologists of the Israeli army, in short, enjoyed unusual freedom. The U.S. Navy researchers concluded that military psychology was thriving in Israel, and that it would be interesting to see whether the Israeli psyche would turn into a military psyche.

What Benny Shalit's field psychologists would actually *do* in a real war, nobody knew. Eli Fischhoff, Shalit's second-in-command, said the psychology unit had no idea; the war had come too suddenly. "All we realized was that we might not make it." Within a few days, Israel's army had lost more men, as a share of its population, than the U.S. army lost in the entire Vietnam War. The Israeli government called it a "demographic disaster" because so many of Israel's elite were among the dead. Someone in the psychology unit suggested designing a questionnaire to find out what might be done to improve morale. Amos seized on the idea, helped design the questions, and then used the project more or less as an excuse to get closer to the front. Daniel recalled that they drove around the Sinai in a jeep, trying to do whatever they could for the country.

Their colleagues thought they were out of their minds: Amos and Daniel, careening around the battlefield in a jeep with guns. Yaffa Singer remembered Amos being as excited as a child, but the Sinai was dangerous; sending the pair out with questionnaires seemed like a death wish. It wasn't just enemy tanks and planes; there were landmines everywhere. "They went alone, with no guards. They had to protect themselves," said Daniella Gordon, their commanding officer. People worried more about Daniel. Eli Fischhoff, who headed the field psychology unit, said they weren't especially worried about Amos, because he was a fighter, but they were genuinely concerned about Daniel.

Yet bouncing around the Sinai, it was Daniel who proved the more effective. Fischhoff remembered him jumping out of the jeep to question soldiers. Amos was the more practical of the two, but Daniel could notice what others missed and come up with solutions. On the way to the front, Daniel spotted piles of garbage along the roadside: half-eaten tins of food, all supplied by the U.S. military. He examined what the soldiers were eating and what they were throwing away, and discovered that what they all liked was the canned peaches. He wrote a piece arguing that the Israeli army should analyze its garbage to make sure it was giving the soldiers food they actually wanted. It made the front page.

Israeli tank crews were then being killed at record rates. Daniel visited a tank training camp, where new recruits were being trained as fast as possible to replace the dead at the front. The recruits worked in groups of four, rotating through the driver's seat every two hours. Daniel pointed out that people learn a skill faster when the repetitions come closer together: if the recruits rotated every half hour instead, they would learn to drive the tanks sooner. He carried the same kind of thinking to the Israeli Air Force. Fighter pilots were also dying in unusual numbers, because Egypt had new surface-to-air missiles from the Soviet Union. One squadron had suffered especially heavy losses, and the Air Force generals planned to investigate it and, if warranted, punish it. Daniel remembered one general saying of a pilot that his plane "was hit by *four* missiles, not one!", as if that proved the pilot's incompetence.

Daniel told the general that he was making a mistake rooted in a misunderstanding of sample size. The squadron that looked incompetent had probably just been unlucky. If the general investigated it, he would certainly find patterns that seemed to explain the losses: maybe the pilots went home too often, maybe they all wore fancy underwear. But whatever he found would be meaningless, because the squadron was far too small a sample for any pattern to be statistically significant. Worse, an investigation aimed at assigning blame would devastate morale; its only real purpose would be to display the general's power. The general listened to Daniel and called off the investigation. Daniel said he considered it the only thing he did that actually helped the war effort.
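Daniel's argument, that a small squadron's loss count is dominated by chance, is easy to demonstrate with a short simulation. The per-sortie loss probability, number of sorties, and number of squadrons below are invented purely for illustration:

```python
import random

random.seed(7)   # fixed seed so the sketch is reproducible

P_LOSS = 0.05    # hypothetical per-sortie chance of being shot down
SORTIES = 40     # hypothetical sorties flown by each squadron
SQUADRONS = 12   # twelve squadrons of *identical* skill

# Every squadron faces exactly the same odds; only luck differs.
losses = [sum(random.random() < P_LOSS for _ in range(SORTIES))
          for _ in range(SQUADRONS)]

print("losses per squadron:", losses)
print("worst squadron: %d planes lost; best: %d" % (max(losses), min(losses)))
```

Run this with different seeds and some squadron almost always looks far "worse" than the rest, even though all twelve are statistically identical; investigating the worst one could only turn up meaningless patterns.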

Daniel came to see that what he was doing, handing questionnaires to soldiers just back from the battlefield, was useless. Many of the soldiers were traumatized. He said they had wanted to help the men who were frightened, and to assess their trauma; every soldier was terrified by the war, but some were simply unable to function. The traumatized Israeli soldiers reminded him of people with depression. Daniel knew there were things he could not fix, and this was one of them.

He didn't want to be in the Sinai, not the way Amos did. He remembered feeling that he was simply wasting his time. When the jeep bounced him off the back seat one time too many, he ended the trip and left Amos to hand out the questionnaires.

Later, the Walter Reed Army Institute conducted a study of trauma in the 1973 Arab-Israeli War. The psychologists who ran it noted how intense the fighting had been, around the clock at least at the beginning, and how heavy the losses were. They also found that, for the first time, Israeli soldiers were being diagnosed with trauma. The questionnaire Amos had helped write asked fairly simple questions: Where were you? What did you do? What did you see? Did you win? Why didn't you win? Yaffa Singer remembered that men began to talk about their fears, about personal feelings. From the War of Independence until 1973, she said, that simply wasn't allowed: they were all supposed to be superheroes, and no one was permitted to admit he was afraid. If he did, he might get killed.

After the war, Amos, Singer, and two others who had done field psychology spent several days going through the questionnaires, which asked, among other things, about the soldiers' motives for fighting. Singer said that what people had been holding back was shocking, though in hindsight what the soldiers told the psychologists was plain human emotion. They had wanted to know why men fought for Israel, she said; they had assumed it was love of country. But the responses made it clear that the soldiers fought for their friends and their families, not for the nation, not for Zionism. At the time, that was a startling discovery. After watching friends blown apart by shells, after seeing their best friends die in the street, these Israeli soldiers were finally saying what they felt. It was heartbreaking to read, Singer said.

As the war was ending, Amos made a decision that everyone else thought was foolish. Barbara remembered that he went to the Suez Canal to see the end of the war, even though he knew fighting would continue after the ceasefire. His attitude toward personal safety was something his wife never understood. Once, he announced that he wanted to jump out of a plane, simply because it would be fun; Barbara told him not to forget he was a father, and he dropped the idea. Amos wasn't exactly a thrill seeker, but he had an intense, childlike passion that sometimes drew him toward places no one else wanted to go.

Eventually he crossed the Sinai and reached the Suez Canal. Rumor had it that the Israeli army meant to push straight on to Cairo, and that the Soviets would send nuclear weapons to Egypt to stop the Israeli advance. When Amos reached Suez, he found that the fighting wasn't winding down but intensifying. The Arabs and the Israelis had an old tradition of killing as many of the other side as possible before the ceasefire formally took hold. Walking near the canal, trying to stay clear of the shelling, Amos jumped into a trench and landed on top of an Israeli soldier:

"Are you a bomb?" the soldier asked, terrified.

"No, I'm Amos," he said.

"So I'm not dead?" the soldier asked.

"You're not dead," he said.

That, at least, was Amos's version of the story. He hardly ever talked about the war.

In 1973 or perhaps early 1974, Daniel gave a talk, one he would repeat many times afterward, titled "Cognitive Limitations and Judgement in Public Policy." He began by observing how worrying it was that an organism whose emotional and hormonal makeup was not much different from a jungle rat's now had the power to destroy every living thing by pushing a few buttons. He and Amos had just finished their research on human judgment, and his worry had grown larger. Throughout history, he noted, how many momentous decisions had been made by a handful of men in power? Decision makers who had never examined their own thinking, and never learned to control their biases, "are likely to see the fate of entire societies changed by preventable errors on the part of their leaders."

Before the war, Daniel and Amos had both hoped to take what they had learned about human decision making and apply it to decisions with high stakes. In the new field called decision analysis, a high-stakes decision could be treated as a kind of engineering problem: you designed the decision system. Decision analysts would sit down with leaders in business, the military, and government, help them analyze every decision, calculate the probabilities of the various scenarios, and weigh the potential consequences. Suppose you want to try to control a hurricane: there may be a 50 percent chance of reducing its wind speed, but also a 5 percent chance of misleading people who ought to be evacuating. What do you do? In negotiations, decision analysts would remind the people making the big calls not to be fooled by their gut feelings. Amos wrote in his lecture notes that the culture was moving toward guidance by numerical formulas, and that this would give the study of uncertainty its place. Both men believed that the people most affected by high-stakes decisions, voters and shareholders among them, would come to understand the substance of decision making better: they would learn to judge a decision by its process rather than its result. Decision makers wouldn't need to be right every time; they would only need to work out what might follow from each choice and then deal with it responsibly. As Daniel put it in a talk in Israel, what was really needed was "a cultural shift in attitudes toward uncertainty, and boldness."

But how decision analysts would actually persuade leaders in business, the military, or politics to take their advice remained unclear. How do you persuade a powerful decision maker to put numbers on his gut feelings? Important people don't want other people digging around in their intuitions, and they don't much want to examine those intuitions themselves. That was the challenge.

Daniel later looked back on the moment he and Amos lost their faith in decision analysis. Israeli intelligence's failure to anticipate the Yom Kippur attack had set off an earthquake inside the Israeli government, and everyone was asking what had gone wrong. Israel had won the war, but it felt like a defeat; the Egyptians, who had suffered greater losses, celebrated in the streets as if they were the victors. Everyone in Israel was trying to figure out where the system had failed. Before the war, despite plenty of evidence that Egypt was planning an attack, Israeli intelligence had insisted the Egyptians would not strike so long as Israel kept its air superiority. The Egyptians struck anyway. After the war, the Israeli Foreign Ministry decided to create its own intelligence unit, headed by a man named Zvi Lanir, and hired Daniel to help. Together they launched a carefully designed exercise in decision analysis. The basic idea was to bring a new standard of rigor to questions of national security. Daniel said their first move was to do away with the customary intelligence report: intelligence reports were written as essays, and the trouble with an essay was that it left the reader with no sense of the odds. Instead, Daniel wanted to give Israel's leaders the various possibilities expressed as numbers.

In 1974, U.S. Secretary of State Henry Kissinger was brokering peace talks between Israel and Egypt and between Israel and Syria. To move the talks along, Kissinger sent the Israeli government the CIA's assessment, which warned of grave consequences if the negotiations failed. Daniel and Lanir set out to give the Israeli foreign minister, Yigal Allon, precise numerical estimates of how likely certain bad outcomes were. They drew up a list of possible "grave consequences": the collapse of the government in Jordan, American recognition of the PLO, another full-scale war between Israel and Syria, and so on. Then they surveyed experts and seasoned observers to establish the probability of each event. The experts were in striking agreement. When Daniel asked them how much the failure of Kissinger's peace mission would raise the chance of war between Syria and Israel, the typical answer was, "It will raise the chances by 10 percent."

Daniel and Lanir delivered their assessment, which they called "The Nation's Gamble," to the Israeli Foreign Ministry. Foreign Minister Allon looked at the numbers and said, "A 10 percent increase? That's not much."

Daniel was stunned. A 10 percent increase in the chance of full-scale war between Israel and Syria, as the price of Kissinger's failed diplomacy, struck him as enormous. If that number didn't move Allon, what number would? Ten percent was the most accurate probability they could produce, yet the foreign minister plainly would not accept it; he preferred to trust his gut. Daniel said that was the moment he gave up on decision analysis: no one was going to make a decision because of a number; people needed a story they could grasp whole. Decades later, when the CIA asked them to describe their experience with decision analysis, Daniel and Lanir wrote that the Israeli Foreign Ministry had been indifferent to the specific probabilities. If the people holding the gamble neither believed in probability analysis nor cared about their odds of winning, what was the point of putting those probabilities on the table? Daniel wondered whether people simply didn't know enough about numbers to believe that numbers could capture the problem; probabilities, it seemed to everyone, were nothing, figments that existed only in some people's heads.

In Daniel and Amos's lives, it was hard to separate their passion for ideas from their passion for each other. Looking back, their collaboration before and after the Yom Kippur War looks less like an exchange of ideas than like two people in love finding excuses to be together. They decided they were finished studying how people err when making intuitive judgments under uncertainty. As for decision analysis, they had held high hopes for it, but they now found it useless. They wanted to write a book about the ways the human mind deals with uncertainty, but they never got past the outline stage; the few chapters they started, they never finished. After the Yom Kippur War, Israelis no longer trusted the judgment of government officials, and Daniel and Amos concluded that what was really needed was to reform education, so as to teach the next generation of leaders how to think scientifically. In the book they never finished, they wrote that they had tried to teach people to watch for the mistakes reasoning is prone to, and had tried to bring those ideas to politicians and the military, and it hadn't worked.

Adults deceive themselves easily, but children might be different. Daniel created a course on judgment for elementary school students, Amos created a similar one for high schoolers, and they planned to publish the curriculum together. They wrote that they found the experience genuinely inspiring. What might happen if Israeli children could be taught to think scientifically, to recognize their mistaken intuitions and correct them? Perhaps, once those children grew up, they would see the wisdom of inviting Henry Kissinger back to broker peace between Israel and Syria. But they didn't stay with the project. They were always more easily drawn to each other's ideas than to any attempt to capture the public's attention.

Instead, Amos invited Daniel into a puzzle at the center of his own corner of psychology: How do people make decisions? Daniel remembered the day Amos told him they were finished studying judgment; now they would study decision making.

The line between judgment and decision making, like the line between judgment and prediction, is blurry. But to Amos and the other mathematical psychologists, they were entirely different fields. When people make a judgment, they estimate odds. How likely is that kid to become a great NBA player? How risky is that subprime mortgage? Is that shadow on the X-ray a tumor? A judgment doesn't have to be followed by a decision, but a decision always requires a judgment first. The field of decision making studies what people do after they have formed a judgment: once they know the odds, or think they know them, or have estimated them somehow. Do you draft the player? Do you buy the bond? Surgery or chemotherapy? The field seeks to understand how people act when faced with risky options.

Students of decision making largely set the real world aside, concentrating on stylized laboratory gambles with explicit probabilities. The made-up scenarios of decision research were the fruit flies of the field: stand-ins for real situations that could not be isolated in real life. To get Daniel started, since he knew nothing of the area, Amos gave him a textbook on mathematical psychology that he had co-written with his teacher Clyde Coombs and a fellow student, Robyn Dawes, the same Robyn Dawes who at the Oregon Research Institute had read the Tom W. personality sketch and confidently answered "computer scientist," only to get it wrong. Amos told Daniel to read the long chapter on individual decision making.

The book explained that decision theory had its origins in the early eighteenth century, when French noblemen who loved dice games asked the court mathematicians to work out how best to play. The expected value of a gamble is the sum of its possible outcomes, each weighted by the probability of its occurring. If someone offers you a coin flip on which heads wins you a hundred dollars and tails loses you fifty, the expected value of the gamble is $100 x 0.5 + (-$50) x 0.5, or twenty-five dollars. Perhaps your rule is to accept any gamble with a positive expected value; then you take this bet. But anyone can see that people don't simply maximize expected value when they bet. They accept gambles whose expected value is negative; otherwise casinos wouldn't exist. And when people buy insurance, they pay premiums that exceed their expected losses; otherwise insurance companies would have no business. Any theory that pretends to explain how a rational person should take risks has to accommodate such ordinary behavior as buying insurance. And in many cases, people plainly are not maximizing expected value.
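The expected-value rule from the dice-game mathematicians is simple enough to state as a one-line function; a minimal sketch, using the coin-flip gamble from the text:

```python
def expected_value(outcomes):
    """Sum of payoff * probability over a gamble's outcomes."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Heads: win $100; tails: lose $50 -- each with probability 0.5.
coin_flip = [(100, 0.5), (-50, 0.5)]
print(expected_value(coin_flip))  # -> 25.0
```

A strict expected-value maximizer takes any gamble for which this number is positive; the puzzle the chapter turns to next is why real people routinely refuse such bets.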

Amos's textbook explained that the main idea of decision theory was first presented by a Swiss mathematician, Daniel Bernoulli, in the 1730s. Bernoulli set out to go beyond the mere calculation of expected value and describe what people actually do. "Suppose there is a poor man who happens to obtain a lottery ticket that will yield either nothing or twenty thousand ducats with equal probability," he wrote. "Will this man evaluate his chance of winning as equivalent to the possession of ten thousand ducats? Or, would he be willing to sell this opportunity for nine thousand ducats?" The answer is that the poor man sells. To explain why he would rather have a sure nine thousand ducats than a chance at twenty thousand, Bernoulli introduced a hidden assumption: people do not seek to maximize value, he argued; they seek to maximize "utility."

What does "utility" mean? (The word, graceless as it is, here means roughly "the value a person assigns to money.") It depends, of course, on how much money the person has to begin with. To a poor man holding a lottery ticket whose expected value is ten thousand ducats, a certain nine thousand ducats has the greater utility.

Merely saying that people choose whatever they want most isn't enough to predict behavior. What made the later "expected utility theory" useful was that it built in a piece of human nature: what Bernoulli called risk aversion. The textbook defined it this way: "The more wealth a person already has, the less value that person places on each additional increment. Similarly, the utility of each increment of wealth decreases with the total amount." The second thousand dollars you earn means less to you than the first, and the third less than the second. The money you spend on fire insurance is worth less to you than the loss you would suffer if your house burned down, which is why you buy the insurance even though it is, strictly speaking, a bad bet. Offered a coin flip for a thousand dollars, you feel that the thousand you might win is worth less than the thousand already in your bank account that you might lose, so you decline. And a poor man values the nine thousand ducats in hand enough to forgo the gamble at a bigger prize.
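Bernoulli's own proposal was a logarithmic utility of total wealth; with any such concave utility function, the sure 9,000 ducats beats the 50-50 gamble at 20,000 even though the gamble's expected value is 10,000. A sketch, with the poor man's starting wealth of 100 ducats as an invented illustrative figure:

```python
import math

def expected_utility(wealth, outcomes):
    """Expected log-utility of final wealth (Bernoulli's proposal)."""
    return sum(prob * math.log(wealth + payoff) for payoff, prob in outcomes)

WEALTH = 100  # hypothetical starting wealth of the "poor man"

ticket = [(0, 0.5), (20000, 0.5)]  # the lottery ticket
sale = [(9000, 1.0)]               # selling the ticket for a sure 9,000

# The sure sale has higher expected utility despite lower expected value:
print(expected_utility(WEALTH, ticket) < expected_utility(WEALTH, sale))  # -> True
```

Diminishing marginal utility is doing exactly the work Bernoulli intended: the ducats between 9,000 and 20,000 add less utility than the ducats between 0 and 9,000.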

None of this meant that real people would always behave as Bernoulli's considerations implied; it meant only that the theory seemed to reflect how people treat money in the real world. It explains why people buy insurance, though it obviously fails to explain why they buy lottery tickets. It passed over gambling altogether, which is ironic, given that the same Frenchmen who wanted to explain how people make risky decisions were also in the business of teaching people to gamble shrewdly.

After presenting Bernoulli's ideas, Amos's textbook skipped the long evolution of utility theory and jumped to 1944, the year a Hungarian Jew named John von Neumann and an Austrian anti-Semite named Oskar Morgenstern somehow collaborated to publish what amounted to rules of rationality. One rule a rational person must not violate, they said, is transitivity: if he prefers A to B and B to C, then he must prefer A to C. Anyone who prefers A to B and B to C, but then prefers C to A, violates expected utility theory. Among von Neumann and Morgenstern's other rules, the most important, because of its consequences, was the independence axiom: your choice between two options should not be affected by the appearance of some new, irrelevant alternative. For example: You walk into a deli to buy a sandwich, and the owner says he has only roast beef or roast turkey. You choose turkey. As he makes the sandwich, he looks up and says, "Oh, I forgot, I also have ham." If you then say, "In that case I'll take the roast beef," you are, by von Neumann and Morgenstern's rule, being irrational: the ham in the kitchen should not change your ranking of turkey over beef.

And really, who would switch in that situation? The independence axiom, like the other rules of rationality, seemed reasonable; there was no obvious conflict between it and how people actually behave.

"Expected utility theory" is just a theory, it doesn't explain or predict, and it doesn't tell us what people will do when they have risky choices, right? From Amos's textbook, Daniel didn't see why it was important, but Amos seemed to think it was important. Daniel said that Amos thought it was a sacred thing, right? Even though the theory didn't present itself as some amazing psychological truth, Amos's textbook clearly said it was important in psychology. It seemed like everyone who was interested in this topic, including everyone who worked in economics, thought the theory perfectly described how regular people make decisions when they have risky choices. That recognition had at least one good effect for economists, right? In the economic advice they gave to political leaders, everything was based on giving people more freedom to choose, and the market was put aside. Either way, if we're gonna expect humans to be fundamentally rational, the market should be rational, right?

Amos obviously didn't fully believe this, and he'd had his doubts since he was a graduate student at the University of Michigan. Amos always had an instinct for finding the flaws in other people's theories. He knew perfectly well that theory couldn't fully predict people's decisions, because he'd actually studied how people violated the transitivity rule, one of the theory's core assumptions. As a graduate student at Michigan he'd repeatedly had Harvard undergraduates and convicted felons run through tests choosing between A or B, B or C, and C or A, and the results showed that people didn't behave the way expected utility theory said they should. But Amos never pushed that doubt very far. He'd seen people make occasional mistakes, but he hadn't seen any consistent irrationality in how they made decisions. And he didn't know how to bring psychological insight into the mathematical study of human decision-making.
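The kind of violation Amos was testing for amounts to a cycle in someone's pairwise choices, which is easy to check mechanically. A small sketch (the function name and the toy preferences are invented for illustration):

```python
from itertools import permutations

def violates_transitivity(prefers):
    """prefers[(x, y)] is True when x was chosen over y.
    Returns True if some triple a > b > c also comes with c > a."""
    items = {x for pair in prefers for x in pair}
    for a, b, c in permutations(items, 3):
        if prefers.get((a, b)) and prefers.get((b, c)) and prefers.get((c, a)):
            return True
    return False

# A over B, B over C, but then C over A: an intransitive cycle.
choices = {("A", "B"): True, ("B", "C"): True, ("C", "A"): True}
print(violates_transitivity(choices))  # True
```

A consistent chooser, one who takes A over C in that last pairing, would pass the check.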

By the summer of 1973, Amos was thinking about attacking the reigning theory of decision-making, the way he and Daniel had overturned the idea that human judgment followed the rules of statistics. Traveling in Europe with his friend Paul Slovic, he told him about his latest idea: find a place in decision theory for the complexities of human nature. In a letter to a colleague in September 1973, Slovic wrote that Amos had reminded him not to pit expected utility theory against alternative choice models in empirical research. The problem was that utility theory was so entrenched it was hard to knock down. Their strategy should be to go on the offensive and build their own case: not to argue with utility theory, but to build human limitations into the theory as a constraint.

Amos had an expert on human limitations right beside him: Daniel. He called Daniel "the world's greatest living psychologist," though he'd rarely paid him compliments before. (Daniel said that between men, praise should be subtle.) He never explained to Daniel why he'd invited him into decision-theory research, a purely technical field that Daniel neither cared about nor knew much about. It would be a stretch to say Amos had simply invented an excuse to work with Daniel; more likely, he genuinely wanted to see how Daniel would react to the mathematical psychology textbook. The moment was a bit like the scene in The Three Stooges where Larry plays "Pop Goes the Weasel" and sets Curly off.

Daniel read the book Amos gave him the way he might read something written in Martian, which is to say he translated it. He'd long known he wasn't good at applied math, though he could follow the logic of mathematical formulas, and he knew he was supposed to respect, even admire, that kind of thing. Amos's mathematical psychology sat above the rest of the field, and the psychologists who did it looked down on the other branches of psychology. Daniel said the psychologists who knew math seemed to have magic powers: they walked around surrounded by this giant aura of mathematics, and since the other psychologists had no idea what they were studying, that gave them status. Even inside the social sciences, Daniel couldn't escape the intimidating power of math. But deep down he didn't value decision theory and didn't care about it. What he cared about was why people behaved the way they did. And in Daniel's view, the main ideas of decision theory didn't even explain how people made decisions.

Daniel must have been relieved when, nearly finished with the chapter Amos had written on expected utility theory, he came to the sentence: "However, some people don't believe these rules."

The book went on to say that the "some people" was the French economist Maurice Allais. Allais hated the smugness of American economists, and he especially hated the trend, growing ever since von Neumann and Morgenstern published their theory, of using mathematical models to represent human behavior and treating those models as precise descriptions of human decision-making. At an economics conference in 1953, Allais unveiled his weapon against expected utility theory: he asked the audience to imagine what they'd choose in the following two situations. (Allais's dollar amounts are multiplied by ten here, to account for inflation.)

Situation 1: You have to choose between these two options:

(1) You're guaranteed to get $5 million.

(2) You can gamble, with:
    an 89 percent chance of winning $5 million;
    a 10 percent chance of winning $25 million;
    a 1 percent chance of winning nothing.
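Worth noting: in plain expected value, the gamble in Situation 1 is actually worth more than the sure thing, which is what makes the audience's choice interesting. A quick check of the arithmetic:

```python
# Expected value of the Situation 1 gamble (amounts in millions of dollars).
gamble = [(0.89, 5), (0.10, 25), (0.01, 0)]
expected_value = sum(p * amount for p, amount in gamble)
print(expected_value)  # roughly 6.95 million, versus a guaranteed 5 million
```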

Most of the audience, including a lot of American economists, said they'd obviously choose the first option and take the guaranteed $5 million. Between money for sure and the chance of more money, they took the sure thing. Fine, Allais told them, now look at the second situation.
