

This chapter is about doctors, how they think, and how their thinking can go wrong.

One summer, a doctor named Donald Redelmeier was called in to see a young woman, badly shaken, who had been in a head-on car crash just hours before. An ambulance had rushed her to Sunnybrook Hospital. She had a string of fractures: ankle, foot, hip, face. Her broken ribs were missed at first, and it wasn't until she was in surgery that the doctors discovered she had a heart problem as well.

Sunnybrook is a major trauma center in Toronto. It began as a hospital for soldiers returning from World War II, but times changed. In the 1960s, Highway 401, an enormously busy road, was built right beside the hospital, and Sunnybrook's new mission became patching up the victims of crashes on it. The staff got very good at it, word spread, and eventually the hospital was handling trauma of every kind: suicide attempts, injured police officers, elderly people who had fallen, complicated pregnancies, construction accidents, even snowmobile crashes from the north. Many of these patients arrive with more than one thing wrong with them.

That brings us back to Redelmeier. Trained as a general practitioner, he later specialized in internal medicine. Part of his job at the trauma center was to review diagnostic errors made by other doctors; in effect, he checked their thinking. Rob Fowler, one of the hospital's epidemiologists, said Redelmeier was all about probing how people think; he made you really think things through. People meeting him for the first time were often taken aback: who is this guy, and why is he giving me feedback? By the second encounter, they usually found him likable. The very idea that one doctor should examine another doctor's thinking marked a change, Redelmeier noticed, from the 1980s, when doctors tended to be supremely confident in their diagnoses. The hospital was no longer just a place where people got fixed up; it was a great machine for coping with uncertainty. As Redelmeier put it, "Where there's uncertainty, there's judgment. And where there's judgment, there's the potential for error."

Preventable accidents in hospitals kill more patients in North America than car crashes do. Redelmeier would point out that moving injured patients without sufficient care could compound their trauma, and that doctors who fail to wash their hands, or who merely push elevator buttons, can spread infection. He even co-authored a paper on just how germ-ridden hospital elevator buttons are.

But the thing that bothered Redelmeier most was clinical misdiagnosis. Doctors and nurses are human. They sometimes fail to see that the information patients give them is unreliable; a patient may say he feels better even when he doesn't. And doctors can fixate on details and miss the big picture. Jonz Pasca, a chief resident, said Redelmeier taught him to notice what a patient's room looks like when the patient is out of it, and whether the stay was likely to be short. Once they entered a room where the patient was asleep, and Redelmeier stopped him from waking the man, saying that careful observation alone might reveal the problem.

Another big problem is that doctors tend to see patients through the lens of their own specialty. A patient gets treated for one condition while the doctor remains unaware of some hidden problem that may be far more dangerous.

Back to the woman from the car crash, the one with all the broken bones. When she arrived, the doctors were focused on the trauma from the accident. Then they noticed something else: her heart rhythm was wildly erratic, racing one moment and nearly stopping the next. Clearly something else was going on.

Soon after Redelmeier was called in, the doctors thought they had it figured out. The woman said she had a history of hyperthyroidism, which can cause heart problems. That, they concluded, was the answer. Had Redelmeier simply gone along with it, no one would have questioned him. But he didn't. He told everyone to slow down, think it through, and make sure they weren't jumping to conclusions.

Something bothered him. "Hyperthyroidism *can* cause an irregular heartbeat, but it's not a *common* cause," he explained. The emergency room staff had heard "hyperthyroidism" and seized on it as the explanation for her heart rhythm. No one had stopped to consider the statistics, to ask which causes of arrhythmia are actually most likely. Doctors, Redelmeier observed, seldom think statistically; they tend to assume that probability doesn't apply to the patient in front of them.

So Redelmeier asked them to weigh the statistics and consider what *else* could be causing the irregular heartbeat. That's when they noticed the state of her lungs: the X-ray could barely image them, they were so badly damaged. Broken ribs heal; damaged lungs can kill. Redelmeier set the hyperthyroidism aside and treated her lungs. Her heart rhythm returned to normal almost immediately. The next day, her thyroid test came back completely normal. It was, Redelmeier said, a classic case of jumping to conclusions. "When a diagnosis pops into your head and seems to explain everything, that's when you really need to watch out," he said. "You have to examine your thinking."

Not that the first idea is *always* wrong, of course. The danger is that it makes you too sure of yourself. "If you see someone come into the emergency room who's agitated and has a history of alcoholism, you have to be careful," Redelmeier said. "Because you're going to think, 'Oh, they're just drunk,' and you'll miss a subdural hematoma." The doctors had leapt straight to a diagnosis from the woman's medical history, ignoring the base rate. As Daniel and Amos had pointed out, unless you are completely certain, you need to weigh the base rate when making a diagnosis or prediction.
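The base-rate logic behind Redelmeier's warning can be made concrete with a small Bayes'-rule calculation. All the numbers below are invented purely for illustration; they are not taken from the text or from any medical source.

```python
# Hypothetical illustration of base-rate neglect (all figures invented).

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H | E) from the base rate and the two likelihoods."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Suppose (hypothetically) 1% of agitated ER patients have a subdural hematoma,
# that agitation appears in 90% of hematoma cases, and in 20% of everyone else.
p = posterior(prior=0.01, p_evidence_given_h=0.90, p_evidence_given_not_h=0.20)
print(round(p, 3))  # 0.043
```

Even when the symptom fits the dramatic diagnosis well, the low base rate keeps the posterior probability small, which is exactly why a history of alcoholism so easily crowds out the rarer, deadlier explanation.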

Redelmeier grew up in an old family home in Toronto; his father was a stockbroker. The youngest of three boys, he always felt less clever than his older brothers. He had a stutter, which he worked hard to overcome, even shortening his name to "Don Red" on phone calls to get it out more easily. The stutter slowed his speech, and dyslexia slowed his writing. He wasn't well coordinated, and he started wearing glasses in fifth grade. But he was bright and good-natured, excelled at math, and helped the other kids, considerate and caring.

Even in math, he always felt unsure of himself, as if a mistake were lying in wait. "Sometimes you can feel the mistake coming," he says. "And even then, you still make it." That feeling, he said, is probably why a certain difficult article later made so much sense to him. Near the end of 1977, his favorite high school teacher, Mr. Fleming, recommended an article from *Science* magazine. That night, Redelmeier sat at his desk at home and read it.

The article was called "Judgment Under Uncertainty: Heuristics and Biases." The ideas were at once familiar and strange, and Redelmeier felt he could understand them. It described three mechanisms by which people judge under uncertainty; the authors called them representativeness, availability, and anchoring, names Redelmeier found odd. In one experiment, people were asked whether a man named Dick, drawn at random from a pool of lawyers and engineers, was a lawyer or an engineer; given a description that carried no useful information, many judged the odds to be 50/50, ignoring the composition of the pool. Redelmeier recognized himself in the findings. He too would have guessed that more English words begin with the letter K than have K as their third letter (the reverse is true). He too would have made confident judgments about a person on flimsy evidence, despite his own lack of confidence. And he too would have mistakenly estimated that 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8 yields a smaller answer than 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1.
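The two products in the anchoring question are, of course, identical; people anchor on the first few numbers of the sequence and extrapolate too little. A two-line check makes the point:

```python
from math import prod

# Both sequences multiply the integers 1 through 8; only the order differs,
# yet people anchored on "1 * 2 * 3..." guess a much smaller answer.
ascending = prod(range(1, 9))       # 1*2*3*4*5*6*7*8
descending = prod(range(8, 0, -1))  # 8*7*6*5*4*3*2*1
print(ascending, descending)  # 40320 40320
```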

What moved Redelmeier wasn't the notion that people make mistakes; of course people make mistakes. It was that the mistakes were predictable and systematic, seemingly an inherent part of human nature. The article brought back all the errors he had made in math. One passage he found especially good concerned "availability" and the role imagination plays in error. The authors wrote: "For example, in predicting the potential dangers of long-distance explorations, one may imagine various mishaps that the expedition is ill-equipped to handle. If such difficulties are vividly portrayed, their very imaginability may lend them undue weight in the decision. On the other hand, dangers that are difficult to conceive are likely to be underestimated."

This wasn't just about how many English words begin with K. It was about life and death. "I'm a movie fan," said Redelmeier, "but the article moved me more than any movie."

As for the authors, Daniel Kahneman and Amos Tversky, Redelmeier had never heard of them. All he knew was that they were psychology professors at Hebrew University in Jerusalem. What mattered more to him was that his brothers had never heard of them either: for once, he knew something they didn't. The studies offered a secret look into the workings of the mind. Reading them felt like standing behind a magician's curtain, watching how the tricks were done.

Redelmeier never had much trouble deciding on a career. As a kid he liked watching the doctors on shows like *Star Trek* and *M\*A\*S\*H*. "I was a bit of a hero worshiper," he said. "But I couldn't do it on the playing field, or in politics, or movies. Medicine was my only option." At 19, as a college sophomore, he applied to medical school, and a year later, just after turning 20, he entered the University of Toronto as a medical student.

That's when the trouble started. The doctors in medical school bore little resemblance to Dr. McCoy or Hawkeye. Most were arrogant, which left Redelmeier a little hostile. "There were a lot of professors in medical school who would reach the wrong conclusion, and I couldn't say anything," he recalled. They would repeat the wrong conclusion over and over, as if it were a great truth. Faced with the same symptom, doctors in different fields reached different diagnoses: the urology professors taught that blood in the urine signaled kidney cancer, while the nephrology professors taught that it signaled nephritis, inflammation of the kidney. "They were both highly confident, based on their own specialist experience," said Redelmeier. "Each saw only what related to his own field."

The issue wasn't what they knew or didn't know. It was their need for certainty, or at least the appearance of it. They were more preachers than teachers. "Their general characteristic was arrogance," Redelmeier said. "'What do you mean you haven't used steroids?!'" From his perspective, these medical authorities failed to see how much uncertainty there is in medicine.

They were like this because accepting uncertainty means accepting your own fallibility, and the whole medical profession was trying to present itself as the embodiment of wisdom. Doctors routinely credited a patient's recovery to their treatment, even without any proof that the treatment was responsible. "If the patient feels better after my treatment, that doesn't necessarily mean they feel better *because* of my treatment," Redelmeier thought. "A lot of diseases heal on their own," he said. "They disappear. People want treatment when they don't feel well, and doctors feel the need to act. You treat a patient with leeches to draw out the bad blood, the patient gets better, and now he swears by leeches for the rest of his life. Or you overuse antibiotics. An ear infection? Here, let's take out your tonsils. You take these steps, the patient improves the next day, and you can't resist taking them again. Your depression lifts after you see a psychiatrist, and you conclude the therapy worked."

Redelmeier noticed other problems, too. Professors often attended to the surface meaning of data rather than its deeper meaning. An elderly man with pneumonia would come to the clinic, the doctor would check his heart rate, find it at a reassuring 75 beats per minute, and move on to the next step of treatment. But pneumonia is dangerous in the elderly. Fighting the infection produces fever, cough, chills, sputum, and a raised heart rate, because the body must pump blood faster to battle the disease. "So the elderly man with pneumonia should *not* have a normal heart rate!" Redelmeier said. "It should be higher!" A normal heart rate in an elderly pneumonia patient may mean his heart is in serious trouble. Yet the textbooks had trained doctors to read it as perfectly normal, and it is precisely when things look fine that medical experts forget to stop and think again.

By coincidence, a movement called "evidence-based medicine" was just then taking shape in Toronto. Its central idea was that the intuitive judgments of medical experts must be tested against hard data. Once everything went under the microscope, some received medical wisdom turned out to be deeply flawed. For example, when Redelmeier began medical school in 1980, the standard approach to a heart patient with arrhythmia was to suppress the arrhythmia with drugs. Seven years later, when he graduated, researchers had discovered that heart patients given those drugs actually died at a higher rate than those who weren't. No one was sure why doctors had, for so long, favored a lethal form of treatment, though advocates of evidence-based medicine were beginning to look for answers in the work of Kahneman and Tversky. What was clear was that doctors' intuitive judgments could be wrong: evidence had to be weighed in making the diagnosis. Redelmeier was acutely sensitive to this. "I began to realize that some things were hidden, that judgments were often invented by expert argument," he said. "I saw misdiagnoses that were the result of thinking biases, and I realized that people had no idea they were making mistakes. I felt worried. I felt something was wrong."

At the end of the *Science* article, Kahneman and Tversky noted that even experienced experts remained prone to these common errors. As they put it, intuitive judgment leads people into similar mistakes whenever they face complex and unclear questions. Redelmeier thought this neat point explained why even great doctors still erred. He thought back to the mistakes on his math exams. "The same problem exists in medicine," he said. "When you solve a math problem, you recheck every step. You don't do that when you're a doctor. In math the answer is fixed. If we can make mistakes in a field like that, how much greater must our chances of error be in fields where the answer isn't unique?" To err is human, and nothing to be ashamed of. "They described the places where thinking trips, and explained how those places relate to one another. Now we can talk about mistakes. They weren't denying mistakes, or painting them as evil. They just made us see that mistakes exist and are an inherent part of being human."

As a young, unproven student, however, Redelmeier kept his doubts to himself. He had no wish to question authority or challenge convention, nor any talent for it. "I've never been shocked or disenchanted by anything," he said. "I'm usually very by-the-book. I obey the law and vote in elections. I'm never absent from staff meetings, and I've never argued with a police officer."

In 1985, Redelmeier began his residency at Stanford University Hospital, and there he occasionally began to voice his doubts about the profession. One night in his second year, he was sent to the ICU and tasked with keeping alive a young patient whose organs were to be harvested. The patient, only 21, had crashed his motorcycle into a tree and was officially brain-dead.

It was the first time Redelmeier had encountered someone younger than himself on the brink of death. He had witnessed old men die, but never felt this distraught. "A life cut off by disaster," he said. "If he had been wearing a helmet, it could have been avoided." The fact that people make deadly misjudgments about risk struck Redelmeier deeply. And a judgment can be imposed from outside, for instance by requiring motorcyclists to wear helmets. He put this to one of his American students. "What do you freedom-loving Americans think of that?" he asked. "Live free or die? I don't want to live like that. I'd rather live with moderate restrictions." The student replied that not only did many Americans disagree with him, so did his medical peers: Stanford's famous heart surgeon Norm Shumway was actively campaigning against a bill that would require motorcycle riders to wear helmets. "That was crazy," Redelmeier said. "How can someone so smart be so dumb about this? It's further proof that people make mistakes. And it's the fact that humans make mistakes that we should be focusing on."

At 27, with his Stanford residency over, Redelmeier began to develop his own ideas, built on those of the two Israeli psychologists he had known of since his teens, though he wasn't sure where they were leading. He thought that after returning to Canada he might go to northern Labrador; during medical school he had spent a summer internship providing healthcare to a village of 500 residents. "I don't have a great memory, and I'm not super smart," he said. "I don't think I can be a great doctor. If I can't do something great, I'd rather go to a remote place with limited resources and contribute there." Until he met Amos Tversky, in fact, Redelmeier had always assumed his career would be that of an ordinary practitioner.

Anticipating his own thinking biases and correcting for them became a deliberate habit. He knew memory was unreliable, so he carried a notebook everywhere and wrote down his thoughts. When the hospital woke him with phone calls at night, he would pretend the connection was bad so the resident on the line would repeat everything more slowly. "You can't complain that the residents talk too fast. You have to blame yourself; that way they come through more clearly." When visitors came to his office during work, he set a timer to make sure he wouldn't forget his patients. He painstakingly reviewed every possible misstep before social events. And because of his stutter, he would thoroughly inspect the lecture hall before giving a talk.

Then came the spring of 1988, and what looked like an ordinary day. Two days later, Redelmeier was to have lunch with Amos Tversky at the Stanford faculty club restaurant. To make sure he wouldn't be interrupted, he moved his rounds from 6:30 a.m. to 4:30 p.m. He usually skipped breakfast, but he ate it that day so that he wouldn't be so busy eating at lunch that he couldn't focus. He also wrote out notes of discussion topics to head off awkward silences. He wasn't planning to talk much: his Stanford colleague Hal Sox had advised him to talk less, and just sit and listen.

Amos's writing on medical topics had grown out of a question he once put to Sox: how might people's preferences in gambles carry over into the thinking of doctors and patients? Specifically, what happens when people choose between a certain gain and a gamble worth twice as much (say, a sure $100 versus a 50/50 chance at $200)? People, he explained, tend to take the sure gain. Sox and two colleagues then designed a series of tests to see whether doctors and patients chose differently when facing a certain loss rather than a certain gain.

Lung cancer was an obvious test case. In the early 1980s, doctors and patients had two main options: surgery or radiation therapy. Surgery was more likely to prolong life, but, unlike radiation, it carried a small risk of immediate death. When doctors told patients they had a 90% chance of surviving surgery, 82% chose surgery. When doctors instead said there was a 10% chance of dying, only 54% chose it. In life-or-death decisions, people respond not to the probability itself but to how it is framed, and patients and doctors alike were susceptible. Sox said that working with Amos changed how he saw his own profession. "Cognitive issues had never been raised in medicine," he said. He couldn't help wondering how many doctors, preferring surgery themselves, had deliberately stressed the 90% chance of survival rather than the 10% chance of death when telling patients the risks.

At the lunch, Redelmeier listened faithfully to Sox and Amos, but he took note of things. Amos had gray-blue eyes and a slight stammer; his English was fluent but carried a strong Israeli accent. "He was a bit too alert," said Redelmeier. "Very active, full of energy. He did the talking 90 percent of the time, and everything he said was worth hearing. He didn't know much about medicine, yet he had a major impact on medical decision-making, which astonished me." Amos peppered the doctors with questions, most of them about illogical medical behavior.

As lunch wound down, Amos invited Redelmeier to his office, where he would toss out findings from psychology and ask Redelmeier to find their medical counterparts. Many people, Amos explained, will refuse a single bet offering a 50% chance to win $150 and a 50% chance to lose $100, yet if offered the chance to play the same bet 100 times, most will accept. Having laid out the paradox, Amos said, "Come on, Redelmeier, tell me: is there anything like this in medicine?"

Redelmeier soon had an answer. "Whatever it's like in other fields, medicine is full of similar examples. And to my surprise, Amos stopped talking and listened carefully to what I had to say." Doctors, he said, have two responsibilities: one to the individual patient and one to the community. A doctor sees only one patient at a time, but as a maker of medical policy he is responsible for the health of the entire community.

And the two roles conflict. Prescribing an antibiotic, for example, might be the safest course for the patient at hand, but the cumulative effect of overusing antibiotics could be an unmanageable disaster. A responsible doctor weighs not just the interest of a single patient but that of all patients with the same disease. Doctors repeat the same choices over and over, as if placing the same bet again and again. Do they behave differently when the choice repeats than when it is a one-time decision?

In their co-written article, "Discrepancy between Medical Decisions for Individual Patients and for Groups" (published in the April 1990 issue of the *New England Journal of Medicine*), Amos and Redelmeier showed that doctors made different choices for an individual patient than the standard treatment they endorsed for groups of patients with the same disease. For the individual, a doctor would order extra tests to rule out remote problems, but would hesitate to ask whether the patient wished to donate organs after death. As they wrote, the result confirmed a conflict between the patient's interest and the community's, an inconsistency that demanded attention: choosing one treatment for individual cases and a different one for groups of identical cases is unreasonable.

The issue was not whether the doctor's treatment of any single patient was correct. It was that a doctor cannot coherently hold two different plans, one for an individual with a disease and another for a group with the same disease. In any case, Redelmeier wasn't especially troubled, and the doctors who responded in the *New England Journal of Medicine* seemed to feel much the same. "Most doctors always strive to be rational, scientific, and logical," Redelmeier said, "but that's a giant lie, or at least partially a lie. Our wishes, dreams, and feelings lead us to believe it."

That first joint article laid a firm foundation for further collaboration. Before long they were going to Amos's house after work and talking late into the night. Working with Amos could hardly be called work. "It was purely fun," said Redelmeier. In his heart, he knew Amos would shape his life. The important ideas all seemed to come from Amos, who merely needed someone to tie them to medical practice, and so Redelmeier felt he had contributed little. "In many ways I felt like a good secretary, and it bothered me for years," he said. "I felt unnecessary. After going back to Toronto, I kept wondering whether I had made a contribution, or whether it had all depended on Amos."

Yet only a few years earlier he had been planning to work as an ordinary practitioner in a small village in northern Labrador. Now he had an incredibly specific ambition: to explore the thinking biases of doctors and patients, as both doctor and researcher, bringing cognitive psychology to bear on medical judgment and decision-making. He still lacked confidence. The one thing he knew for certain was that his collaboration with Amos had unlocked something new in him: a seeker of truth, someone who would use data to find the stable patterns of human behavior and steer around the blind spots that decide human life and death. Of this other person inside him, he said, "I didn't realize I had the ability. Amos didn't realize it either, but he awakened it. He was like a messenger, carrying me to a land he couldn't reach himself."

Then Daniel himself appeared. At the end of 1988, or perhaps the beginning of 1989, Amos introduced the two men. Later, Daniel reached out to Redelmeier to say that he, too, wanted to study the decision-making of doctors and patients, and that he had ideas of his own. "When he called me, Daniel was working alone," said Redelmeier. "He wanted to introduce another heuristic, one he had invented himself, not one of Amos's. A fourth heuristic."

In the summer of 1982, his third year as a professor at the University of British Columbia, Daniel had walked into his laboratory and announced a decision that shocked his graduate students: from now on, they would study happiness. He had begun to question whether people could foresee the emotions an event would bring. What was the difference between the happiness people expected and the happiness they actually felt?

Daniel had his subjects predict how much happiness they would get from eating a bowl of ice cream while listening to their favorite music every day for a week. Then he compared the predicted happiness with what they actually felt, and what they felt with what they later remembered. There were real, unexplored gaps between them. When your favorite team wins the championship, you feel joy; six months later, it hardly matters. His graduate students found it striking that the laws of happiness were being uncovered by a man who so rarely seemed to experience it.

By the time he introduced himself to Redelmeier, Daniel had left the University of British Columbia for the University of California, Berkeley, and his interest had shifted from happiness to pain: the difference between the pain people expect and the pain they experience, and between the pain they experience and the pain they remember. If predictions of pain don't match the pain that's felt, what does that mean? Daniel thought it meant something deep. People say a vacation was a mix of good and bad, but once home they dwell only on the good. A sweet relationship ends in pain, and the person remembers mostly the pain. What people experience and what people remember are simply different things.

When he met Redelmeier, Daniel had already begun experiments in his Berkeley laboratory. Subjects immersed an arm in a bucket of cold water, went through two painful trials, and were then asked which they would repeat. What they remembered were the peaks of the pain, and above all the pain at the end. If you immerse a subject's arm in cold water for three minutes and then raise the water temperature for a final minute before removing the arm, the subject rates the experience as better than simply removing the arm after three minutes of pain. Even a somewhat longer painful experience is more bearable if it ends better.

Daniel wanted Redelmeier to find a medical counterpart to his study, and Redelmeier soon thought of colonoscopy. In the 1980s, patients dreaded the procedure; it was so uncomfortable that few were willing to undergo it a second time. By 1990, some 60,000 patients in America were dying of colon cancer, many of whom could have been cured had they been screened earlier.

To tackle the problem, Redelmeier ran a year-long trial on nearly 700 patients. For one group, the doctor withdrew the endoscope immediately at the end of the examination. For the other, the doctor left the endoscope in the patient's rectum for about three extra minutes. The extra minutes were not pleasant, merely far less painful than what had come before. The first group got the ordinary procedure; the second got the same procedure plus a milder ending, and therefore, objectively, more total suffering: the same pain as the first group plus three additional minutes of discomfort.

An hour after the test, the researchers asked the patients what they thought of the experience. The data showed that the second group, the one with the better ending, actually reported less pain. More interestingly, they were also more willing to come back for a second colonoscopy. "The impression of the last minutes may remain forever," Redelmeier said.

Working with Daniel was nothing like working with Amos. In Redelmeier's mind, Amos's personality was all of a piece; Daniel's was complicated. Daniel rarely showed pleasure and sometimes seemed depressed. He suffered over his work, and made those around him suffer too. "He liked finding what was wrong with his work, not what was good," said Redelmeier. And yet the ideas he voiced were wonderful.

When he stopped to think about it, Redelmeier found it strange how little he knew of the two men's pasts. "Amos rarely told me about his personal life. Not Israel, not the war, not the past. He wasn't evading anything; time was short, and he was doing what he had to do." They were collaborating on human behavior in medicine, yet Redelmeier never thought to ask why Amos and Daniel had left Hebrew University, or why they had left Israel for North America. Nor did he ever learn why, in the 1980s, Amos was a professor at Stanford while Daniel remained a comparatively unremarkable professor at the University of British Columbia. They seemed friendly, but they were obviously not collaborating, and he didn't know why. "They wouldn't even talk about each other," he said.

Apparently each had concluded he would gain more by working alone, and each was putting their past accomplishments to use in the real world in his own way. "I think of them as a pair of great friends, and me as their schnauzer," said Redelmeier.

In 1992, Redelmeier returned to Toronto. His meeting with Amos had changed his life: all you had to do was watch Amos resolve a problem to see how incredibly imaginative his mind was. As for the specifics of where his own ambition would lead, Redelmeier hadn't worked those out yet.
