My book *Moneyball* told the story of the Oakland Athletics, a baseball team that was trying a new approach to valuing players and strategizing games. Because the A's lacked the money to buy expensive stars, their management was forced to think differently. By digging into historical and more recent data, and by bringing in outsiders to do statistical analysis, they stumbled onto an entirely new way of understanding baseball. What they found was that players other teams had overlooked or discarded often held enormous value, a finding that cut directly against the conventional wisdom about how to pick players.

When the book came out, some baseball insiders, the old-school managers, scouts, and reporters, scoffed at it. But most readers found the story as interesting as I had. Many saw in the A's approach to choosing players a more general lesson: if employees who are paid enormous sums and selected through such public scrutiny can be misvalued by their market, who can't be? If the market for baseball players is that inefficient, what market is efficient? And if a fresh method of analysis can transform how we see baseball, what other spheres of human activity are lying wide open for the same treatment?

Over the past decade or so, the A's example has been treated as a model. People set out with more comprehensive data and more scientific methods to analyze things and find inefficiencies in markets. I've read articles about "Moneyball" principles being applied to all sorts of fields: how to run schools, make movies, manage health insurance, play golf, farm, run a publishing house, prepare a presidential campaign, improve government, even how to be a better banker. "Are we going to 'Moneyball' our offensive line overnight?" a New York Jets offensive line coach grumbled at one point. And the comedian John Oliver "congratulated" the North Carolina legislature for "successfully using 'Moneyball' on race issues," because they were using data analysis in a quietly cynical way to pass laws that made it harder for African Americans to vote.

Yet people are not always eager to replace traditional expertise with new-school data analysis. And when data-driven methods are used in high-stakes decisions and fail to deliver quick results, they draw far more criticism than traditional methods ever would. After the Boston Red Sox copied the A's approach, they won the World Series for the first time in nearly a century, then won again a few years later. But in 2016, after several bad seasons, they announced they were abandoning data analysis and returning to the judgment of baseball professionals. The team's owner even said, "Maybe we relied too much on data..."

Then there was Nate Silver, who carried his data-crunching skills from baseball into writing about and predicting presidential elections for the New York Times. For a few years he was extraordinarily successful; never before had a newspaper called elections so accurately. But after he left the Times, he failed to predict Donald Trump's victory, and suddenly his forecasting model was being questioned, not least by the New York Times itself. One of its columnists wrote that politics is a fundamentally human realm that transcends rationality and cannot be predicted, and that there is therefore no substitute for on-the-ground reporting, or words to that effect. Never mind that the reporters on the ground hadn't seen Trump coming either, and that Silver himself later admitted his forecast had been colored by subjective, perhaps misguided, judgment, because Trump seemed, well, different.

There may be some truth in the criticism of people who use data to exploit an industry's inefficiencies. But I also think the yearning to find some expert who simply knows will always reassert itself, even when there is no real reason to believe that expert will succeed. It is like the movie monster that refuses to die until the very end.

Of all the responses the book provoked, one stayed with me. It came from two University of Chicago scholars: an economist named Richard Thaler and a law professor named Cass Sunstein. Their article mixed praise with criticism. They agreed it was remarkable that a small-market team like the A's could beat the big ones by exploiting market inefficiencies, and that this had shaken the entire market for professional athletes. But they also argued that I didn't seem to grasp the deeper reasons those inefficiencies existed in the baseball player market in the first place: they were rooted in the workings of the human mind. Years earlier, two Israeli psychologists, Daniel Kahneman and Amos Tversky, had explained why baseball experts might misjudge players, and how their thinking processes produced those mistakes. The ideas in *Moneyball*, in other words, were not really mine; I had been retailing insights that had been around for decades without being properly understood.

What's more, I had never heard of either man, even though one of them had apparently won the Nobel Prize in Economics. And I had never thought about *Moneyball* in psychological terms. Why was the market for baseball players so inefficient? The A's front office had pointed to market biases: foot speed was overvalued because it is easy to see, while a hitter's ability to draw walks was undervalued because it is not; players who were fat or ugly were underpriced, while players who were athletic or handsome were overpriced. I found those biases interesting, but I never asked where they came from or why people had them. I had set out to tell a story about how a market worked, or failed to work, in valuing people. But hiding inside it was another story, one I had left untold, about the ways our minds help or betray us when we form judgments and make decisions. How does the mind guide us when we invest money, hire people, or pursue a goal? How does it process the evidence, whether it comes from a baseball game, an earnings report, a tryout, a physical exam, or a speed date? What is the brain doing in those moments, even the brains of so-called experts, that allows other people, armed with data, to take advantage of them?

And how did two Israeli psychologists come to be so interested in these questions that, decades later, they could in effect anticipate a book about American baseball? What led two scientists in the Middle East to sit down and study what the mind does when it judges a baseball player, weighs an investment, or sizes up a presidential candidate? And how on earth does a psychologist win the Nobel Prize in Economics? That is what I am going to try to explain.
