We were both at the same wedding last summer. Only one of us got married. Weddings are fun.

At least one of us learned that planning a wedding is stressful. In part, this is because designing one’s own special day means trying to anticipate the desires of one’s guests. Consider the cake selection. A chocolate and a white cake would be classic offerings. But the couple may wish to change things up by subbing out the white cake for a more distinctive flavor such as bananas Foster. If so, how much of each kind of cake should they order? That is, how many guests will choose the more familiar dessert as opposed to the more unusual one?

If you’re like the participants in our studies, you’re not only likely to get this question wrong but also likely to err in a predictable way. More specifically, you may find yourself with quite a few slices of leftover chocolate cake. No, our research did not uncover a previously unknown aversion to chocolate. Instead, across eight studies and more than 1,400 participants, we documented what we call the commonness fallacy—a tendency to overestimate how often a relatively common option (such as chocolate cake) will be chosen over a relatively unusual but potentially more exciting one (such as bananas Foster cake).

Why do people make this mistake? We find that forecasters (those trying to predict what others will choose) lean on how commonly each option has been chosen in the past. From what we see in stores, on restaurant-goers’ plates, and at previous weddings, chocolate desserts are everywhere. People eat a lot of them. And on its surface, such information seems relevant. People ignore others’ previous behavior at their peril: Like history, past behavior tends to repeat itself.

But therein lies a problem. When people choose to eat chocolate cake, they are often constrained by what choices are available. Chocolate cake—like bouquets of roses, cruises to the Bahamas, and Coca-Cola—is commonly available. Bananas Foster cake—like bouquets of snapdragons, cruises to the Galápagos, and Izze Sparkling Pomegranate Soda—is not. In other words, chocolate cake’s commonness does not mean that it is usually chosen over bananas Foster cake. As a result, looking to people’s past choices to know which of two offerings they will select in the future can be misleading.

Let us be clear about what the commonness fallacy does not mean: It is not that people think others always prefer common options to unusual ones. People likely realize that most others would take a hamburger from an upscale restaurant over one from McDonald’s, or fine Swiss chocolate over a Hershey’s bar. But we suspect that people would still overestimate just how often the more common option would be selected.

More generally, the commonness fallacy may yield insight into why society can get stuck in an unsatisfactory status quo. As the United States enters an election year, the chorus bemoaning the country’s two-party system is likely to swell. In modern U.S. history, no third-party presidential bid has seriously gotten off the ground. One possibility is that there simply isn’t much support for such efforts, which would explain why the resources have never amassed to catapult a third-party campaign into viability. But a second possibility is that people look to others’ past behavior (almost 19 in 20 American voters chose one of the two major-party presidential candidates in 2016) to predict whom voters will choose in the future. The commonness fallacy can make people pessimistic about change. And when such pessimism tempers enthusiasm and discourages financial support for new options, people will continue to choose what is most prominently placed in front of them.

We observed evidence for the commonness fallacy across a wide range of domains, including forecasts of people’s dinner choices, vacation choices, and birthday celebration choices. This suggests that the commonness fallacy is a fairly general phenomenon. Furthermore, several of our studies ruled out the possibility that people simply think others are more price-sensitive than they themselves are (common items are often cheaper).

Instead, the commonness fallacy emerges because people often replace the question “What will people choose?” with “How commonly is this chosen?” Getting people to think not simply about what others would choose but also about what others actually like reduces reliance on perceived commonness and, in turn, the commonness fallacy. In planning that wedding, our findings suggest that asking yourself, “What percentage of people would be more pleased to receive the chocolate cake, and what percentage the bananas Foster?” may work better than trying to forecast guests’ choices directly. Framing the question this way keeps the commonness of the options from leading you astray and focuses you on more predictive information. Doing so may help make your next wedding a bit more satisfying for your guests.


For Further Reading

Reit, E., & Critcher, C. R. (2020). The commonness fallacy: Commonly chosen options have less choice appeal than people think. Journal of Personality and Social Psychology, 118, 1–21.

Emily Reit is a doctoral student at the Stanford Graduate School of Business.

Clayton Critcher is an associate professor of marketing, cognitive science, and psychology at the University of California, Berkeley.