It is illegal in the United States, and most other countries, to pay people for their organs.

It is not illegal, though, to pay people to eat bugs. So that is what Sandro Ambuehl, a doctoral candidate in economics at Stanford, recently did, to show how money can make you a little crazy.

Medical ethicists say that allowing payment for body parts may force people into doing things they don’t want to do. As one government report puts it, “financial need should not be linked in a coercive way to giving consent for organ procurement.” Classic economic theory has always been skeptical of this argument. How does it harm you if someone offers you a chance to make money? You could always turn the offer down, and be no worse off for it.

But Ambuehl argues that financial incentives can warp your perceptions of the world. His research illustrates how people behave irrationally once money is on the line. Some literally start to see reality differently.

“Potentially, financial incentives can make people worse off because you change what they believe they are getting into,” Ambuehl said in an interview.

In one of his experiments, Ambuehl randomly offered hundreds of people either $3 or $30 to eat from a menu of insects. These were not cute bugs. These were large, hairy crickets and silkworms. For real. Here are his pictures:


The twist is that Ambuehl gave some people the option of watching some YouTube videos about bug-eating. Some of the videos argued that bug-eating was fun and delicious, and a great source of protein. Other videos stressed that eating bugs was disgusting, describing their guts and egg sacs in sickening detail.

People offered the $30 were more likely to choose videos that praised bug-eating. Not only that, but the effect of the pro-bug-eating videos appeared to be stronger for the people offered $30 than for the people offered $3. Those who had more money on the line, it seems, paid closer attention to the message.

This experiment illustrates one way money can change people’s minds: by causing them to seek out certain kinds of information, and perhaps to perceive that information differently.

When the offer is enticing enough, people will try to persuade themselves to accept the deal, Ambuehl says.

This becomes a problem if people delude themselves by focusing on the positive aspects of their choice. “There's a danger,” Ambuehl says, “that if you pay people, they will underestimate risks because they don’t want to hear about them.”

There’s more. In a separate, and more bizarre, experiment, Ambuehl showed that money might even cause people to see things that aren’t really there.

Ambuehl told participants that they were going to see a grid of letters in which there were either more Gs or more Bs. He asked the participants if they would take a bet with him: If there were more Gs, they won. If there were more Bs, they lost.


Participants had a 50-50 chance of encountering either scenario. But the odds shouldn't have mattered, because they had unlimited time to count the letters. If they counted more Bs than Gs, they could reject the bet. Nobody should have lost any money.

Yet Ambuehl was able to manipulate participants so that some routinely — and irrationally — lost. He did this by tweaking the terms of the bets and how people learned about them.

People were given either favorable or unfavorable betting terms. The favorable terms: If they won, they got $3. If they lost, they paid 50 cents. The unfavorable terms were the reverse: If they won they only got 50 cents. If they lost, they had to pay $3.
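The arithmetic behind those terms shows why a careful counter should never lose, and why blind guessing is tempting only under the favorable terms. Here is a back-of-the-envelope sketch (an illustration, not taken from Ambuehl's paper), assuming the stated 50-50 odds of a winning grid:

```python
# Expected value of accepting the bet "blind" (without counting),
# given a 50-50 chance of a winning grid, under each set of terms
# described in the experiment.

def blind_bet_ev(win_payoff, loss_payoff):
    """Expected dollar value of betting without counting the letters."""
    return 0.5 * win_payoff - 0.5 * loss_payoff

favorable = blind_bet_ev(3.00, 0.50)    # win $3, lose 50 cents
unfavorable = blind_bet_ev(0.50, 3.00)  # win 50 cents, lose $3

print(f"Blind bet, favorable terms:   ${favorable:+.2f}")
print(f"Blind bet, unfavorable terms: ${unfavorable:+.2f}")

# A careful counter accepts only winning grids and rejects losing ones,
# so the worst outcome from counting is simply $0 -- never a loss.
```

Under the favorable terms a blind bet is worth +$1.25 on average, while under the unfavorable terms it costs $1.25 on average. Counting dominates both: it captures every win and avoids every loss.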

Some participants first learned their betting terms, then saw the letters. Others were shown the letters first, then told the terms of the bet.

In other words, some people had the terms of the bet weighing on their mind while looking at the puzzle, and others didn't.

Why is this important? Again, in theory, none of this should have made much of a difference because people were able to look at the grid as long as they wanted before deciding to take the bet.

Instead, Ambuehl showed that when people were presented with the favorable betting terms, they were far likelier to be irrationally optimistic. When presented with losing grids of letters, where Bs outnumbered Gs, they would somehow miscount.

Perhaps this was because the favorable terms encouraged people to guess more. When you only stand to lose 50 cents, it's easy to get lazy and not count all the letters.

But that doesn't explain why the foolish optimism was strongest among people who were offered the favorable terms prior to seeing the letters. These people made more mistakes — more losing bets — than others who were offered the same favorable terms after seeing the letters.

Apparently, the knowledge of the favorable terms was poisonous, interfering with how people saw the grid of letters. It was as if they subconsciously wanted to take the bet, and their minds were closed to evidence that they shouldn’t — as if they kept seeing Gs even though there were more Bs.

As in the bug-eating experiment, the letter search game showed that having financial incentives on your mind can distort how you perceive the world. You might seek out information to justify yourself, or even ignore contrary information that is staring you in the face.

It’s important to think about these issues in the context of medical procedures. There is great controversy over whether it is ethical to pay people for their organs, for instance, or to pay women to be surrogates. One concern is that desperate people might not fully understand the risks of donating a kidney, or of carrying a baby to term. It is the doctor's job to present them with enough data for them to make an informed decision.

Ambuehl's research suggests that you can give people information, but that doesn't necessarily mean they will be informed. Biases start to take over when there is money on the line. A potential kidney donor might not listen very closely while the doctor is talking because she's focused on the upsides.

Another example that Ambuehl brings up is military recruitment. Though it may be illegal to pay people to risk their lives as organ donors, we pay men and women to risk their lives defending the country in the military. The size and strength of a volunteer army depends on how much you pay recruits. The U.S. Army has used enlistment bonuses — up to tens of thousands of dollars — to bring people in.

That kind of money gets people in the door, after which potential recruits are given more information about the risks and rewards of joining. The money might also shape how they understand the Army's pitch, and may motivate them to seek out more positive information about military careers. Ambuehl's research shows that cash can convince people to convince themselves.

It might also be possible to detect this dynamic in politics. Politicians are often accused of ethical sketchiness, of being in the pocket of lobbyists and big donors. But as Ambuehl demonstrated, part of the reality-bending power of money is that it motivates people to seek out certain kinds of self-serving information. A politician might genuinely believe she is making the right vote, without realizing how the financial incentives subconsciously guided her there.

In the case of bug-eating at least, Ambuehl was likely making a positive change in the world. Environmentalists have said that we should perhaps eat less meat and more bugs.

But you could imagine many more sinister situations, in which financial incentives cause people to delude themselves to their own ruin.