The Newcomb Paradox

Or, Humility in the Face of Weirdness 

You’re on an alien spaceship orbiting the planet Newcomb. Don’t worry—the air is perfectly breathable. They’ve even got magazines in the waiting room.

As part of their research into human behavior, the aliens have placed two boxes in front of you: a transparent box containing $1000, and an opaque box, whose contents remain a mystery.

You’re allowed two options: take both boxes, or take just the opaque box.

It seems obvious. You’d be crazy to leave the guaranteed $1000 behind, right?

Not so fast. The aliens have observed humans for decades and have run this experiment thousands of times—so they’re very, very good at predicting which people will settle for the opaque box, and which people will greedily snatch both. And that’s where things get tricky. The aliens hand you a note:

We made a prediction about your behavior. We won’t tell you what we predicted, but we will tell you that in this experiment, we fill the opaque box based on our prediction for each human.

If we predict the human will take only the opaque box, we put $1 million inside (to reward the human’s restraint).

But if we predict that the human will take both boxes, then we leave the opaque box empty (to punish the human’s greed).

We made your prediction earlier today, placed the correct amount in the opaque box, and sealed it. The choice is now yours. Will you take both boxes, or will you take only the opaque?

The classic answer to this question—from mathematicians, economists, and other rational-minded folks—is to take both boxes. After all, your actions now cannot affect what the aliens have already done. No matter what’s in the opaque box, you’ll be $1000 better off if you take the transparent box, too. So you might as well take it.
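
For concreteness, here are the four possible payoffs under the rules in the aliens’ note, tabulated:

                                 They predicted one box    They predicted both
    Take only the opaque box          $1,000,000                 $0
    Take both boxes                   $1,001,000                 $1,000

Whichever column you are in, taking both boxes pays exactly $1000 more. That is the dominance argument in a nutshell.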

I’m not so sure.

This is a hypothetical world that contains semi-psychic aliens and million-dollar boxes. It seems to me that the humble thing to do is to respect our lack of understanding about extraterrestrial matters and leave the transparent box behind, rather than trying to squeeze an extra $1000 out of an already quite generous alien. (Besides—if the aliens are observing you right now, and you commit to taking both boxes, they’ll know, and won’t give you the million.)

Such alien game-show scenarios are admittedly hypothetical. (Though if you’re observing me, aliens, I’m sorry I called you hypothetical.) Even so, we often encounter systems and situations in life that are nearly as foreign to our experience and intuition. Sometimes, we respond to these bizarre or unfamiliar scenarios with the greedy profit-maximizing approach of an economist or a Ferengi from Star Trek. But these are precisely the situations where a little caution and humility go a long way.

Given a morsel of information, we often try to scheme and optimize. We rarely consider the 4000 facts that we don’t have. To quote the satiric film In the Loop: “In the land of truth, the man with one fact is king.” We ought to stop—or at least take a brief pause—to consider the multitude of things we don’t know, and the distinct possibility that our entire framework is wrongheaded.

I say leave the transparent box. Will you really regret leaving $1000 on the table, when you’ve got $1 million to play around with?

20 thoughts on “The Newcomb Paradox”

  1. Donald Knuth (of TeX and “Art of Computer Programming” fame) said “If you optimize everything, you will always be unhappy.”

    1. Thanks to the author for sharing yet another great blog post, and thanks for this comment with the quote. The quote adds a necessary dimension when describing my dissatisfaction with trying to teach students in an era of standards and high-stakes standardized testing.

  2. “The aliens have observed humans for decades and have run this experiment thousands of times—so they’re very, very good at predicting which people will settle for the opaque box, and which people will greedily snatch both.”
    – It doesn’t necessarily follow that because an experiment has been repeated many times, the outcome becomes predictable to such a degree (flipping a coin, for example). Without data to substantiate this claim, the participant is no better informed (if not misinformed) about the aliens’ ability to predict the outcome.

    “your actions now cannot affect what the aliens have already done.” – True.

    “…the humble thing to do is to respect our lack of understanding about extraterrestrial matters and leave the transparent box behind…”
    – This would be anything but humble – more like credulous. The best way to “respect” our lack of understanding (ignorance) about something is to say “I don’t know,” adjust our beliefs (or lack thereof) accordingly, and make decisions with the best information we DO have… and take both boxes.

    In the end it’s just a silly game that CLAIMS to reward people for their credulity, making it somewhat contemptible.

    1. I would be happy to take the opaque box: either I prove to the aliens that they’re not as good at predicting my behaviour as they think, or I get a M$ 😉

      In any case, I don’t buy the orthodox view that it’s rationally optimal to take both boxes. A fairly simple piece of probabilistic reasoning is all it takes to see why that’s wrong.

      We have prizes k and M; the opaque box contains 0 if they expect you to take both boxes, and M if they expect you to take only it.
      Suppose their prediction of your choice is correct with probability p.
      If you take both boxes: if they predicted that, you get k; otherwise you get k + M. So your expected winnings are p·k + (1 − p)·(k + M) = k + (1 − p)·M.
      If you take only the opaque box: if they predicted that, you get M; otherwise you get 0. So your expected winnings are p·M.
      Taking both wins if k + (1 − p)·M > p·M, i.e. if p < (1 + k/M)/2.
      For the puzzle as given, that’s p < 0.5005; as long as M is much bigger than k, the threshold sits just above one half. (A numeric check appears at the end of this comment.)

      So, if you think their ability to anticipate your choice is appreciably better than a coin toss, even if only by k parts in 2M, then taking only the opaque box is the winning strategy.

      You might think: that means, assuming they know you’ve read this, that you’re sure to follow the winning strategy and take only the opaque box; so you can be sure there’s an M prize in it; but taking the k prize as well can’t hurt, surely?
      However, if they’re capable of distinguishing those who think that way from those of us happy to leave k as long as we’re getting M (and to prove them wrong if they misjudged us), even if that capability is only marginally more reliable than a coin toss, then it’s best to ignore that train of thought.

  3. Is this supposed to be an April Fools post? It seems like an important question to me.

    Have you read Hofstadter’s chapter on the prisoner’s dilemma in Metamagical Themas?

    The experimental testing of game theory and scenarios like the prisoner’s dilemma and the ultimatum game is interesting too. Here’s economist Pete Lunn in his book Basic Instincts, Human Nature and the New Economics: “It is no exaggeration to say that orthodox economics is based on the idea that people can be treated, for economic purposes, as if they are selfish, independent calculating machines. Yet behavioural economics is busy proving that, in reality, people do not behave like selfish, independent calculating machines; in many cases we do not even come close to behaving this way. This amounts to a classic scientific refutation…”

  4. The issue is this:
    Taking two boxes *strongly dominates* taking one: whatever the actual state of the world, you are better off taking both boxes than taking only one. In general, for any two options A and B, if A strongly dominates B, it is rational to take A.
    BUT
    Taking one box has, on average, a higher *expected return* than taking two: based on our best statistical modeling, we recognize that if the situation is as described (forget the motivation of the aliens and focus on the *fact* that if they predict you take one box, they put $1m in it and otherwise they leave it empty, AND that they are *very good* at predicting these things!), one will, on average, do better taking one box than taking two. In general, where A has a higher expected payoff than B, it is rational to take A. (The above-linked blog post focuses on this aspect.)

    The problem is that these are *both* legitimate rules for rational decision-making, and they recommend different courses of action in this case. 🙂
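
    To watch the two rules collide in one place, here is a minimal Python sketch (my illustration; the accuracy p = 0.9 is an assumed value, and k, M are the puzzle’s $1,000 and $1,000,000):

    ```python
    # Payoff matrix: payoff[(choice, prediction)], values from the puzzle.
    k, M = 1_000, 1_000_000
    payoff = {
        ("both", "both"): k,      # predicted two-boxing -> opaque box empty
        ("both", "one"):  k + M,  # predicted one-boxing -> opaque box full
        ("one",  "both"): 0,
        ("one",  "one"):  M,
    }

    # Rule 1 (dominance): "both" beats "one" in every state of the world.
    dominates = all(payoff[("both", pred)] > payoff[("one", pred)]
                    for pred in ("both", "one"))

    # Rule 2 (expected return): with predictor accuracy p, "one" wins.
    p = 0.9
    ev_both = p * payoff[("both", "both")] + (1 - p) * payoff[("both", "one")]
    ev_one  = p * payoff[("one", "one")]   + (1 - p) * payoff[("one", "both")]

    print(dominates)        # True
    print(ev_both, ev_one)  # 101000.0 900000.0
    ```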

  5. Maybe the aliens are good at predicting because they have time travel, or because you’re actually a simulation they’ve devised. In these cases, by just taking the opaque box you guarantee that your “real” self will be able to reap the benefits of your restraint.

  6. In the canonical set-up, you have very good evidence that the “predictor” (whatever it is) is very, very good at predicting what people will in fact choose, and has been very successful at this in the past. Without that evidence, then, sure, there isn’t much reason to one-box. 🙂

    A few more thoughts, for those people interested in this problem and its relationship to other classic problems in decision-theory and game-theory:

    If you take the predictor’s actions to be based on your *intentions* (note that the way Ben sets up the problem — and indeed in the canonical form of the problem — this can’t be quite right, because the predictor has already made the prediction *before* you are even told about the problem! So the predictor’s prediction can’t be based on what you intended, at least in the ordinary sense…), then the problem has some similarities with Greg Kavka’s classic “The Toxin Puzzle” (https://www.law.upenn.edu/live/files/1298-the-toxin-puzzle-kavka). Intending to do x results in a higher pay-off than not-so-intending, but there is no reason to actually do x, and some good reason not to do x; under those circumstances, it is unclear whether it is possible to be rational and to intend to do x, despite so-intending having the higher pay-off. (BTW: I think everyone should read Kavka’s “The Toxin Puzzle” — it is a great piece.)

    If you take the problem to be one of “commitment” — committing to only taking one box in advance — then the problem might be considered part of the problem of commitment more generally. In these problems, you are better off if you can commit to doing x, but you know that you’ll have good reasons not to do x when the time comes to do x, and therefore you have a problem with committing to doing x; it may be impossible to be fully rational, in some sense of the term, and to so-commit.

    Simon’s note, above, regarding how people really behave versus how they would be predicted to behave, is very much on-point with respect to questions involving commitment. In general, I think it is safe to say that people seem better able to commit, in the sense above, than would be predicted if they were simply rational-choice machines. And sometimes that can result in their doing better than they otherwise would. (But sometimes, of course, worse — what Brian Skyrms has called “the dark side of commitment” e.g. endless cycles of revenge, etc.)

    One more quick note: I can’t recall who first pointed this out, but I think it is right — one interesting feature is that many people’s decisions re: the Newcomb problem will change if we change the numbers, even in ways that leave the two competing decision-models intact. So even people that claim to be “one-boxers” because it maximizes expected pay-off start getting a little nervous and “two-boxy” if we make the amount in box A $900,000 — even if we have good evidence that the predictor is way way better than 90% accurate! And even people that claim to be “two-boxers” because you should never choose an option that is strongly dominated start rethinking this strategy if we only put $1 in box A — but from the point of view of dominated strategies, taking just B is strongly dominated whenever there is a positive non-zero sum in Box A, no matter how small! Ben’s comments re: a touch of humility seem apropos here. 😉
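
    (Plugging those variations into the threshold formula from the expected-value reply above, p* = (1 + k/M)/2, with k the amount in box A: at k = $900,000, one-boxing only wins in expectation if the predictor beats (1 + 0.9)/2 = 95% accuracy, so a little two-boxy nervousness at “way better than 90%” is arguably rational; at k = $1, the threshold is (1 + 0.000001)/2, a whisker above a coin toss, so almost any predictive power at all favors one-boxing.)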

  7. I think the theoretical answer is wrong. Assuming the aliens are perfectly intelligent, they will guess what I do correctly.

    So if I choose both, that will be what they predicted, and I’ll have $1000. If I choose just the opaque box, that too was predicted, and I get $1,000,000.

    Surely the prediction and intelligence make the choice one between taking both boxes (given that they knew I would) and taking only the opaque box (given that they knew I would).

    I don’t know how to write that mathematically…
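
    One way to write it, borrowing the notation of the expected-value reply above: perfect prediction is the p = 1 case, where the expected winnings collapse to E[both] = p·k + (1 − p)·(k + M) = k and E[one] = p·M = M, so the choice effectively becomes k = $1000 (take both) versus M = $1,000,000 (take only the opaque box).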

  8. Math teacher wants us to remember all of the formulas for geometry.

    What??? Teachers want us to remember all of the formulas for geometry. So is this impossible? Not really. So is there a way to accomplish this? I think so 🙂
    Ways to remember the formulas:
    - write them on the mirror in the bathroom
    - make flashcards
    - make a song to help you remember
    - write them on paper many times until you remember them
    - if you have Quizlet, you can use it to help with math formulas or definitions too

    NOW YOU CAN TRY THIS WHEN YOU HAVE TROUBLE ON THE GEOMETRY TEST BECAUSE OF THE FORMULAS!!!

  9. What good is $1000 when you are on an alien spacecraft orbiting some distant planet? What if I do not take a box at all? How good is their prediction then?

  10. How about this reasoning: “Oh God! I’ve been kidnapped by aliens who want to run Psychology experiments on me! They have indicated that they think taking both boxes is greedy. How about I just take the one box so they’ll think I’m a better person and hopefully let me live?”

  11. Unless they’re insisting that I pick a box to keep, I would prefer to take neither. I wouldn’t want someone else’s property in my possession, especially money that isn’t easy to return or, for that matter, to keep.

  12. I would probably sit in the corner for an hour, wondering what the aliens predicted, what I should do, and how they could have done it.
    And then I would start wondering what would happen if I only took the clear box.
    Then I would take only the clear box, because (in theory) I would rather know what would happen than have a ton of money.
    Besides, $1000 is a lot of money anyway, and I have no idea what I’d do with $1,000,000.
    How much money would they give me then?
    What would happen???
