There are two boxes: A and B. You can either take only box B, or take both boxes A and B.
Box A is transparent and contains $1k. Box B is opaque and contains either $0 or $1mln.
Here’s the twist: a reliable predictor (99.9% accurate) has already predicted what you’ll choose. If they predicted you’d take only B, they put $1mln in it. If they predicted you’d take both A and B, they left B empty.
What do you choose?


Newcomblike problems are the norm

Me

If you are shy and go to a job interview, knowing they are more likely to hire confident people but won’t hire people who fake it, what do you do?

Scott Alexander on it

I disagree with both Eliezer’s and Scott’s takes on it. One-boxing is not irrational. The entire Newcomb’s paradox is an impossible scenario, because there are no perfect predictions. If there were, the rational thing to do would be to take one box.
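To put rough numbers on that, here is a minimal expected-value sketch in Python, under the assumption (mine, not spelled out in the problem) that the predictor is right with the same 99.9% probability whichever option you pick:

```python
# Minimal expected-value sketch for Newcomb's problem.
# Assumption (mine, not stated in the problem): the predictor is right
# with the same probability `accuracy` whichever choice you make.

def expected_values(accuracy: float = 0.999,
                    box_a: float = 1_000,
                    box_b: float = 1_000_000) -> dict:
    # One-boxing: B is full only if the prediction ("one box") was correct.
    one_box = accuracy * box_b
    # Two-boxing: you always get A; B is full only if the predictor erred.
    two_box = box_a + (1 - accuracy) * box_b
    return {"one_box": one_box, "two_box": two_box}

print(expected_values())                 # one-box ~= $999,000 vs two-box ~= $2,000
print(expected_values(accuracy=0.5005))  # break-even: both ~= $500,500
```

Under these assumed numbers, the predictor only has to beat a coin flip by a sliver (accuracy above roughly 50.05%) for one-boxing to come out ahead on expected value.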

The idea has a history behind it. Newcomb’s Paradox is a weird philosophical problem where (long story short) if you follow an irrational-seeming strategy you’ll consistently make $1 million, but if you follow what seem like rational rules you’ll consistently only get a token amount. Philosophers are divided about what to do in this situation, but (at least in Yudkowsky’s understanding) some of them say things like “well, it’s important to be rational, so you should do it even if you lose the money”. This is what Eliezer’s arguing against. If the “rules of rationality” say you need to do something that makes you lose money for no reason, they weren’t the real rules. The real rules are the ones that leave you rich and happy and successful and make the world a better place. If someone whines “yeah, following these rules makes me poor and sad and unable to help others, but at least they earn me the title of ‘rational person’”, stop letting them use the title!