# Mind projection fallacy

The “mind projection fallacy” occurs when somebody expects an overly direct resemblance between the intuitive language of the mind and the language of physical reality.

Consider the map and territory metaphor, in which the world is like a territory and your mental model of the world is like a map of that territory. In this metaphor, the mind projection fallacy is analogous to thinking that the territory can be folded up and put into your pocket.

As an archetypal example: Suppose you flip a coin, slap it against your wrist, and don’t yet look at it. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin itself is already either definitely heads or definitely tails?

One who says “the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads” commits the mind projection fallacy. Uncertainty is in the mind, not in reality. If you’re ignorant about a coin, that’s not a fact about the coin; it’s a fact about you. It makes sense that your brain, the map, has an internal measure of how sure it is of something. But that doesn’t mean the coin itself has to contain a corresponding quantity of sureness; it is just heads or tails.
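The coin example can be made concrete with a small sketch (the `Coin` and `Observer` classes and their attributes are hypothetical names chosen for illustration): the coin’s face is a settled fact in the territory, while the 50% lives entirely in the observer’s map, and only the map changes when the observer looks.

```python
import random

class Coin:
    """The territory: the coin is definitely heads or definitely tails."""
    def __init__(self, rng: random.Random):
        # The face is settled the moment the coin is flipped.
        self.face = rng.choice(["heads", "tails"])

class Observer:
    """The map: uncertainty lives here, as a degree of belief."""
    def __init__(self):
        # 0.5 describes the observer's state of knowledge, not the coin.
        self.credence_heads = 0.5

    def look_at(self, coin: Coin):
        # Seeing the coin updates the map; the territory never changed.
        self.credence_heads = 1.0 if coin.face == "heads" else 0.0

coin = Coin(random.Random(0))
alice = Observer()
# Before looking: the coin already has a definite face,
# but Alice's credence is 0.5.
alice.look_at(coin)
# After looking: Alice's credence collapses to 0.0 or 1.0,
# while coin.face is exactly what it was all along.
```

Note that nothing in `Coin` stores a probability; if two observers with different evidence looked at the same coin, they could assign different credences while the territory stayed the same.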

The ontology of a system is the set of elementary or basic components of that system. The ontology of your model of the world may include intuitive measures of uncertainty used to represent the state of the coin, treated as primitives much as floating-point numbers are primitive in computers. The mind projection fallacy occurs whenever someone reasons as if the territory, the physical universe and its laws, must have the same sort of ontology as the map, our models of reality.